Slow Baserow inserts in n8n

Hello everyone! I hope you are all doing well.

I don’t understand why inserting several thousand rows of data (6,400 rows) into Baserow takes so long. Is this normal, or is something wrong?

Configuration of the server running n8n and Baserow together:

  • 32 GB RAM / 4 cores
  • Virtual machine
  • 2+ GHz CPU frequency
  • Separate Docker environments

Can you help me speed up my inserts and updates? At this pace, it takes too long to work.

Thanks a lot

Information on your n8n setup

  • n8n version: 0.225.2
  • Database you’re using: Baserow
  • Running n8n with the execution process [own(default), main]: own
  • Running n8n via [Docker, npm, desktop app]: Docker

Hi @Micka_Rakotomalala

That is because you are sending 6,400 separate requests.

Baserow also allows batch requests, so you can try using those. :slight_smile:


Thank you very much for your reply.
In my case, I compared the insertion speed of the Baserow node against the API directly.
The API is faster; to avoid the node timing out, I had to split the insertions, but even so it takes about 30 minutes to insert this data.

So for the insertion batches, I’ve set things up as shown in the screenshot.
[Screenshot: Capture d’écran du 2023-07-07 15-49-09]

Could you please describe the solution in a little more detail?


Ah sorry, you are missing one key piece of information about n8n nodes. :slight_smile:
Most nodes run once per item that passes through them. So the Baserow node, and the HTTP Request node too, will run once per record, which means 6,400 API requests in this case. The batch options only let the node wait a given amount of time after a given number of requests; it still processes every record one by one.

The Baserow API also has a bulk option that lets you send arrays of records to be processed in a single request.
This is not yet implemented in the Baserow node, so you need to do it manually: group the records into batches and send each batch with the HTTP Request node.
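To illustrate the grouping idea, here is a minimal sketch in Python. The batch endpoint path and the per-request row limit (200 rows, per the Baserow API docs) should be verified against your Baserow version; the table ID, token, and field names below are placeholders. Inside n8n you would do the same chunking in a Code node and point the HTTP Request node at the batch endpoint.

```python
# Sketch: send rows to Baserow's batch endpoint in chunks instead of
# making one request per row. Endpoint path, token, and table ID are
# placeholders -- check them against your own Baserow instance.
import json
from urllib.request import Request, urlopen

BASEROW_URL = "https://api.baserow.io"  # or your self-hosted URL
TABLE_ID = 123                          # placeholder table ID
API_TOKEN = "YOUR_TOKEN"                # placeholder database token

def chunk(rows, size=200):
    """Split rows into lists of at most `size` items (Baserow's batch limit)."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def insert_batches(rows):
    """POST each chunk to the batch endpoint: 6,400 rows -> 32 requests."""
    for batch in chunk(rows):
        req = Request(
            f"{BASEROW_URL}/api/database/rows/table/{TABLE_ID}/batch/"
            "?user_field_names=true",
            data=json.dumps({"items": batch}).encode(),
            headers={
                "Authorization": f"Token {API_TOKEN}",
                "Content-Type": "application/json",
            },
            method="POST",
        )
        urlopen(req)

if __name__ == "__main__":
    rows = [{"Name": f"row {i}"} for i in range(6400)]
    print(len(chunk(rows)))  # 32
```

With 200 rows per request, the 6,400 rows become 32 HTTP calls instead of 6,400, which is where the speedup comes from.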


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.