Right way to send batches to Baserow?

Describe the issue/error/question

Hello fellas.
I was wondering what the right way is to create batch records in Baserow. It seems obvious that the Set node + Aggregate Item + Split In Batches combination would be the key, but I can't get it to work properly.

How should I approach it?

I have used the Baserow node; however, the workflow execution crashes at around record #912, more or less. And I need to update a couple of thousand records at a time. Thanks!

Please share the workflow

Share the output returned by the last node

{"error":"ERROR_REQUEST_BODY_VALIDATION","detail":{"items":[{"error":"Ensure this field has no more than 200 elements.","code":"max_length"}]}}
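The error above indicates that the request body sent to Baserow contained more than 200 items, which the API rejects. As a minimal sketch (the `chunkRows` helper name and the item shape are illustrative, not part of n8n or Baserow), splitting the rows into groups of at most 200 before each request would avoid this validation error:

```javascript
// Hypothetical helper: split an array of rows into chunks of at most
// 200 elements, matching the limit reported in the Baserow error.
function chunkRows(rows, maxSize = 200) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += maxSize) {
    chunks.push(rows.slice(i, i + maxSize));
  }
  return chunks;
}

// Example: 912 rows become 5 chunks (4 × 200 + 1 × 112).
const chunks = chunkRows(Array.from({ length: 912 }, (_, i) => ({ id: i })));
console.log(chunks.length);    // 5
console.log(chunks[4].length); // 112
```

In n8n terms, this is essentially what the SplitInBatches node does when its batch size is set to 200 or lower.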

Information on your n8n setup

  • n8n version: 0.191.1
  • Database you’re using (default: SQLite): yup
  • Running n8n with the execution process [own(default), main]: own
  • Running n8n via [Docker, npm, n8n.cloud, desktop app]: docker on vultr

Hi @th3liam, if your workflow execution crashes when processing a large number of items there’s a good chance your n8n instance ran out of memory. You might want to check the server logs to be sure about this.

To avoid this you'd typically want to use a sub-workflow. In your parent workflow, use the SplitInBatches node to split your execution into smaller chunks, then send these smaller chunks to a sub-workflow using the Execute Workflow node. Run the Baserow operation there (this should work using the regular Baserow node), and finally use a Set node (set to execute only once) to return a single item such as { success: true } to your parent workflow.

With this setup, the memory used to execute your sub-workflow becomes available again before the next batch runs. This should help when dealing with larger datasets on instances with limited memory.
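As a rough sketch of the last step (using a Function/Code-style node instead of a Set node; the `finishSubWorkflow` name and the `processed` field are illustrative assumptions), the sub-workflow's final node discards the large Baserow output and hands a single tiny item back to the parent:

```javascript
// Hypothetical final node of the sub-workflow: instead of returning
// every processed row to the parent workflow, return one small item.
// This keeps the parent's memory footprint constant across batches.
function finishSubWorkflow(items) {
  // `items` is whatever the Baserow node produced for this chunk;
  // only report success and a count rather than the full row data.
  return [{ json: { success: true, processed: items.length } }];
}

const result = finishSubWorkflow(new Array(200).fill({ json: {} }));
console.log(result); // [ { json: { success: true, processed: 200 } } ]
```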
