How to Retrieve Data in Batches From Baserow

Describe the problem/error/question

I have a large dataset in Baserow, and I need to move the content of some columns to other columns.

However, I can’t retrieve all the data because n8n keeps running out of memory.

I want to retrieve it in batches of 10 instead, but it keeps looping over the same first 10 rows. Is there a way to make it loop through everything, 10 at a time?

Please share your workflow

Information on your n8n setup

  • n8n version: Cloud
  • Database (default: SQLite): Default
  • n8n EXECUTIONS_PROCESS setting (default: own, main): Default
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Cloud
  • Operating system: Windows 10

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

hello @Primus

I see the node does not have an option to specify the start/end row. So in your case you will need a filter to exclude the already-processed items, e.g. one that only matches rows whose updated date is before today.

So the logic would be as follows:

  1. The first Baserow node returns the next batch of items whose updated date is < today
  2. The second Baserow node updates those rows, so their updated date changes to today
  3. The first node then retrieves the next batch of un-updated items
  4. And so on, until no un-updated items remain

In that way all items will be processed iteratively (and you can always refresh the page to flush the memory).

Keep in mind that I’ve changed the settings in the first Baserow node.
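The loop above can be sketched in plain Python. This is just an in-memory stand-in for the Baserow table, not the real node or API: the `updated` column name, the batch size of 10, and the 30-row table are all illustrative assumptions.

```python
from datetime import date, timedelta

# Hypothetical stand-in for a Baserow table: 30 rows, each with an
# "updated" date column that starts out before today.
today = date(2024, 1, 10)
rows = [{"id": i, "updated": today - timedelta(days=1)} for i in range(30)]

BATCH = 10
processed_batches = []

while True:
    # Step 1: the "first node" fetches the next batch of rows whose
    # updated date is still before today (i.e. not yet processed).
    batch = [r for r in rows if r["updated"] < today][:BATCH]
    if not batch:
        break  # no un-updated rows left, the loop is done

    # Step 2: the "second node" processes and updates those rows,
    # stamping their updated date to today so the filter skips them
    # on the next pass.
    for r in batch:
        r["updated"] = today

    processed_batches.append([r["id"] for r in batch])

# Each row is picked up exactly once, 10 at a time, across 3 passes.
```

The key design point is that the filter, not a start/end offset, drives the pagination: marking a row as updated removes it from the next query, so the loop always sees fresh rows and terminates naturally.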

