I have a large dataset in Baserow, and I need to move the content of some column to other columns.
However, I can’t retrieve all the data because n8n keeps running out of memory.
I want to retrieve it in batches of 10 instead, but it keeps looping over the same first 10 rows. Is there a way to make it loop through everything, 10 at a time?
I see the node does not have an option to specify the start/end row, so in your case you will need a filter to exclude the already-processed items — e.g. one where the updated date is before today.
So the logic would be as follows:
The first Baserow node returns a batch of items whose updated date is < today
The second Baserow node updates those rows, so their updated date changes to today
On the next run, the first node retrieves the next batch of still-un-updated items
etc
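
The loop above can be sketched in Python. This is a minimal, self-contained illustration of the exclude-processed-items pattern — the in-memory `rows` list and the `fetch_unprocessed` / `mark_processed` helpers are hypothetical stand-ins for the two Baserow nodes, not real Baserow API calls:

```python
from datetime import date, timedelta

# Fake table: 30 rows, all last updated yesterday (i.e. not yet processed today)
yesterday = date.today() - timedelta(days=1)
rows = [{"id": i, "name": f"row {i}", "updated": yesterday} for i in range(30)]

BATCH_SIZE = 10

def fetch_unprocessed(batch_size):
    """Stand-in for the first Baserow node: rows with updated date < today."""
    return [r for r in rows if r["updated"] < date.today()][:batch_size]

def mark_processed(batch):
    """Stand-in for the second Baserow node: bump the updated date to today,
    so the next fetch skips these rows."""
    for r in batch:
        r["updated"] = date.today()

processed = 0
while True:
    batch = fetch_unprocessed(BATCH_SIZE)
    if not batch:
        break  # nothing left with updated < today: every row has been handled
    # ... move the column contents here ...
    mark_processed(batch)
    processed += len(batch)

print(processed)  # all 30 rows handled, 10 at a time
```

The key point is that the "updated date" field doubles as a progress marker, so each fetch naturally excludes what the previous pass already touched.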
In that way all items will be processed iteratively (and you can always refresh the page to flush the memory):
Keep in mind that I’ve changed the settings in the first Baserow node.