How to handle large data sets?

I read data from a spreadsheet; it contains more than 25,000 records.
But my code function can't handle that much data, and the workflow hits the n8n timeout…

I want to filter the last 7 days of data from column C of my Google Sheet. How can I achieve this? I tried with Split In Batches as well, but I don't have the knowledge to handle this.

Test workflow to practice Split In Batches:

Can you guys help me handle large data sets?

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

n8n version 1.33.2, running on n8n cloud.

Hi @sasanka_deshapriya

Of course there are ways to get things done with large data sets, but we should always try to limit the amount of data where possible.
There is probably a way to filter the data coming from Google Sheets down to just the last 7 days. I'm not sure though, as I always use a database when I'm working with data like this.
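That said, if the rows do make it into n8n, one way to do the 7-day filter is a Code node. This is just a minimal sketch, assuming the date lives in a field called `C` (replace it with your actual column header) and is in a format `new Date()` can parse:

```js
// Code node, mode "Run Once for All Items".
// Keeps only the rows whose column C date falls within the last 7 days.
// "C" is an assumed field name - swap in your real column header.
const cutoff = new Date();
cutoff.setDate(cutoff.getDate() - 7);

return $input.all().filter((item) => {
  const date = new Date(item.json.C);
  return !isNaN(date.getTime()) && date >= cutoff;
});
```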

To answer the question: you should probably get the data and then call a sub-workflow that processes a smaller batch of items at a time, something like this:
[image: example workflow]
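The usual way to build that loop is a Split In Batches (Loop Over Items) node feeding an Execute Workflow node. If you'd rather pack the rows into batches yourself, a Code node sketch like this could do it (the batch size is just an assumption, tune it to your data and timeout):

```js
// Code node, mode "Run Once for All Items".
// Groups the incoming rows into one item per batch, so the sub-workflow
// is called once per 500 rows instead of once per row.
const batchSize = 500; // assumed value - adjust for your workload
const rows = $input.all().map((item) => item.json);

const batches = [];
for (let i = 0; i < rows.length; i += batchSize) {
  batches.push({ json: { rows: rows.slice(i, i + batchSize) } });
}
return batches;
```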
Make sure to clear your data at the end of the sub-workflow so it doesn't send all the data back to the main workflow. You can do this by just adding a Set node that keeps only the fields you set (and set it to execute only once).
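If you prefer, a tiny Code node as the last node of the sub-workflow does the same job as that Set node, for example:

```js
// Final node of the sub-workflow: return one small item so the
// full batch data is not passed back to the main workflow.
return [{ json: { done: true } }];
```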

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.