I read data from a spreadsheet that contains more than 25,000 records.
But my code function can't handle that much data. It shows an n8n workflow timeout…
How do I handle large data sets?
I also want to filter the last 7 days of data from column C of my Google Sheet. How can I achieve this? I tried Split In Batches as well, but I don't have the knowledge to handle this.
Of course there are ways to get things done with large data sets, but we should always try to limit the amount of data where possible.
There is probably a way to filter the data coming from Google Sheets down to the last 7 days. Not sure though, as I always use a database when working with data like this.
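One option is to do the 7-day filter in a Code node right after the Google Sheets node, so the rest of the workflow only sees recent rows. A minimal sketch of that filtering logic, assuming the column C header is called `dateC` (rename to match your sheet) and holds date strings that `Date.parse()` understands, e.g. `2024-05-06`:

```javascript
// Sketch of the filter logic for an n8n Code node.
// "dateC" is a placeholder for whatever your column C header is.
function filterLastSevenDays(rows, now = new Date()) {
  // Cutoff timestamp: 7 days before "now", in milliseconds.
  const cutoff = now.getTime() - 7 * 24 * 60 * 60 * 1000;
  return rows.filter((row) => {
    const t = Date.parse(row.dateC);
    // Drop rows whose date is unparseable or older than the cutoff.
    return !Number.isNaN(t) && t >= cutoff;
  });
}

// Inside a Code node you would unwrap and rewrap n8n's item shape:
// return filterLastSevenDays($input.all().map(i => i.json))
//   .map(json => ({ json }));
```

This way only ~7 days of rows travel through the rest of the workflow instead of all 25,000.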
To answer the question: you should probably get the data and then call a sub-workflow to process it a smaller batch of items at a time. Something like this:
Make sure to clear your data at the end of the sub-workflow so it doesn't send all the data back to the main workflow. You can do this by adding a Set node that keeps only the data you set (and set it to run only once to be complete).
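If you prefer a Code node over a Set node for that last step, the idea is the same: have the final node of the sub-workflow return one tiny item instead of the whole batch. A sketch, assuming the node is set to "Run Once for All Items" (the field names here are made up, use whatever summary you need):

```javascript
// Last node of the sub-workflow: collapse the whole batch into one
// small summary item, so the parent workflow doesn't receive every row.
function summarize(items) {
  return [{ json: { processed: items.length, done: true } }];
}

// Inside a Code node ("Run Once for All Items"):
// return summarize($input.all());
```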