Large file processing memory issue

My workflow processes CSV files with 10,000 rows from Google Drive. Sometimes n8n runs out of memory or times out during processing.

Hi @Rhon, is it giving any error? If so, please take a screenshot and share it.

Or, alternatively, you can implement chunked processing using the Split In Batches node; set the batch size to somewhere between 500 and 1000 items.


After the Google Drive node fetches the CSV, use the CSV node to parse it, then connect it directly to Split In Batches.

I think if you try this :thinking: you should be able to work your way around the issue, but if you're still confused, please share your screen.


Oh hi, yeah, I thought as much but was skeptical. I'll try splitting.

And please don't forget to like this and mark it as the solution if it helped. Please come back to do so, it won't take long :folded_hands: and it will also help others having the same issue.


Alright, thanks!

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.