I’m experiencing an issue where my workflow successfully processes items in a Loop Over Items node, but with a large dataset (667 items), after the loop reaches the “done” state the workflow stalls and the next node spins forever without completing. The same workflow works perfectly when I reduce my dataset to just 6 items, but fails consistently with the full 667-item dataset.
Specifically: I take a list of the day’s sales data and loop over it, writing and updating entries in a PostgreSQL database. For the 667-item day, it upserts 667 times but only creates 131 unique rows, since most items hit an existing key and update that row rather than inserting a new one.
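Since I can’t post the actual workflow (more on that below), here’s roughly the shape of each upsert. Table and column names are placeholders, not my real schema:

```sql
-- Rough sketch of the per-item upsert; sales_master and its columns are
-- placeholder names, not my real schema. Each of the 667 statements either
-- inserts a new row or, on a key conflict, updates the existing one, which
-- is why only 131 unique rows come out.
INSERT INTO sales_master (product_id, units_sold, revenue, last_sale_date)
VALUES ($1, $2, $3, $4)
ON CONFLICT (product_id)
DO UPDATE SET
    units_sold     = sales_master.units_sold + EXCLUDED.units_sold,
    revenue        = sales_master.revenue + EXCLUDED.revenue,
    last_sale_date = EXCLUDED.last_sale_date;
```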
I’m getting 403 errors when trying to post the workflow code here. I thought it might be because there’s so much data, so I cut the data down and tried posting again, but it still won’t post. So I’m not sure how to share the full workflow with you guys.
What is the error message (if any)?
There’s no specific error message; the workflow simply hangs after the loop completes, and the next node just spins indefinitely without proceeding.
I need to process a full day’s worth of sales data (667 items). The Loop Over Items node takes somewhere around 25-45 seconds to get through the whole day. However, there is probably a better way to do this; I’m just a noob and inexperienced with working with larger datasets.
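One alternative I’ve been wondering about, instead of looping item by item, is collapsing the whole day into a single batched statement. A sketch of the idea, again with placeholder names, and assuming duplicate keys should simply be summed (which may not match the real business logic):

```sql
-- Sketch of one batched upsert per day instead of 667 single-row statements;
-- the day's items would be passed in as a single JSON parameter. All names
-- and the SUM() semantics are assumptions.
INSERT INTO sales_master (product_id, units_sold, revenue, last_sale_date)
SELECT product_id, SUM(units_sold), SUM(revenue), MAX(sale_date)
FROM jsonb_to_recordset($1::jsonb)
       AS t(product_id int, units_sold int, revenue numeric, sale_date date)
GROUP BY product_id  -- collapse duplicate keys first; Postgres rejects an
                     -- ON CONFLICT statement that touches the same row twice
ON CONFLICT (product_id)
DO UPDATE SET
    units_sold     = sales_master.units_sold + EXCLUDED.units_sold,
    revenue        = sales_master.revenue + EXCLUDED.revenue,
    last_sale_date = EXCLUDED.last_sale_date;
```

That would turn 667 round trips into one, which I’d expect to be much faster than 25-45 seconds.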
Information on my n8n setup
- n8n version: 1.88.0
- Database: PostgreSQL on Supabase
- n8n EXECUTIONS_PROCESS setting: own
- Running n8n via: Self-hosted on Hostinger
- Operating system: Linux
I’ve checked RAM and CPU usage while the workflow runs, and neither spikes at all.
Additional Notes: I actually have over a thousand days to process, with hundreds to thousands of items each day; it comes to around 500,000 items in total. While the individual transactions within a day don’t need to happen in order, since they’re upserting into the master table, the days themselves do need to be processed in order. Any advice on processing this much data would be helpful.
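For that backfill, one pattern I’m considering (again just a sketch with placeholder names) is to bulk-load all the raw rows into a staging table first, then apply them to the master table one day at a time in date order, so each day becomes a single batched upsert:

```sql
-- Sketch: apply a staging table to the master table one day at a time, in
-- date order, so later days overwrite fields like last_sale_date correctly.
-- sales_staging / sales_master and the SUM() semantics are assumptions.
DO $$
DECLARE
    d date;
BEGIN
    FOR d IN SELECT DISTINCT sale_date FROM sales_staging ORDER BY sale_date
    LOOP
        INSERT INTO sales_master (product_id, units_sold, revenue, last_sale_date)
        SELECT product_id, SUM(units_sold), SUM(revenue), MAX(sale_date)
        FROM sales_staging
        WHERE sale_date = d
        GROUP BY product_id  -- one row per key per statement
        ON CONFLICT (product_id)
        DO UPDATE SET
            units_sold     = sales_master.units_sold + EXCLUDED.units_sold,
            revenue        = sales_master.revenue + EXCLUDED.revenue,
            last_sale_date = EXCLUDED.last_sale_date;
    END LOOP;
END $$;
```

Does something like this make sense, or is there a more idiomatic n8n way to handle it?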