Hello everyone!
I’m running a workflow on my laptop using n8n in Docker Desktop to vectorize a large dataset of over 7,000 items. The workflow starts fine but slows down dramatically and eventually hangs after processing around 1,000 items. I’ve tried batching but couldn’t get it to work properly; maybe that is still the right approach in the end.
My main goal is to process all 7k+ items efficiently without the workflow stalling. I’m looking for advice on how to optimize my workflow.
There is no specific error message; the node simply becomes unresponsive and the item counts stop updating.
I’ve attached a snapshot of the problematic area of the workflow below.
Information on n8n setup
- n8n version: 1.108.1
- Database (default: SQLite): default
- n8n EXECUTIONS_PROCESS setting (default: own, main): I’m not sure; I haven’t set it explicitly
- Running n8n via (Docker, npm, n8n cloud, desktop app): Docker Desktop
- Operating system: Windows 11 Pro
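
In case it helps with diagnosis: this is how I checked which execution-related environment variables are actually set inside the container (the container name `n8n` is an assumption; use `docker ps` to find yours). Nothing came back, so I assume the defaults apply.

```shell
# List execution-related environment variables inside the running
# n8n container. If nothing is printed, the variables are unset and
# n8n falls back to its defaults. Replace "n8n" with your actual
# container name from `docker ps`.
docker exec n8n printenv | grep -i EXECUTIONS
```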