I’m experiencing an issue where the Loop Over Items node stops processing and doesn’t reach the Done Branch when the input item count exceeds a certain threshold.
Working scenarios:

- Input: 50 items or fewer with Batch Size: 50 → Reaches Done Branch
- Input: 74 items with Batch Size: 50 → Reaches Done Branch

Failing scenarios:

- Input: 51-100+ items with Batch Size: 50 → Does NOT reach Done Branch
- Input: 108 items with Batch Size: 50 → Stops after processing 100 items, never reaches Done Branch

Key observations:

- When processing 108 items, Loop Over Items shows “100 items total” instead of “108 items total”
- The workflow stops at the loop branch without any error messages
- This workflow previously worked with 1000+ items, but recently started failing
- No workflow changes were made before the issue appeared
Workflow structure:
Extract from File (CSV) → Loop Over Items (Batch Size: 50)
↓ loop
Aggregate → AI Model → Code
↓ done
[Next processing steps]
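To rule out items being dropped before the loop, one option is a Code node between Extract from File and Loop Over Items that passes items through while checking the count. This is only a diagnostic sketch: `assertItemCount` and the `108` literal are illustrative, while `$input.all()` is the standard n8n Code-node input helper.

```javascript
// Hypothetical pass-through check (not part of the original workflow):
// confirms the item array handed to Loop Over Items still matches the
// CSV row count, and fails loudly if anything was dropped upstream.
function assertItemCount(items, expected) {
  if (items.length !== expected) {
    throw new Error(`expected ${expected} items, got ${items.length}`);
  }
  return items;
}

// In an n8n Code node (mode: Run Once for All Items) this would be:
//   return assertItemCount($input.all(), 108);
// Standalone demonstration with dummy items:
const sample = Array.from({ length: 108 }, (_, i) => ({ json: { row: i } }));
console.log(assertItemCount(sample, 108).length); // 108
```

If this check passes but the loop still reports 100 items, the loss is happening inside the loop node rather than upstream.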
Questions:

- Is there a known limitation or bug where Loop Over Items fails to reach the Done Branch when the input significantly exceeds the batch size?
- Why does it work with 74 items but fail with 108 items?
- Could an internal limit (e.g., 100 items) be affecting the behavior?
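For context on why 108 items should still finish, the batch split the node is expected to perform can be sketched like this (`expectedBatches` is a hypothetical helper for illustration, not n8n internals):

```javascript
// Sketch of the expected batch split: ceil(total / batchSize) iterations,
// with the final batch holding the remainder.
function expectedBatches(totalItems, batchSize) {
  const batches = [];
  for (let i = 0; i < totalItems; i += batchSize) {
    batches.push(Math.min(batchSize, totalItems - i));
  }
  return batches;
}

console.log(expectedBatches(74, 50));  // [ 50, 24 ] → 2 iterations, then done
console.log(expectedBatches(108, 50)); // [ 50, 50, 8 ] → 3 iterations expected
```

The loop reporting exactly “100 items total” would be consistent with two full batches of 50 completing and the third (remainder) batch never running.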
What is the error message (if any)?
No error message is displayed. The workflow simply stops executing after the loop branch, and the done branch never triggers.
Please share your workflow

```json
[Paste your workflow JSON here - select all nodes, press CTRL+C/CMD+C, then paste]
```
Share the output returned by the last node
When input is 74 items (SUCCESS):

- Extract from File: 74 items
- Loop Over Items done output: 74 items total
- Successfully proceeds to done branch

When input is 108 items (FAILURE):

- Extract from File: 108 items
- Loop Over Items shows: 100 items total (not 108!)
- Does not proceed to done branch
- Last executed node: Loop Over Items (loop branch only)
Expected output:
- Loop Over Items should show “108 items total”
- Should proceed to done branch after all loops complete
Information on your n8n setup
- n8n version: [1.119.1]
- Database (default: SQLite): [PostgreSQL]
- n8n EXECUTIONS_PROCESS setting (default: own, main): [own]
- Running n8n via (Docker, npm, n8n cloud, desktop app): [Docker]
- Operating system: [Alpine Linux]
Additional context:
- Loop Over Items Batch Size setting: 50 (changed from the expression {{ 50 }} to a direct numeric value, no improvement)
- Temporary workaround: Reducing Batch Size to 20-25 works, but this doesn’t explain why larger batch sizes fail
- The issue appeared suddenly without any configuration changes
