Stopping Workflows With Batch

Describe the problem/error/question

My workflow goes through Split In Batches but never stops. I have seen some documentation where people can set a second branch when the batch ends, but I don't have this in ours.
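For reference, from what I can tell the Split In Batches node on my version only has one output, so the loop would have to be closed manually with an IF node. A minimal sketch of what I have seen in the docs, assuming the node is simply named "Split In Batches": the IF node evaluates a boolean condition like

{{ $node["Split In Batches"].context["noItemsLeft"] }}

(the node name "Split In Batches" above is only an example; it has to match the actual name on the canvas). The true branch would go to whatever should run once all batches are processed, while the false branch connects back to the Split In Batches node to request the next batch.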

What is the error message (if any)?

Please share your workflow

{
  "meta": {
    "instanceId": "1b16b1df538ba12dc3f97edbb85caa7050d46c148134290feba80f8236c83db9"
  },
  "nodes": [],
  "connections": {}
}

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

Share the output returned by the last node


It never stops

Information on your n8n setup

  • n8n version: 0.222.3
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): AWS
  • Operating system:

Hi @Felipe_Gomez_O, I am sorry you’re having trouble.

What exactly do you mean by your workflow never stops? Could you perhaps share an actual workflow with which the problem can be reproduced, rather than just a screenshot?

Do you see any errors in your server logs?

Thanks, the workflow does not stop, or that is what I understand. As you can see in my workflow, the loop goes fine through the batches, but at the end it won't finish and instead errors out with memory problems.

From looking at your screenshot it seems the error message might be spot on - you’re simply processing too much data for your n8n instance to handle. The situation is documented here, along with a few common workarounds:

If you control the amount of data directly you could simply consider processing multiple smaller files in separate executions instead of one big file in a single execution.

In case you can’t change the input file size, you could try something like this instead: Convert Binary file To JSON or CSV - #5 by MutedJam. This is rather cumbersome though, and you might have a better time simply increasing the available memory on your machine.
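On the memory side: if you are running n8n via Docker on that AWS machine, one common approach is to give the Node.js process a larger heap through the NODE_OPTIONS environment variable. Treat this as a sketch rather than something specific to your setup, e.g.

NODE_OPTIONS=--max-old-space-size=4096

The 4096 (MB) value is just an example; pick a figure that fits the memory actually available on your instance.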

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.