What is the workflow doing? You may be running out of memory. If that is the case, you can either upgrade your plan or split the workflow so it uses less memory.
Sorry to hear that you are having problems! To get this fixed, two steps are important:
Understand how n8n works
Data in an execution never gets “freed up”. Once it is loaded, it stays in memory until the workflow execution is finished. That said, we have planned to change this in the future: there will be a mode where, if enabled, data gets freed up after every node. This will obviously come with a big performance hit, but for such use cases it should not matter if an execution takes a little longer.
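To make the point above concrete, here is a conceptual sketch (not actual n8n internals) of why memory only grows during a run: each node's full output is kept on the execution object until the whole execution ends.

```javascript
// Conceptual sketch (not actual n8n code): execution data is
// accumulated per node and kept until the run finishes.
const executionData = {};

function runNode(name, produceItems) {
  // Each node's full output is stored on the execution object...
  executionData[name] = produceItems();
  // ...and nothing is deleted until the whole execution ends,
  // so peak memory is roughly the sum of all node outputs.
}

runNode('HTTP Request', () =>
  Array.from({ length: 1000 }, (_, i) => ({ json: { id: i } }))
);
runNode('Set', () =>
  executionData['HTTP Request'].map((item) => ({ json: { ...item.json, ok: true } }))
);

// Both node outputs are retained simultaneously.
console.log(Object.keys(executionData).length); // 2
```

The planned mode would essentially delete earlier entries from such an object once downstream nodes no longer need them, trading speed for a bounded footprint.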
I believe this is probably a memory error.
Although you are handling only 50 items at a time, the execution still keeps all of the items, and they are stored in a single big object in memory, because n8n needs to save the full execution data to the database.
The problem is that the execution history (with all the items processed) grows and needs to be serialized and deserialized. This is probably causing your system to crash.
I would recommend splitting this workflow into smaller parts so it can run with a smaller memory footprint.
Because n8n converts everything into JSON (and thus into key → value pairs), the memory required is much higher than in SQLite.
You can do one of two things:
Increase RAM
Depending on your workflows and what you are doing, you could split the workflow into two separate ones to decrease the memory footprint: an outer one that does not hold much data and, for example, only loops and passes a start index to a sub-workflow, which then does the rest of the work.
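The outer-loop pattern can be sketched like this. The `processBatch` function is a hypothetical stand-in for an Execute Workflow call into the sub-workflow; the key idea is that the outer workflow only tracks a start index, while each batch is loaded, processed, and discarded inside the sub-workflow, so memory stays bounded per batch.

```javascript
// Assumed example sizes: 200 items total, processed 50 at a time.
const TOTAL_ITEMS = 200;
const BATCH_SIZE = 50;

// Hypothetical stand-in for calling the sub-workflow with a start index.
function processBatch(startIndex, batchSize) {
  const batch = [];
  for (let i = startIndex; i < Math.min(startIndex + batchSize, TOTAL_ITEMS); i++) {
    batch.push({ json: { id: i } }); // load + process only this slice
  }
  return batch.length; // return a small summary, not the heavy item data
}

// Outer workflow: loops over start indices and holds almost no data itself.
let processed = 0;
for (let start = 0; start < TOTAL_ITEMS; start += BATCH_SIZE) {
  processed += processBatch(start, BATCH_SIZE);
}
console.log(processed); // 200
```

Returning only a small summary (a count or status) from the sub-workflow matters: if the sub-workflow returned all of its items, the outer execution would accumulate them again and you would be back where you started.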