(SplitInBatches) Workflow execution process did crash for an unknown reason!

I believe this is most likely a memory issue.

Although you are handling only 50 items at a time, the execution still processes all of the items, and all of that data ends up in one big object in memory, because n8n needs to save the whole execution to the database.

The problem is that the execution history (containing every item processed so far) keeps growing and has to be serialized and deserialized, and this is most likely what is crashing your system.

I would recommend splitting this workflow into smaller parts so it can run without memory problems. This means creating a few separate workflows.

Workflow A:

  • Query Oracle for the item IDs only, split them into batches, and use the Execute Workflow node to send each batch to a separate workflow (see the sketch below)
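
To keep Workflow A light, that first query should return nothing but the key column, so every item going through Split In Batches stays tiny. A rough sketch of the item shape (assuming your key column is called ID; adjust to your schema):

[
    { "ID": 1001 },
    { "ID": 1002 },
    { "ID": 1003 }
]

Each batch of these small items is what gets handed to Workflow B through the Execute Workflow node.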

Workflow B:

  • Based on the IDs provided by Workflow A, fetch the full data from Oracle, process it, and finish with one last Set node that clears all the data (the Set node's "Keep Only Set" option does exactly this) and returns only a simple value, such as
{
    "success": true
}

The result from Workflow B is passed back to Workflow A as part of its execution. This will not cause any problems, as you'll end up with the following scenario:

  • Workflow A contains only the list of IDs plus a simple success output from Workflow B for each iteration (see the sketch below)
  • Workflow B's executions contain more items, but that is fine because each one only handles a single batch
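
To make the size difference concrete: the heaviest thing Workflow A ever holds is that flat list of IDs, and each iteration only adds the tiny reply from Workflow B, roughly:

[
    { "success": true }
]

None of the full Oracle rows that Workflow B loads and processes ever make it back into Workflow A's execution data.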

With this setup, I believe you should not run into any memory issues.
