What is the maximum number of items a node can handle?

I have a workflow that processes more than 50K items, and it takes hours to run. But at some point during the night, for no apparent reason, it just stops: a node finishes successfully, yet execution does not continue to the next node.

If I start the workflow with 10K items, it takes 3 to 4 hours and finishes successfully. But when I leave it running through the night, it just stops at a certain node and doesn't continue.

The main limitation is memory. There is no fixed item limit; what matters is how much memory the items use, and there it does not matter whether it is 1 item or 1 million, it is the combined memory they use (across all nodes). So it could break with 1 item if that item contains a huge amount of data, and work with 1 million items if the amount of data in each one is very small.
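For a rough sense of scale, one way to compare those two cases is to JSON-serialize the items and measure the byte length; the real in-memory footprint will be higher, since the data is kept (and often copied) across nodes. The snippet below is only an illustrative TypeScript sketch, not anything n8n-specific; the `Item` shape and the sample data are made up:

```ts
// Illustrative only: estimate the serialized size of a set of items.
// The real memory usage will be larger, because the data is held
// (and often copied) across every node in the execution.

interface Item {
  json: Record<string, unknown>; // n8n-style wrapper, assumed for this sketch
}

function estimatePayloadBytes(items: Item[]): number {
  // UTF-8 byte length of the JSON string as a rough lower bound.
  return Buffer.byteLength(JSON.stringify(items), "utf8");
}

// 50,000 small items vs. 1 item carrying a huge field:
const manySmall: Item[] = Array.from({ length: 50_000 }, (_, i) => ({
  json: { id: i, name: `row-${i}` },
}));
const oneHuge: Item[] = [{ json: { blob: "x".repeat(100_000_000) } }];

console.log(`50k small items ~ ${(estimatePayloadBytes(manySmall) / 1e6).toFixed(1)} MB`);
console.log(`1 huge item     ~ ${(estimatePayloadBytes(oneHuge) / 1e6).toFixed(1)} MB`);
```

In this made-up example the single huge item weighs far more than all 50,000 small ones combined, which is exactly why there is no fixed item limit.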

Can you please provide some further information about what happens when it does not work? Is there, for example, some kind of error message?

Generally, breaking things down and using sub-workflows is the usual solution, as then not all of the item data has to be kept in memory the whole time.
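As a sketch of that idea: the parent workflow keeps only lightweight batch references (for example an offset and a limit) and hands each one to a sub-workflow, for instance via the Execute Workflow node, so the heavy item data only lives in memory for one batch at a time. The helper below is illustrative TypeScript only; the batch size and the `BatchRef` shape are assumptions, not an n8n API:

```ts
// Illustrative sketch: instead of loading all 50k items in the parent
// workflow, the parent only builds small "batch references", and each
// sub-workflow execution fetches and processes one batch on its own.

interface BatchRef {
  offset: number; // where this batch starts in the source data
  limit: number;  // how many items the sub-workflow should fetch
}

function buildBatchRefs(totalItems: number, batchSize = 1000): BatchRef[] {
  const refs: BatchRef[] = [];
  for (let offset = 0; offset < totalItems; offset += batchSize) {
    refs.push({ offset, limit: Math.min(batchSize, totalItems - offset) });
  }
  return refs;
}

// The parent workflow would loop over these refs and call the
// sub-workflow once per ref; only one batch of real data is ever
// held in memory at a time.
const refs = buildBatchRefs(50_000, 1000);
console.log(`${refs.length} sub-workflow runs of up to 1000 items each`);
```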

It usually stops with the error "Workflow execution process did crash for an unknown reason", yet the execution is marked as successful. But not all the batches are done.

So I have a node that takes a number of items, and when that node is done the workflow just stops and doesn't continue to the next node, even though everything is shown as green.

If I open the node that did the execution, all the values are correct and look good, but it didn't pass these items to the next node.

If the number of items is small, it works perfectly. This issue only happens when the number of items is huge.

So if the issue is with the memory:
Is there a way to automatically retrigger a workflow when it just stops?
Or to increase the memory size?

Or is it possible to clear the memory once a batch is done?