I have a workflow which must process a large data set coming in as the response to an HTTP request. The HTTP Request node correctly outputs the large JSON (ca. 1 MB, an array of ~20k elements).
I then use an Item Lists node to split out the elements of the array. This is where the workflow gets stuck. It just keeps processing and never finishes. When I look up the execution status (hours later), it says “Unknown”.
Trying the same thing with a smaller data set (a JSON array with a few hundred elements), everything works just fine. So I suppose this is some kind of undocumented limitation of the Item Lists node. Is there a way to make this work for large data sets?
That sounds like the data is too large, causing the n8n instance to run out of memory and crash. There are two ways to fix it:

- upgrade to a larger cloud instance, which has more memory
- reduce the amount of data that gets processed at any given point in time. This is normally done by splitting the workflow into a main workflow and a sub-workflow (older ticket which explains that); a sketch of this approach follows below
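For illustration, here is a minimal sketch of what the second option could look like in a Code node of the main workflow. The field name `data`, the batch size of 1000, and the idea of handing each batch to an Execute Workflow node are assumptions you would adapt to your own payload and sub-workflow:

```js
// Code node in the MAIN workflow (sketch, not a drop-in solution).
// Assumption: the HTTP Request node returned a single item whose
// json.data field holds the large array (~20k elements) -- adjust
// the field name to match your actual response.
const allElements = $input.first().json.data;

const BATCH_SIZE = 1000; // assumption: tune to what fits in memory

// Emit one n8n item per batch. Each of these items can then be passed
// to an Execute Workflow node, so the sub-workflow only ever handles
// BATCH_SIZE elements per call and memory usage stays bounded.
const batches = [];
for (let i = 0; i < allElements.length; i += BATCH_SIZE) {
  batches.push({ json: { batch: allElements.slice(i, i + BATCH_SIZE) } });
}

return batches;
```

The key design point is that the sub-workflow should return only a small result per batch (e.g. a count or a status), not the full batch, otherwise the data accumulates in the main execution again and you gain little.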
Thanks for your reply. We are currently using the “Starter” plan. Do you have any information related to the memory available in the various plans and how it affects the data size limit? I’m hesitant to upgrade without knowing if it will help.
The Starter plan comes with 320 MB, and each larger plan doubles the available memory (320 MB, 640 MB, 1,280 MB, and so on).
It is hard to say exactly how much larger the data can be. But the additional memory can make a big difference: n8n itself takes up memory just to run, even when nothing is executing, so a certain baseline amount is never available for executions. All of the additional memory in the larger plans, however, is fully available for the execution.
You can also simply upgrade and see if it helps. If it does not, you can write directly to our support to downgrade you again and refund the money (you can then also reference this community topic).
Thank you @jan for explaining. In the meantime we have implemented a different solution for this problem, but I will keep your advice in mind for future n8n projects.