Item Lists node gets stuck when processing a large data set

I have a workflow which must process a large data set coming in as the response to an HTTP request. The HTTP Request node correctly outputs the large JSON (ca. 1 MB, an array of ~20k elements).

I then use an Item Lists node to split out the elements of the array. This is where the workflow gets stuck: it just keeps processing and never finishes. When I look up the execution status hours later, it says “Unknown”.

Trying the same thing with a smaller data set (JSON with a few hundred elements), everything works just fine. So I suspect this is some kind of undocumented limitation of the Item Lists node. Is there a way to make this work for large data sets?

I’m using n8n cloud, version 0.216.2.

Welcome to the community @tweety!

It sounds like the data set is too large, so the n8n instance runs out of memory and crashes. There are two ways to fix this:

  • upgrade to a larger cloud instance with more memory
  • reduce the amount of data that gets processed at any given point in time. This is normally done by splitting the workflow into a main workflow and a sub-workflow, so that each sub-workflow execution only handles a small batch of items (older ticket which explains that)
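In case it helps, the batching idea can be sketched in plain JavaScript (for example inside a Code node, before handing each batch to the sub-workflow). This is a minimal sketch, not official n8n code; the chunk size of 500 is an arbitrary assumption you would tune to your data:

```javascript
// Split a large array into smaller batches so each sub-workflow
// execution only has to hold one chunk in memory at a time.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Example: ~20k elements split into batches of 500.
const data = Array.from({ length: 20000 }, (_, i) => ({ id: i }));
const batches = chunk(data, 500);
// Each entry of `batches` can now be passed to the sub-workflow in turn.
```

The main workflow then loops over the batches and calls the sub-workflow once per batch, so memory usage stays bounded by the batch size rather than the full data set.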

I hope that is helpful!

Hi @jan,

Thanks for your reply. We are currently on the “Starter” plan. Do you have any information about the memory available in the various plans and how it affects the data size limit? I’m hesitant to upgrade without knowing whether it will help.

Best wishes,
Thomas