I have a workflow that must process a large data set arriving as the response to an HTTP request. The HTTP Request node correctly outputs the large JSON (ca. 1 MB, an array of ~20k elements).
I then use an Item Lists node to split out the elements of the array. This is where the workflow gets stuck. It just keeps processing and never finishes. When I look up the execution status (hours later), it says “Unknown”.
Trying the same thing with a smaller data set (JSON with a few hundred elements), everything works just fine. So I suppose this is some kind of undocumented limitation of the Item Lists node. Is there a way to make this work for large data sets?
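In the meantime, one workaround I've been experimenting with (just a sketch, not verified against the Starter plan's limits) is splitting the array into smaller batches in a Code node before handing items downstream, so no single step has to hold all 20k elements at once. The `chunk` helper and the batch size of 500 below are my own choices, not anything from the n8n docs:

```javascript
// Sketch: split a large array into fixed-size batches so each
// batch can be processed independently instead of all at once.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Simulated payload: 20,000 elements, similar in shape to my data.
const data = Array.from({ length: 20000 }, (_, i) => ({ id: i }));
const batches = chunk(data, 500);
console.log(batches.length); // 40 batches of 500 items each
```

The batch size would need tuning: small enough that each batch fits comfortably in memory, large enough that the loop overhead stays reasonable.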
Thanks for your reply. We are currently using the "Starter" plan. Do you have any information on the memory available in the various plans and how it affects the maximum data size? I'm hesitant to upgrade without knowing whether it will help.