Seeking advice on optimizing a loop workflow to resolve an out-of-memory issue

I have a list of around 200 URLs in a Google Sheet. I use a loop node to process them in batches of 3 items each. However, the process stops at 33 items and shows as successfully completed (not failed). I'm not sure why this is happening. Could it be due to running out of memory? I am hosting this on a 1 GB server.

Is there any way to optimize this workflow? My current process is to fetch the URL list, process the URLs in batches of 3 items, take a screenshot of each page, convert the image file to base64 for Google Vision AI to read, and then update the Sheet cell with the output data.
I suspect the base64 data consumes too much RAM. Is there any possible solution for this?
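For context on that suspicion: base64 encoding maps every 3 input bytes to 4 output characters, so each encoded screenshot is roughly a third larger than the original image. A minimal Python sketch (not n8n code, and the 2 MB payload size is just an illustrative assumption) shows the overhead:

```python
import base64

# A simulated 2 MB "screenshot" payload (illustrative only).
raw = bytes(2 * 1024 * 1024)

encoded = base64.b64encode(raw)

# base64 turns every 3 input bytes into 4 output characters,
# so the encoded form is ~33% larger than the original.
print(len(raw))      # 2097152
print(len(encoded))  # 2796204
```

If many such encoded strings stay referenced across loop iterations, that overhead accumulates quickly on a 1 GB host.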

Thank you so much

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hey @Blueli,

Can you share the workflow JSON so we can see what is happening? Normally the best option when using a loop is to use a sub-workflow for the processing; that way, once the sub-workflow has finished, its memory is released before the next loop iteration.
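The idea behind the sub-workflow suggestion can be sketched in plain Python (this is an analogy, not n8n code; the URL list, payload sizes, and helper names are made up). Doing the heavy per-batch work inside a function means the large intermediate buffers are local and become collectable as soon as each batch returns, so only small summaries accumulate:

```python
import base64

def process_batch(batch):
    """Stand-in for the per-batch sub-workflow: screenshot ->
    base64 -> Vision AI. The large intermediate buffers are local,
    so they are released as soon as this function returns."""
    results = []
    for url in batch:
        screenshot = bytes(1_000_000)              # simulated screenshot
        encoded = base64.b64encode(screenshot)     # large, but short-lived
        results.append({"url": url, "encoded_size": len(encoded)})
    return results

def run(urls, batch_size=3):
    all_results = []
    for i in range(0, len(urls), batch_size):
        # Only the small per-URL summary survives each batch;
        # the big screenshot/base64 buffers do not pile up.
        all_results.extend(process_batch(urls[i:i + batch_size]))
    return all_results
```

In n8n the sub-workflow plays the role of `process_batch`: it runs, returns its (small) output to the parent workflow, and its working memory is freed before the next batch starts.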

Hello. It turns out the loop node stopped because the HTTP Request node returned no output data. I resolved it by enabling the "Always Output Data" option.
