Hello
I have a list of URLs in a Google Sheet, around 200 in total. I use a loop node to process them in batches of 3 items each. However, the process stops at 33 items and shows as successfully completed (not failed). I'm not sure why this might be happening. Could it be due to running out of memory? I am hosting this on a 1 GB server.
Is there any way to optimize this workflow? My current process is: fetch the URL list, process the URLs in batches of 3 items, take a screenshot of each page, convert the image file to base64 for Google Vision AI to read, and then update the sheet cell with the output data.
I suspect the base64 data consumes too much RAM. Is there any possible solution for this?
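To make the memory math concrete, here is a minimal plain Node.js/TypeScript sketch of that pipeline, not n8n itself; the helpers `takeScreenshot`, `callVisionAi`, and `updateSheetCell` are hypothetical stubs standing in for the real nodes. A 2 MB screenshot becomes roughly 2.7 MB of base64, so if the engine retains both copies for every processed item, 200 URLs could plausibly approach 1 GB:

```typescript
// Hypothetical stand-ins for the real steps (not n8n APIs):
async function takeScreenshot(url: string): Promise<Buffer> {
  return Buffer.alloc(2 * 1024 * 1024); // pretend ~2 MB PNG
}
async function callVisionAi(b64: string): Promise<string> {
  return `ocr result (${b64.length} chars of base64)`;
}
async function updateSheetCell(url: string, value: string): Promise<void> {
  console.log(url, "->", value);
}

async function run(urls: string[], batchSize = 3): Promise<void> {
  for (let i = 0; i < urls.length; i += batchSize) {
    for (const url of urls.slice(i, i + batchSize)) {
      const png = await takeScreenshot(url);
      const b64 = png.toString("base64"); // base64 is ~4/3 the binary size
      const text = await callVisionAi(b64);
      await updateSheetCell(url, text);
      // png and b64 go out of scope here. If the workflow engine instead
      // keeps every item's binary plus its base64 copy for the whole
      // execution, 200 screenshots can exhaust a 1 GB server.
    }
  }
}

run(Array.from({ length: 200 }, (_, i) => `https://example.com/${i}`)).catch(console.error);
```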
Can you share the workflow JSON so we can see what is happening? Normally the best option when using a loop is to use a sub-workflow for the processing; that way, once the sub-workflow has finished, its memory is released before the next loop iteration.
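To illustrate the idea outside n8n: the sub-workflow pattern is roughly equivalent to running each batch in its own short-lived child process, so the batch's memory is returned to the OS when it exits. A sketch, assuming a hypothetical worker script `process-batch.js` that does the screenshot/Vision/Sheets steps for the URLs it is given:

```typescript
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const execFileAsync = promisify(execFile);

// Run each batch of URLs in its own child process. When the child exits,
// its entire heap (screenshots, base64 strings) is freed, so memory cannot
// accumulate across batches the way it can inside one long execution.
async function runInBatches(urls: string[], batchSize = 3): Promise<void> {
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    await execFileAsync("node", ["process-batch.js", ...batch]);
  }
}
```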
Hello. It turns out the loop node stopping was caused by the HTTP node producing no output data. I have resolved it by enabling the Always Output Data option.
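For anyone hitting the same thing: when a node emits no items, the loop has nothing to continue with, so the run simply ends and is still reported as successful. Enabling Always Output Data makes the node emit an empty item instead. A rough fetch-based illustration of that behaviour (my own analogy, not n8n's actual implementation):

```typescript
// If the request fails or yields nothing, still emit a placeholder item so
// the downstream loop receives something and keeps iterating.
async function fetchOrEmpty(url: string): Promise<{ url: string; body: string }> {
  try {
    const res = await fetch(url);
    return { url, body: await res.text() };
  } catch {
    // Without this fallback the step would emit no item at all, and a loop
    // waiting on its output would stop while reporting a successful run.
    return { url, body: "" };
  }
}
```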
Thanks