Hello, my question is about the loop. My file contains many lines, and I would like to save them to the database 1,000 lines at a time, because when I try to save everything at once, n8n crashes.
I did it the way you suggested, but for some reason the Extract from File (CSV) node brings the container down: it keeps loading indefinitely until the container crashes.
The problem I'm facing is memory. I tried allocating more memory with NODE_OPTIONS=--max-old-space-size=8192, but it's still not enough. Is there any other way to solve this?
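For reference, this is roughly how that variable is set when n8n runs under Docker Compose. This is a hypothetical excerpt (the service name, volume, and the extra NODE_FUNCTION_ALLOW_BUILTIN line are assumptions, not from this thread); NODE_OPTIONS is a standard Node.js variable, and 8192 MB only helps if the host actually has that much free RAM:

```yaml
# Hypothetical docker-compose excerpt — adjust names/paths to your setup.
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    environment:
      - NODE_OPTIONS=--max-old-space-size=8192
      # Only needed if a Code node should read the CSV from disk later:
      - NODE_FUNCTION_ALLOW_BUILTIN=fs,readline
    volumes:
      - ./data:/data
```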
I would like to split the extraction from the file so it runs in batches of a thousand lines.
Have you tried using the offset options in the extract node? I would maybe save the binary file to disk while you work on it, then call a sub-workflow that reads the file and updates MySQL, returns the number of items worked on, and loops back round to the Execute Workflow node, roughly as sketched below.
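A minimal sketch of that sub-workflow's read step, assuming the parent already wrote the CSV to /data/upload.csv (the path, the plain comma-separated format with a header row, and the batch size are all assumptions), and that the Code node is allowed to use fs and readline via NODE_FUNCTION_ALLOW_BUILTIN:

```javascript
// n8n Code node ("Run Once for All Items") inside the sub-workflow.
// Streams the file line by line so only one batch is held in memory;
// a CSV with quoted fields would need a proper CSV parser instead.
const fs = require('fs');
const readline = require('readline');

// startRow is passed in by the parent via the Execute Workflow node.
const startRow = $('Execute Workflow Trigger').item.json.startRow ?? 0;
const batchSize = 1000;

const rl = readline.createInterface({
  input: fs.createReadStream('/data/upload.csv'),
  crlfDelay: Infinity, // treat \r\n as a single line break
});

let header = null;
let rowIndex = 0; // counts data rows, not the header
const items = [];

for await (const line of rl) {
  if (header === null) {
    header = line.split(',');
    continue;
  }
  if (line.trim() === '') continue; // skip blank lines
  if (rowIndex >= startRow && items.length < batchSize) {
    const values = line.split(',');
    const json = {};
    header.forEach((col, i) => { json[col.trim()] = values[i]; });
    items.push({ json });
  }
  rowIndex++;
  if (items.length >= batchSize) break; // stop streaming once the batch is full
}
rl.close();

// The MySQL node downstream inserts these; the parent advances
// startRow by the number of items returned.
return items;
```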
Based on this code, could you create a visual example for me, please? I'm a beginner and don't understand the context very well; I've been dealing with this issue for three days.
As mentioned, it was never going to just work; it was a rough example for you to fill in the blanks. In this case the error is telling you the node doesn't exist, which is correct, as the node is actually called Execute Workflow Trigger. I also missed out the .item part, so the correct expression would be something like {{ $('Execute Workflow Trigger').item.json.startRow }}.
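On the parent side, the loop bookkeeping could look something like this in a Code node placed after the Execute Workflow node. This is a sketch, not the exact workflow from this thread; the node name 'Loop Start' is illustrative, and it assumes the sub-workflow returned one item per row it processed:

```javascript
// Parent workflow Code node, after the Execute Workflow node returns.
// 'Loop Start' is a hypothetical Set node holding the current offset.
const processed = $input.all().length;
const previousStart = $('Loop Start').item.json.startRow ?? 0;

return [{
  json: {
    startRow: previousStart + processed,
    // A short batch means the end of the file was reached; an IF node
    // downstream can use this flag to stop looping back round.
    done: processed < 1000,
  },
}];
```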
Hey @Jon ,
Thank you for the response; I had already managed to solve it. I am running some tests, and as soon as I finish, I will share the solution with anyone who has the same problem.
Thank you for your patience.