I have created several workflows that copy data from an HTTP Request node.
When I run them manually, everything works fine (each takes no more than a minute to finish).
I built one workflow that executes those workflows one by one, and scheduled it to run once an hour.
And here is the problem: almost every time, this workflow crashes with an out-of-memory error (usually at the last sub-workflow). When I check the execution, it runs for 40-50 minutes before crashing, and it crashes on an HTTP Request node that receives only 500 records.
And again, when I run that workflow manually, it takes 15 seconds and the whole thing finishes.
Can I do something to avoid this error?
In general, n8n’s memory clears itself when a sub-workflow finishes, but you need to make sure you don’t return everything to the parent workflow.
You could add a Set node at the end of each sub-workflow, set to execute once, that returns only what’s absolutely necessary to the parent workflow (see the sketch below).
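A minimal sketch of the same idea using a Code node (“Run Once for All Items” mode) as the last node of the sub-workflow; the field names `status` and `recordCount` are just illustrative:

```javascript
// Last node of the sub-workflow: collapse whatever the previous
// nodes produced into a single small summary item, so the parent
// workflow never receives the full record set.
const items = $input.all();

return [
  {
    json: {
      status: 'ok',
      recordCount: items.length, // how many records were processed
    },
  },
];
```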
If it’s just getting a task done, like in your BigQuery workflow, you could simply “finish off” the workflow like so:
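As a sketch, assuming “finish off” means discarding all data before returning to the parent, a Code node equivalent would be:

```javascript
// Fire-and-forget ending: drop everything the sub-workflow
// produced and hand a single empty item back to the parent.
return [{ json: {} }];
```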