I have an orchestrator workflow that iterates over a list of URLs and passes each URL to a child workflow. The child workflow then loads the raw HTML, extracts data (htmlExtractContent), and saves the data to Airtable.
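To make the setup concrete, here is a minimal sketch of the pattern in plain JavaScript. The function names (`childWorkflow`, `extractContent`, `saveToAirtable`) are stand-ins for the n8n nodes, not real node APIs:

```javascript
// Stand-in for the child workflow: load HTML, extract, save, return only the URL.
function childWorkflow(url) {
  const html = `<html><body>content for ${url}</body></html>`; // stand-in for the HTTP request
  const data = extractContent(html);                           // stand-in for htmlExtractContent
  saveToAirtable(data);                                        // stand-in for the Airtable node
  return { url };                                              // child returns only a single URL
}

// Naive tag-stripping extractor, purely illustrative.
function extractContent(html) {
  return { text: html.replace(/<[^>]+>/g, '') };
}

const saved = [];
function saveToAirtable(data) {
  saved.push(data);
}

// Orchestrator: iterate over the URL list, invoking the child per URL.
const urls = Array.from({ length: 12 }, (_, i) => `https://example.com/page/${i}`);
const results = urls.map(childWorkflow);
```

The key point is that the parent loop only ever needs the `{ url }` return value; the extracted data itself should stay inside the child.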
On every main execution of the orchestrator workflow, the whole workflow simply stops somewhere between iteration 9 and 12.
I tried this numerous times, including with a timer trigger so the orchestrator workflow is not attached to my browser session.
None. However, I can see that the child workflow execution always dies while attempting to extract the HTML data.
Since this always occurs in the same range of loop iterations, and always while extracting HTML data, I suspect some sort of memory overflow. However, I don't know exactly where it happens or how to fix it. The child workflow only returns a single URL, so my guess is that the data extracted from the HTML in earlier iterations is not being freed. My understanding was that each child workflow has its own memory that should be cleared after its execution, but maybe that is not the case? Or maybe some context data leaks into the main workflow?
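To illustrate the hypothesis: if the parent were somehow retaining each child's full output (including the extracted data) rather than just the URL, memory would grow with every iteration until the runner hits its limit. This is a hypothetical model of that accumulation, not actual n8n behavior:

```javascript
// Hypothetical: each child returns its full extracted payload, and the
// parent keeps every result. Memory use then scales with iteration count.
const retained = [];

function runChild(url) {
  const extracted = 'x'.repeat(1024 * 1024); // stand-in for ~1 MB of extracted HTML data
  return { url, extracted };                 // full payload handed back to the parent
}

for (let i = 0; i < 12; i++) {
  // Every iteration's payload stays referenced from the parent's array,
  // so nothing can be garbage-collected until the loop finishes.
  retained.push(runChild(`https://example.com/${i}`));
}
// After 12 iterations the parent holds roughly 12 MB of extracted data,
// even though only the URLs were needed.
```

If something like this is happening inside the orchestrator's execution data, it would explain why the crash lands in the same iteration range every run.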
- **n8n version:** 1.1.1 Cloud