Sorry to hear that you are having problems! To get that fixed, two steps are important:
- **Understand how n8n works**
  Data in an execution never gets “freed up”. Once data is loaded, it stays in memory until the workflow execution is finished. That said, we plan to change this in the future: there will be a mode which, if enabled, frees up data after every node. That will obviously come with a big performance hit, but for use cases like this it should not matter if an execution runs slower, as long as it runs successfully. Once that mode exists, your plan “A” should also work. Sadly, there is no ETA for it yet.
- **Using that understanding to fix the workflow**
  You have to make sure that the data never ends up in an execution of the main workflow. Your plan “B” should theoretically work totally fine, but I assume that the data of each iteration ends up in the main workflow again. Most likely that is because the last node of the second workflow (the one that does the import) still contains the whole data set, and so it sends all of that data back to the main workflow. The solution is to make sure that the last node sends back as little data as possible, ideally a single empty JSON object. The easiest and most efficient way to do that is to add a Set-Node to the end of the second workflow, activate the option “Keep Only Set”, and under “Settings” additionally enable “Execute Once”.
Hope that helps!
Example Set-Node:
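For reference, here is a rough sketch of what such a Set-Node could look like when exported as workflow JSON (the kind you can paste straight onto the n8n canvas). The exact keys and `typeVersion` can vary between n8n versions, and the `position` values are arbitrary, so treat this as an illustration rather than a drop-in snippet:

```json
{
  "nodes": [
    {
      "parameters": {
        "keepOnlySet": true,
        "values": {},
        "options": {}
      },
      "name": "Set",
      "type": "n8n-nodes-base.set",
      "typeVersion": 1,
      "position": [900, 300],
      "executeOnce": true
    }
  ],
  "connections": {}
}
```

With “Keep Only Set” active and no values defined, the node outputs a single empty item, and “Execute Once” makes sure it runs only one time per execution, so only that one tiny item travels back to the main workflow.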