Describe the problem/error/question
I am trying to iterate through a list of ~600 MP3 files, downloading each via an HTTP request and copying it to a cloud storage location. I have tried splitting the load into batches so that no more than 10 or so recordings are in memory at a time. However, every time it gets to around file 400, n8n crashes with a memory issue (usually surfaced as a 502, 503, 505, or 404 error).
I have tried batching the load, and even running each batch as a separate execution, but no luck. Is there any way to get something like this pushed through n8n, or is n8n just not the right tool for this job?
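For context, the batching described above can be sketched in plain JavaScript (the kind of logic a SplitInBatches / Loop Over Items node performs internally). The URL pattern below is a placeholder, not from the original workflow:

```javascript
// Split a list of items into fixed-size batches, so only one
// batch of downloads needs to be held in memory at a time.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Example: 600 file URLs split into 60 batches of 10.
const urls = Array.from({ length: 600 }, (_, i) => `https://example.com/file-${i}.mp3`);
const batches = chunk(urls, 10);
```

Note that batching only limits how many downloads are in flight at once; as the thread below points out, it does not by itself release the data from earlier batches.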
What is the error message (if any)?
Please share your workflow
I have also tried this in the following ways:
Share the output returned by the last node
Information on your n8n setup
n8n version: 0.230.3
Database (default: SQLite): default
n8n EXECUTIONS_PROCESS setting (default: own, main): default
Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
Operating system: MacOS Ventura 13.3.1
Hi, have you tried using a local version of n8n?
We have a self-hosted version and were hitting this limit, so we opted to try the cloud version.
The self-hosted version got a bit further before crashing, but ultimately suffered the same problem.
Maybe you can try adding the setting that saves binary data to the filesystem instead of keeping it in memory.
Follow this link.
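For reference, on a self-hosted instance this suggestion corresponds to the `N8N_DEFAULT_BINARY_DATA_MODE` environment variable. A minimal Docker invocation might look like the sketch below (the port and image tag are the defaults; this setting is not configurable on n8n cloud):

```shell
# Store binary data (e.g. downloaded MP3s) on disk instead of in memory.
docker run -it --rm \
  -e N8N_DEFAULT_BINARY_DATA_MODE=filesystem \
  -p 5678:5678 \
  n8nio/n8n
```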
Thanks for that suggestion! Unfortunately, that made no difference.
I do see some mention of batching, but are you also using sub-workflows? Batching alone is not enough, because it does not clear the data before moving on to the next batch.
I have tried that as well, but it only gets about two or three loops in when I do it that way.
Then you are probably not clearing the data before sending it back to the main flow.
This may be a silly question, but how can I set it to do that?
You could use a Set node before finishing your sub-workflow, returning only a single empty (or very small) item, like the “Prepare response” node in this example.
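To make that concrete, here is a hedged sketch of what the final step of the sub-workflow could do, written as a plain function in the style of an n8n Code node (the function name and summary field are illustrative, not from this thread):

```javascript
// Illustrative sketch: the sub-workflow's last node returns one tiny
// item instead of the processed files, so the heavy binary data is
// garbage-collected rather than flowing back to the parent workflow.
function prepareResponse(items) {
  // Drop all binary payloads; report only a small summary.
  return [{ json: { processed: items.length } }];
}

// Usage: pretend ten heavy items just finished processing.
const result = prepareResponse(new Array(10).fill({ binary: {} }));
```

The equivalent with a Set node is to enable “Keep Only Set” (so no input fields pass through) and set a single small field.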