I am auto-generating videos at scale. I currently have 70 videos of about 6-8 MB each, and my workflow worked perfectly when I ran it for a single video. The problem is that n8n times out when I try to run the node for all 70 videos.
Hi @buildingthings, I am very sorry you’re having trouble.
To avoid hitting resource limits when processing a large number of files, I suggest splitting your workflow into two separate workflows: one “parent” (fetching your Sheet with the individual URLs) and one “child” (doing the heavy lifting of first downloading a file, then uploading it to your Google Cloud Storage).
You can then use the Split In Batches node in your parent workflow to split your data into small batches of maybe 5 URLs at a time. The parent would call the child workflow through the Execute Workflow node for each batch.
The advantage of this approach is that the resources required by a child execution become available again as soon as that execution finishes, provided you return only a very small (or possibly empty) result to the parent (see the sketch below the workflow examples). So instead of having to keep all 70 videos in memory at once, your n8n instance only needs to hold 5 videos at a time.
Here’s how this could look:
Parent workflow
Child workflow
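To make sure the child execution only hands a tiny result back to the parent, its very last node could be a Function/Code node that drops the binary data and returns a small JSON summary instead. A minimal sketch (the `fileName` and `uploaded` fields are just illustrative names, not something your workflow necessarily has):

```js
// Last node of the child workflow (Function/Code node):
// return only a small JSON summary and no binary property, so the memory
// used for the downloaded video is released once this execution finishes.
return items.map((item) => ({
  json: {
    fileName: item.json.fileName || null, // illustrative – adjust to your data
    uploaded: true,
  },
  // deliberately no `binary` key here, so the video itself is not passed back
}));
```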
On a slightly related note, you probably also want to set the `N8N_DEFAULT_BINARY_DATA_MODE=filesystem` environment variable to avoid keeping large amounts of binary data in memory.
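How exactly you set it depends on how you host n8n; as a rough sketch, for a plain npm install you could export it before starting n8n, and for Docker you would pass it as an environment variable (adjust the rest of the command to your own setup):

```bash
# Plain install: export the variable before starting n8n
export N8N_DEFAULT_BINARY_DATA_MODE=filesystem
n8n start

# Docker: pass it with -e (or via the environment section of your compose file)
docker run -it --rm \
  -e N8N_DEFAULT_BINARY_DATA_MODE=filesystem \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n
```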