Best way to handle large file attachments in bulk workflows?

I’m running n8n on a self-hosted instance with the following setup:

sudo docker run -d --restart unless-stopped -it \
  --name n8n \
  -p 5678:5678 \
  -e N8N_HOST="example.com" \
  -e WEBHOOK_URL="https://example.com/" \
  -e N8N_ENABLE_RAW_EXECUTION="true" \
  -e NODE_FUNCTION_ALLOW_BUILTIN="crypto" \
  -e NODE_FUNCTION_ALLOW_EXTERNAL="" \
  -v /home/user/.n8n:/home/node/.n8n \
  n8nio/n8n

I frequently send bulk emails and WhatsApp messages with attachments (~5MB per file) to hundreds of recipients. I’m concerned that these files are held in RAM, which could crash the VM.

With a VM that has 1 GB of RAM and 30 GB of storage this could become a problem, and even if I increase those limits, at some point they will not be enough.

Is there a way to discard these files immediately after processing? Can this be managed with a specific node, or is there a setting that prevents n8n from holding them in memory or storing them unnecessarily? In other words, is it something I have to set in the docker run command above?

Information on your n8n setup

  • n8n version: 1.85
  • Database (default: SQLite): SQLite
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own
  • Running n8n via (Docker, npm, n8n cloud, desktop app): GCP docker
  • Operating system: Windows 10

Hi, I don’t see the problem of files “staying in memory”. As long as the file size is not too large for a single processing loop, it most likely won’t be an issue. What matters most is managing the storage of the execution data: do you need to store the execution data of successful executions? Do you need to keep it for days? That is where your SQLite file will grow considerably.

That’s it, I don’t need it at all. If I generate custom PDF presentations of 10 MB per file, each one should be sent to the recipient and then, as soon as that has completed, discarded from the n8n environment. So how would I change that?

hi,

All the necessary env variables are documented here:

You can set them however you like; that is up to you.

regards,
J.

Please mark my answer as the solution if it has helped you. thanks.
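For reference, here is a minimal sketch of how the execution-data variables could be added to the docker run command from the original post. The variable names (`EXECUTIONS_DATA_SAVE_ON_SUCCESS`, `EXECUTIONS_DATA_PRUNE`, `N8N_DEFAULT_BINARY_DATA_MODE`, etc.) are from the n8n environment-variable docs; the specific values are assumptions for this use case, not the only valid choice:

```shell
# Sketch only: execution-data settings added to the original docker run.
# Values are assumptions for this "discard after sending" use case.
sudo docker run -d --restart unless-stopped -it \
  --name n8n \
  -p 5678:5678 \
  -e N8N_HOST="example.com" \
  -e WEBHOOK_URL="https://example.com/" \
  -e EXECUTIONS_DATA_SAVE_ON_SUCCESS="none" \
  -e EXECUTIONS_DATA_SAVE_ON_ERROR="all" \
  -e EXECUTIONS_DATA_PRUNE="true" \
  -e EXECUTIONS_DATA_MAX_AGE="24" \
  -e N8N_DEFAULT_BINARY_DATA_MODE="filesystem" \
  -v /home/user/.n8n:/home/node/.n8n \
  n8nio/n8n
```

With `EXECUTIONS_DATA_SAVE_ON_SUCCESS="none"` the data of successful runs is not saved at all, keeping error data (`"all"`) helps debugging failed sends, and pruning with a max age of 24 hours caps how large the SQLite file can grow. `N8N_DEFAULT_BINARY_DATA_MODE="filesystem"` writes binary data (your PDFs) to disk instead of holding it in the database.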

I assume you’re putting everything into one very large workflow.

Even if the execution data isn’t saved, it still remains until the execution is finished.

What might help is to move the creation and sending of a document into its own sub-workflow.

This keeps the main workflow free from binary data.

The sub-workflow stays smaller and also clears the memory after handling your document.

