Hi there,
I was just looking for advice on how to handle my specific situation:
I’m using n8n to upload fairly large (10GB) videos to YouTube. To do this, I use the Read Binary Files node and pass its output to the YouTube node…
I’ve had to set:
- N8N_DEFAULT_BINARY_DATA_MODE=filesystem

because I don’t have enough RAM (I think I’d need more than the 12GB I have).
The questions I have:
Is it worth upgrading my RAM? For example, 16GB should cover a big upload (as long as I only run one at a time). Does binary data get removed when the workflow ends, or would I have to use something like:
- EXECUTIONS_DATA_PRUNE=true
- EXECUTIONS_DATA_PRUNE_MAX_COUNT=1
- or another way to stop n8n saving the execution in history?
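For context, this is roughly how I imagine those settings would sit alongside the filesystem mode in my compose file (just a sketch; the service name and image are assumptions, the environment variables are the ones mentioned above):

```yaml
# Sketch of the relevant docker-compose service config (service name assumed)
services:
  n8n:
    image: n8nio/n8n
    environment:
      # store binary data on disk instead of in memory
      - N8N_DEFAULT_BINARY_DATA_MODE=filesystem
      # prune old executions so their stored data doesn't pile up
      - EXECUTIONS_DATA_PRUNE=true
      - EXECUTIONS_DATA_PRUNE_MAX_COUNT=1
```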
- Or is it best practice to set a custom location for binary data? That way I could use an old drive I don’t care about wearing out, maybe in Docker with:

```yaml
volumes:
  - /DriveIDontCareAbout:/home/node/.n8n/binaryData
```
EDIT: Actually, I don’t think this would work; instead I’d probably want to mount a thumb drive or something on the host onto /home/myuser/docker/n8n/binaryData via `mount --bind`.
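To illustrate what I mean by the bind mount (a sketch only; the device name and /mnt/thumbdrive mount point are made-up examples):

```shell
# Mount the spare drive somewhere on the host (device path is an assumption)
sudo mount /dev/sdX1 /mnt/thumbdrive

# Bind-mount it over the directory that Docker maps into the container
sudo mount --bind /mnt/thumbdrive /home/myuser/docker/n8n/binaryData

# To survive reboots, an /etc/fstab entry along these lines:
# /mnt/thumbdrive  /home/myuser/docker/n8n/binaryData  none  bind  0  0
```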
I just wanted to see if there was a better way to do what I’m doing, as I’m concerned about constantly writing 10GB files to my SSD and then deleting them straight away!
Many thanks!