Best approach for large binary files?

Hi There,

I was just looking for advice on how to handle my specific situation:

I’m using n8n to upload fairly large (10GB) videos to YouTube. To do this, I use the Read Binary Files node before handing the output to the YouTube node…

I’ve had to use:

      - N8N_DEFAULT_BINARY_DATA_MODE=filesystem

Because I don’t have enough RAM (I think I’d need more than the 12GB I have).
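For reference, here’s roughly how that setting looks in a docker-compose file (a minimal sketch; image name and service layout are just placeholders for my setup):

```yaml
# Sketch only: store binary data on disk instead of in memory,
# so a 10GB upload doesn't have to fit entirely in RAM.
services:
  n8n:
    image: n8nio/n8n
    environment:
      - N8N_DEFAULT_BINARY_DATA_MODE=filesystem
```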

The questions I had:

Is it worth upgrading my RAM? For example, if I had 16GB that would cover a big upload (as long as it’s one at a time). Does binary data get removed after the workflow ends, or would I have to use something like:

     - EXECUTIONS_DATA_PRUNE=true
     - EXECUTIONS_DATA_PRUNE_MAX_COUNT=1
  • Or another way to avoid n8n saving the execution in history?
  • Or is there a best practice for setting a custom location to save binary data to (so I could use an old drive I don’t care about wearing out)? Maybe in Docker with:
    volumes:
      - /DriveIDontCareAbout:/home/node/.n8n/binaryData

EDIT: Actually, I don’t think this would work; instead I’d probably want to mount a thumb drive or something on the host onto /home/myuser/docker/n8n/binaryData via mount --bind.
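Assuming that bind-mount approach, the compose side might look something like this (a sketch only; the host path matches my edit above, and the container path is n8n’s default binary data location):

```yaml
# Sketch: the host directory below would first be bind-mounted onto the
# spare drive on the host (mount --bind), then mapped into the
# container's default binary data location.
services:
  n8n:
    image: n8nio/n8n
    environment:
      - N8N_DEFAULT_BINARY_DATA_MODE=filesystem
    volumes:
      - /home/myuser/docker/n8n/binaryData:/home/node/.n8n/binaryData
```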

I just wanted to see if there was a better way to do what I’m doing, as I’m concerned about constantly writing 10GB files to my SSD and deleting them straight away!

Many thanks!

@UnluckyForSome Looks good. I would avoid USB storage. Your offloading to the filesystem is the right approach; it should limit RAM usage and work on a 16GB instance. If you can, 32GB would be better (if using Docker, make sure you change the RAM allocation).

You could use an HDD for binary data; you may see slightly worse performance, but tbh it’s likely okay.

Pruning should work, though I’m not sure whether it only covers execution data in the DB, as opposed to RAM/disk (of course, the DB uses space too, depending on where it’s hosted).

You shouldn’t have any issues unless you’re running a super cheap SSD. Of course, make sure you’re running backups of both the OS and n8n, just for safety.

Oh, and you could look into N8N_CONCURRENCY_PRODUCTION_LIMIT and set it to 1 so you don’t overload with concurrent executions, depending on your triggers etc.
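That would just be another environment entry alongside the binary data mode (sketch; whether 1 is the right value depends on your triggers):

```yaml
# Sketch: allow at most one production execution at a time,
# so only one 10GB upload is in flight at once.
services:
  n8n:
    environment:
      - N8N_CONCURRENCY_PRODUCTION_LIMIT=1
```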

@UnluckyForSome hope this helps.

Samuel

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.