Optimizing uploads of 500MB+ binaries to FTP

Describe the problem/error/question

I am currently running into out-of-memory issues with files larger than 500MB that I want to upload to an FTP server.

Reading through the n8n code, I see that it is possible to obtain a readable stream from the file, which should solve the problem. I also tested locally with “promise-ftp”, and that should upload it correctly.
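
For context, this is roughly what I tested locally; the host, credentials and file paths are just placeholders:

```js
// Minimal local test: stream a large file to the FTP server instead of
// loading it into memory first. Host, credentials and paths are placeholders.
const fs = require('fs');
const PromiseFtp = require('promise-ftp');

const ftp = new PromiseFtp();

ftp
  .connect({ host: 'ftp.example.com', user: 'username', password: 'secret' })
  .then(() => {
    // put() accepts a ReadableStream, so the 500MB+ file is streamed in chunks
    // rather than buffered as a whole.
    return ftp.put(fs.createReadStream('/tmp/combined.mp4'), '/upload/combined.mp4');
  })
  .then(() => ftp.end())
  .catch((err) => {
    console.error('Upload failed:', err);
    return ftp.end();
  });
```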

I have two questions: how can I avoid reading the full 500MB binary into memory, and does anyone have general optimization tips? Those are always welcome.

About my workflow: it accepts an HTTP request and downloads a collection of video files from an S3 bucket. All of those video files are then combined into one video file, which is finally uploaded to an FTP server.

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 1.25.0
  • Database (default: SQLite): PostgreSQL
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker (via DigitalOcean droplet)
  • Operating system: Linux

Try storing the binary data to disk, because the default mode keeps it in memory.
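
Since you are running n8n in Docker, that means setting the binary data mode on the n8n container, for example in docker-compose (the excerpt below is just a sketch, the service definition will differ in your setup):

```yaml
# docker-compose.yml (excerpt): keep binary data on disk instead of in memory
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    environment:
      - N8N_DEFAULT_BINARY_DATA_MODE=filesystem
```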

Thank you for your response! I have already set “N8N_DEFAULT_BINARY_DATA_MODE” to “filesystem”. As far as I understand, this only affects how binaries are written, not how they are read, right?

The problem I am describing is that I have to read the binary before I can upload it to the FTP server, and that read is most likely what causes the out-of-memory issues.
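
What I found in the n8n source is a binary data helper that returns a stream by id. If that helper is reachable from a Code node (I have not confirmed that it is), I imagine something along these lines would avoid buffering the whole file; the helper name, its signature and the `id` property are assumptions based on my reading of the source:

```js
// Sketch only: assumes this.helpers.getBinaryStream() is exposed to the Code node
// and that items[0].binary.data.id points at the binary stored on disk
// (N8N_DEFAULT_BINARY_DATA_MODE=filesystem). Not verified against 1.25.0.
const PromiseFtp = require('promise-ftp');

const binary = items[0].binary.data;
const stream = await this.helpers.getBinaryStream(binary.id);

const ftp = new PromiseFtp();
await ftp.connect({ host: 'ftp.example.com', user: 'username', password: 'secret' });
await ftp.put(stream, `/upload/${binary.fileName}`);
await ftp.end();

return items;
```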