Download large file and upload to OneDrive

Yes, as @Jon said, n8n keeps data in memory, so working with large files or huge datasets can become a problem really fast, especially when using multiple nodes or running workflows manually. You'd need to monitor your memory usage very closely to avoid running into out-of-memory errors.

Personally, I'd only use n8n to execute the commands that upload such large files to cloud storage, rather than actually fetching them into n8n. So my suggested approach would be to set up something like rclone on the host (which can handle chunking, and optionally compression, for you when needed), then configure the OneDrive connection in there.

That way, you'd still be able to use n8n to control the flow and handle errors through the SSH node, but wouldn't have to worry about memory consumption.
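As a rough sketch of what the SSH node could run (the remote name `onedrive` and the paths are placeholders you'd set up yourself via the interactive `rclone config`):

```shell
# One-time setup on the host: creates a remote, e.g. named "onedrive"
rclone config

# Command the n8n SSH node would execute for each transfer.
# --onedrive-chunk-size controls upload chunking (must be a multiple
# of 320 KiB); a non-zero exit code lets n8n route to error handling.
rclone copy /data/large-file.bin onedrive:backups \
  --onedrive-chunk-size 250M \
  --progress
```

Since rclone streams the file in chunks from disk, n8n itself never holds the file in memory, and the SSH node only sees the command's exit status and log output.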

On the n8n front I know @kik00 was looking into improving handling of large files (but I’m currently keeping him busy with cloud-related requests). So the above suggestion might change in the future :slight_smile: