I am looking for the best way to download a large file (a tar file of a directory on my website host - not in public_html) and move it to a OneDrive folder.
I SSH into the host and create the tar, but I'm not sure what the best workflow is to get the file and upload it.
If I download the file, where is it stored?
I could move the file to the FTP directory on my host and download it via FTP, but I'm not sure how to get it up to OneDrive (there is a size limit and my file might be > 1 GB).
If I download it and can access it, I could sync it to OneDrive using rsync or similar outside of n8n.
My n8n install is on an old Linux laptop running Docker.
It sounds like it is all possible, which is handy. What you could do is use the SSH node to access the server and create the archive, then use the FTP node set to SFTP to download it.
Where it gets tricky is that I don’t think n8n writes the file to disk until you tell it to, so it may sit in memory (@MutedJam may need to confirm this). It would then be a case of adding the OneDrive node, telling it to use the binary data property, and you should be sorted.
If the file is larger than your limit, it may be a case of splitting the archive into chunks in the first step, uploading the chunks, and reassembling them later.
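The archive-and-chunk step could be sketched roughly like this on the host (the directory, file names, and the 900M chunk size are just placeholders; the demo data stands in for the real site directory):

```shell
#!/bin/sh
# Sketch of the archive-and-split approach; paths and sizes are illustrative.
set -e

SRC_DIR=demo_site
ARCHIVE=site-backup.tar.gz

# Demo data standing in for the real site directory.
mkdir -p "$SRC_DIR"
printf 'example content\n' > "$SRC_DIR/index.html"

# 1. Create the compressed archive (run on the host via the SSH node).
tar -czf "$ARCHIVE" "$SRC_DIR"

# 2. Split into chunks below the upload limit (e.g. 900 MB for a 1 GB cap).
#    Produces site-backup.tar.gz.part-aa, -ab, ...
split -b 900M "$ARCHIVE" "$ARCHIVE.part-"

# 3. After uploading, the chunks concatenate back into the original archive:
cat "$ARCHIVE".part-* > reassembled.tar.gz
cmp "$ARCHIVE" reassembled.tar.gz && echo "archive reassembles cleanly"
```

The `cmp` at the end is just a sanity check that `cat`-ing the parts in order reproduces the original byte for byte.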
Yes, as @Jon said, n8n would keep the data in memory, so working with large files or huge datasets can become a problem really fast, especially when using multiple nodes or manually running workflows. You'd need to monitor your memory usage very closely to avoid running into out-of-memory errors.
Personally, I’d only use n8n to execute the commands that upload such large files to cloud storage, rather than actually fetching them into n8n. So my suggested approach would be to set up something like rclone on the host (it handles compression and chunking for you when needed), then configure the OneDrive connection in there.
That way, you’d still be able to use n8n to control the flow and handle errors through the SSH node, but wouldn’t have to worry about the memory consumption.
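A minimal sketch of that idea, assuming an rclone remote named `onedrive` has already been set up with the interactive `rclone config` (the archive path and remote name are hypothetical):

```shell
#!/bin/sh
# Build the upload command the n8n SSH node would execute on the host.
# rclone talks to OneDrive directly and uploads in chunks, so the file
# never has to pass through n8n's memory.
set -e

ARCHIVE=/home/user/backups/site-backup.tar.gz  # hypothetical path
DEST="onedrive:backups/$(date +%F)"            # "onedrive" = your rclone remote

# Write the command to a small script the SSH node can run.
echo "rclone copy $ARCHIVE $DEST --progress" > upload.sh
cat upload.sh
```

n8n then only sees the command's exit code and log output, which is enough to branch on errors in the workflow.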
On the n8n front, I know @kik00 was looking into improving the handling of large files (but I’m currently keeping him busy with cloud-related requests), so the above suggestion might change in the future.
I did set up the SFTP and downloaded a small test file, which it stored as a binary file, so I couldn’t get to it.
I have used rclone before, so I will add that to my Linux laptop where n8n is installed (I can’t add it to the shared website host) and play with controlling it via its API to initiate the download from my host and the upload to OneDrive, as suggested.
> I did setup the sftp and downloaded a small test file which it stored as a binary file so couldn’t get to it.
This sounds somewhat unexpected to me though, to be honest. While I think large files will cause trouble, the process of downloading a file from FTP and then uploading it to OneDrive should still work.
If it helps anyone, I got rclone working as a Docker instance on the same machine.
I did it that way because the Docker instance also starts the remote control and web GUI (thinking no-code access) automatically.
If this seems like a solution for large file transfers, there is a helpful page here -
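For reference, that setup might look something like the following sketch. The container name, port, credentials, and remote names are all assumptions on my part; `sync/copy` is one of the endpoints in rclone's remote-control (rc) API, and the mounted config directory must already contain a configured `onedrive` remote:

```shell
# Start rclone in Docker with the remote-control API and web GUI enabled.
docker run -d --name rclone -p 5572:5572 \
  -v ~/.config/rclone:/config/rclone \
  rclone/rclone rcd --rc-web-gui \
  --rc-addr :5572 --rc-user admin --rc-pass secret

# n8n's HTTP Request node (or curl) can then trigger a transfer, e.g.
# copying from an SFTP remote straight to OneDrive:
curl -u admin:secret -X POST http://localhost:5572/sync/copy \
  -H "Content-Type: application/json" \
  -d '{"srcFs": "sftp-host:backups", "dstFs": "onedrive:backups"}'
```

Driving it over HTTP like this keeps the transfer entirely inside rclone, with n8n only orchestrating the calls.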
Now the flow runs and:
- reads from a database,
- creates a backup on the remote system via SSH,
- creates a new folder on OneDrive,
- transfers the backup file from the host to the new OneDrive folder using rclone, as it handles big files best,
- and a few other bits and bobs.
Ahhh, it took me a really long time to understand what you wanted to tell me, but the solution was so easy. I just created a folder in OneDrive via the n8n node and got exactly the response you sent me. A thousand thanks.
Greetings Jens