Hey y’all! I’m a long-time coder but a first-time n8n user. I’m really enjoying the system so far, but I’m having trouble with one of the tasks I’m trying to automate. Basically, I take backups of part of my system, which end up at around 20GB+ zipped, and I’m trying to upload them to an S3-compatible service. My workflow was fine for small test files, but when I tried the large file my entire system locked up; I had to power cycle just to escape. My fear is that when I ran the Read/Write node, it tried loading the entire 20GB+ file into memory (which I don’t have 20GB+ of) and crashed the system.
My question is: what is the protocol for uploading massive files? Is it just not possible? I’m very new, so it’s totally possible there’s a node I’m missing.
Thank you so much for the quick reply @barn4k. That got me past the read file node; however, my S3 node now fails with the following (maybe there is an environment variable to change this?):
n8n version: 1.93.0 (Self Hosted)
Stack trace:
RangeError: File size (19977593782) is greater than 2 GiB
at readFileHandle (node:internal/fs/promises:536:11)
at FileSystemManager.getAsBuffer (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/binary-data/file-system.manager.js:47:16)
at BinaryDataService.getAsBuffer (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/binary-data/binary-data.service.js:140:20)
at getBinaryDataBuffer (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/execution-engine/node-execution-context/utils/binary-helper-functions.js:62:12)
at Object.getBinaryDataBuffer (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/execution-engine/node-execution-context/execute-context.js:33:69)
at ExecuteContext.execute (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/dist/nodes/S3/S3.node.js:717:22)
at WorkflowExecute.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/execution-engine/workflow-execute.js:696:27)
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/execution-engine/workflow-execute.js:930:51
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/execution-engine/workflow-execute.js:1266:20
Unfortunately, I think uploading this requires the S3 CLI, unless I get very technical and pre-sign every chunk and upload it with curl or something crazy.
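For reference, here’s roughly what that would look like if I scripted the presigning with boto3 instead of doing it by hand - just an untested sketch, and every name in it (endpoint, bucket, key, file path) is a placeholder for my setup:

```python
# Sketch of a presigned multipart upload. All names below are placeholders.
import boto3
import urllib.request

ENDPOINT = "https://s3.example.com"    # placeholder: your S3-compatible endpoint
BUCKET, KEY = "backups", "backup.zip"  # placeholder bucket/key
PART_SIZE = 100 * 1024 * 1024          # 100 MiB per part (S3's minimum is 5 MiB)

s3 = boto3.client("s3", endpoint_url=ENDPOINT)
upload_id = s3.create_multipart_upload(Bucket=BUCKET, Key=KEY)["UploadId"]

parts = []
with open("/backups/backup.zip", "rb") as f:  # placeholder local path
    part_number = 1
    # Read one bounded chunk at a time so memory use stays at PART_SIZE.
    while chunk := f.read(PART_SIZE):
        # Presign a URL for this part; any HTTP client (curl included) can PUT to it.
        url = s3.generate_presigned_url(
            "upload_part",
            Params={"Bucket": BUCKET, "Key": KEY,
                    "UploadId": upload_id, "PartNumber": part_number},
            ExpiresIn=3600,
        )
        req = urllib.request.Request(url, data=chunk, method="PUT")
        with urllib.request.urlopen(req) as resp:
            # S3 returns an ETag per part; you need all of them to finish the upload.
            parts.append({"ETag": resp.headers["ETag"], "PartNumber": part_number})
        part_number += 1

s3.complete_multipart_upload(
    Bucket=BUCKET, Key=KEY, UploadId=upload_id,
    MultipartUpload={"Parts": parts},
)
```

(If you can run Python next to the file anyway, boto3’s plain upload_file does the multipart chunking for you; the presigned-URL dance only matters when some other client has to do the PUTs.)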
I found this thread with a user mentioning my same issue - it seems the limitation is known but hasn’t been addressed. I may try to implement it myself or something in the meantime.
Yes, for that to work you will need to create a multipart upload. Honestly, the easiest option would be to use an AWS Lambda function with Python and the boto3 library, where you pass a URL via the Lambda node and the function uploads the file to the S3 bucket and returns the bucket + key as its output.
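Something along these lines - an untested sketch, where the event field names and the bucket are placeholders for whatever your n8n Lambda node actually sends:

```python
# Sketch of a Lambda that streams a URL into S3. Names below are placeholders.
import urllib.request

import boto3

s3 = boto3.client("s3")
BUCKET = "my-backup-bucket"  # placeholder: your destination bucket

def lambda_handler(event, context):
    # Assumed payload from the n8n Lambda node, e.g. {"url": "...", "key": "backup.zip"}
    source_url = event["url"]
    key = event.get("key", "backup.zip")

    # upload_fileobj consumes the HTTP response as a stream and performs the
    # multipart upload itself, so only one part at a time is held in memory.
    with urllib.request.urlopen(source_url) as body:
        s3.upload_fileobj(body, BUCKET, key)

    return {"bucket": BUCKET, "key": key}
```

Just keep Lambda’s 15-minute execution limit in mind - a 20GB file needs a sustained rate of roughly 23 MB/s to finish in time.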