Uploading large (1+GB) files via AWS S3 node

Describe the problem/error/question

I’m attempting to upload a large (1.5 GB) file via the AWS S3 node and it fails. When I try with a smaller file (a few megabytes), it works.

Where can I find out what the hard limit on upload size is for the AWS S3 node? Is it possible to increase that limit?

What is the error message (if any)?

ERROR: UNKNOWN ERROR - check the detailed error for more information

Invalid array length
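
For context, “Invalid array length” is a RangeError thrown by V8, which suggests the node is materializing the entire file in memory. A minimal sketch of that class of failure, outside n8n and with a placeholder path (the exact message and threshold depend on the Node.js version):

```typescript
// Minimal sketch (not n8n code): buffering an entire large file and
// re-encoding it in memory can exceed V8's string/array length limits.
// The file path is a placeholder.
import { readFileSync } from "fs";

try {
  const data = readFileSync("/path/to/large-file.bin"); // whole file in RAM
  // Converting ~1.5 GB of binary data to base64 produces a ~2 GB string,
  // well past V8's maximum string length.
  const base64 = data.toString("base64");
  console.log(base64.length);
} catch (err) {
  // Throws a RangeError; the exact wording varies by Node.js version.
  console.error(err);
}
```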

Please share your workflow

Share the output returned by the last node

NodeApiError: UNKNOWN ERROR - check the detailed error for more information
    at Object.requestWithAuthentication (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:1048:19)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at Object.awsApiRequestSOAP (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/dist/nodes/Aws/S3/GenericFunctions.js:39:22)
    at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/dist/nodes/Aws/S3/AwsS3.node.js:514:44)
    at Workflow.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-workflow/dist/Workflow.js:653:28)
    at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:590:53

Information on your n8n setup

  • n8n version: 0.222.2
  • Database (default: SQLite): MariaDB
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: macOS 12.6.1

Hi @Nurgak - thanks for getting in touch about this! @netroy is working on improving the handling of binary data in our nodes - can you perhaps share some additional insight on this, and whether this is expected for large datasets?

This is unfortunately a known issue with the S3 node. We’ve updated most other nodes that upload files to use streaming and multipart uploads whenever possible, but work on the S3 node is currently blocked because we first need to transition it from the deprecated SOAP API to the REST API. Once that transition is done, we’ll update this node to handle files irrespective of their size.
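
For anyone curious, this is roughly what streaming plus multipart uploading looks like with the AWS SDK v3 helper @aws-sdk/lib-storage - illustrative only, not our actual implementation, and all names and paths are placeholders:

```typescript
// Illustrative sketch of a streaming multipart upload to S3.
import { createReadStream } from "fs";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

async function uploadLargeFile(): Promise<void> {
  const upload = new Upload({
    client: new S3Client({ region: "us-east-1" }), // placeholder region
    params: {
      Bucket: "my-bucket",   // placeholder bucket
      Key: "large-file.bin", // placeholder object key
      // The stream is consumed part by part, so the whole file is never
      // held in memory at once.
      Body: createReadStream("/path/to/large-file.bin"),
    },
    partSize: 10 * 1024 * 1024, // 10 MiB parts (S3 requires at least 5 MiB)
    queueSize: 4,               // upload up to 4 parts concurrently
  });

  await upload.done(); // resolves once all parts are uploaded and combined
}

uploadLargeFile().catch(console.error);
```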

The internal ticket regarding transitioning the API is NODE-219.
We’ll update this thread once the fix is out in a release.

@netroy Thank you for the explanation. I understand and will be more patient.

For nodes that take a while to execute on something quantifiable (for example, uploading a large file), it would be nice to see a progress indicator that reflects the actual state, perhaps as a percentage. Even when progress is not quantifiable, the process state (as tracked on the server) could be requested from inside the node and shown to the user. I think that would be a nice feature, but I digress…
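
For what it’s worth, the AWS SDK’s multipart upload helper already emits quantifiable progress events, so something along these lines (illustrative, with placeholder names) could drive a percentage display:

```typescript
// Sketch of surfacing upload progress as a percentage, using the progress
// events emitted by @aws-sdk/lib-storage's Upload helper.
import { createReadStream, statSync } from "fs";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

async function uploadWithProgress(filePath: string): Promise<void> {
  const totalBytes = statSync(filePath).size;

  const upload = new Upload({
    client: new S3Client({ region: "us-east-1" }), // placeholder region
    params: {
      Bucket: "my-bucket", // placeholder bucket
      Key: "large-file.bin",
      Body: createReadStream(filePath),
    },
  });

  // "httpUploadProgress" fires as parts complete; a UI could render this
  // instead of an indeterminate spinner.
  upload.on("httpUploadProgress", ({ loaded }) => {
    if (loaded !== undefined) {
      console.log(`Uploaded ${((loaded / totalBytes) * 100).toFixed(1)}%`);
    }
  });

  await upload.done();
}

uploadWithProgress("/path/to/large-file.bin").catch(console.error);
```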

A new version of n8n has been released that includes GitHub PR 6017.

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.