Describe the problem/error/question
I’m attempting to upload a large (1.5 GB) file via the AWS S3 node and it fails; a smaller file (a few megabytes) uploads fine.
Where can I find the hard limit on upload size for the AWS S3 node, and is it possible to increase it?
What is the error message (if any)?
ERROR: UNKNOWN ERROR - check the detailed error for more information
Invalid array length
Please share your workflow
Share the output returned by the last node
NodeApiError: UNKNOWN ERROR - check the detailed error for more information
at Object.requestWithAuthentication (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:1048:19)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at Object.awsApiRequestSOAP (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/dist/nodes/Aws/S3/GenericFunctions.js:39:22)
at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/dist/nodes/Aws/S3/AwsS3.node.js:514:44)
at Workflow.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-workflow/dist/Workflow.js:653:28)
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:590:53
Information on your n8n setup
- n8n version: 0.222.2
- Database (default: SQLite): MariaDB
- n8n EXECUTIONS_PROCESS setting (default: own, main): own
- Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
- Operating system: macOS 12.6.1
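One thing I haven’t tried yet: would switching n8n’s binary data mode to filesystem (so binary payloads are written to disk instead of held in memory) be expected to help with large uploads? Roughly how I would start the container with that setting — the env var is from the n8n docs, the other flags are just my current setup, simplified:

```shell
# Start n8n with binary data stored on the filesystem instead of in memory.
# N8N_DEFAULT_BINARY_DATA_MODE=filesystem is documented; whether it affects
# the S3 node's upload path is my open question.
docker run -it --rm \
  -p 5678:5678 \
  -e N8N_DEFAULT_BINARY_DATA_MODE=filesystem \
  -v ~/.n8n:/home/node/.n8n \
  docker.n8n.io/n8nio/n8n
```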