AWS S3 Download file

Describe the problem/error/question: I am having trouble downloading a file from S3. Is there a size limit?

What is the error message (if any)? This is the only error message I can see when the node times out.

```json
{
  "errorMessage": "Execution stopped at this node",
  "errorDescription": "n8n may have run out of memory while running this execution. More context and tips on how to avoid this in the docs",
  "errorDetails": {},
  "n8nDetails": {
    "time": "11/19/2025, 3:54:20 PM",
    "n8nVersion": "1.120.4 (Cloud)",
    "binaryDataMode": "filesystem"
  }
}
```

Please share your workflow

```json
{
  "nodes": [
    {
      "parameters": {
        "bucketName": "baingram.com",
        "fileKey": "={{ $json.actualFinalKey }}"
      },
      "type": "n8n-nodes-base.awsS3",
      "typeVersion": 2,
      "position": [864, -16],
      "id": "e555be3a-ba94-462c-9ec5-955fa494c425",
      "name": "Download Latest File",
      "retryOnFail": true,
      "maxTries": 5,
      "waitBetweenTries": 2000,
      "credentials": {
        "aws": {
          "id": "xxxx",
          "name": "AWS account"
        }
      }
    }
  ],
  "connections": {
    "Download Latest File": {
      "main": []
    }
  },
  "pinData": {},
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "87c83e1febda172687fcb8ce9a00c9b69b2af0ad40bf372f4650ae8afc0ddebd"
  }
}
```

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 1.120.4
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
  • Operating system:

The error indicates n8n is running out of memory when downloading your file, which suggests it’s quite large. n8n Cloud has memory limitations that can cause timeouts with large files (typically around 500MB-1GB depending on your plan).

Here are some solutions to try:

• **Check your file size first** - Use the AWS S3 node’s “Get Object Info” operation to see the file size before downloading

• **Split large files** - If possible, break your file into smaller chunks before uploading to S3

• **Use streaming approach** - Consider using the HTTP Request node with the S3 pre-signed URL instead, which can handle larger files more efficiently

• **Upgrade your plan** - n8n Cloud’s higher tiers have more memory available for processing large files

If you’re dealing with files over 1GB regularly, you might need to consider self-hosting n8n where you have more control over memory limits.
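If you do go the self-hosted route, a rough sketch of the relevant knobs (flag values are illustrative — tune them to your host's RAM):

```shell
# Illustrative docker run invocation for self-hosted n8n.
# NODE_OPTIONS raises the V8 heap ceiling (value in MB);
# N8N_DEFAULT_BINARY_DATA_MODE=filesystem keeps binary payloads on disk
# rather than in memory, which matters most for large S3 downloads.
docker run -d --name n8n \
  -p 5678:5678 \
  -e NODE_OPTIONS="--max-old-space-size=4096" \
  -e N8N_DEFAULT_BINARY_DATA_MODE=filesystem \
  -v n8n_data:/home/node/.n8n \
  docker.n8n.io/n8nio/n8n
```

Note that your error output already shows `binaryDataMode: filesystem` on Cloud, so the heap size is the setting self-hosting actually lets you control.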