Error when copying files from Gmail to S3

Hi,

I have a workflow set up this way:

Retrieving emails from Gmail works fine. I have configured:
Format: Resolved (this gets all attachments for every email)

The error appears on the S3 step when uploading the file in the attachment_0 binary property. I get the following error:

Could you test it to make sure everything works?

Thanks!

@Miquel_Colomer just looked into this and it’s working for me as expected. I tested it with an image attachment, though; what attachment type did you use? Also, which binary property name did you use in the S3 node? Are you filtering the data in the Gmail node to retrieve only emails with attachments? If not, you can set the query property to has:attachment, or you will need a Function node right after the Gmail node to do the filtering (a sketch follows below).

A picture of the S3 node and the binary tab of the Gmail node would help.
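For the filtering option, here is a minimal sketch of such a Function node (assuming the classic Function node, where items is in scope, and Gmail's attachment_N binary property naming):

// Keep only items that actually carry at least one binary attachment.
return items.filter(function (item) {
  return item.binary !== undefined &&
    Object.keys(item.binary).some(function (key) {
      return key.indexOf('attachment_') === 0;
    });
});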

Hi @RicardoE105,

The returned emails are filtered by this query in the Gmail node (email address faked):
from:[email protected] has:attachment newer_than:1d

The S3 node uses the attachment_0 binary property.

I am retrieving up to 8 emails with attached Excel files (.xlsx, one per email) every day and uploading them sequentially to S3.

I have generated a new S3 access/secret key pair, and it returns the error mentioned above again :frowning:

@Miquel_Colomer I have been trying to replicate the issue without success, even using .xlsx files as attachments. Can you please share the workflow with me? Also, what is the Function node doing?

Sure!

{
  "name": "copy_attachments_to_s3",
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [
        300,
        410
      ]
    },
    {
      "parameters": {
        "operation": "upload",
        "bucketName": "bucket-name",
        "fileName": "={{$node[\"Get Last Emails\"].binary.attachment_0.fileName}}",
        "binaryPropertyName": "attachment_0",
        "additionalFields": {}
      },
      "name": "AWS S3",
      "type": "n8n-nodes-base.awsS3",
      "typeVersion": 1,
      "position": [
        930,
        410
      ],
      "credentials": {
        "aws": "s3"
      }
    },
    {
      "parameters": {
        "functionCode": "var d = new Date(item.date);\nvar year = d.getFullYear();\nvar month = d.getMonth() + 1;\nvar day = d.getDate();\n\nmonth = month < 10 ? \"0\" + month : month;\nday = day < 10 ? \"0\" + day : day;\nitem.formatted_date = year + \"-\" + month + \"-\" + day;\nitem.id = item.messageId.replace(\"<\", \"\").replace(\">\", \"\");\n\nreturn item;\n"
      },
      "name": "FunctionItem",
      "type": "n8n-nodes-base.functionItem",
      "typeVersion": 1,
      "position": [
        740,
        410
      ]
    },
    {
      "parameters": {
        "triggerTimes": {
          "item": [
            {
              "mode": "custom",
              "cronExpression": "0 15 8 * * 1-5"
            }
          ]
        }
      },
      "name": "Cron",
      "type": "n8n-nodes-base.cron",
      "typeVersion": 1,
      "position": [
        300,
        580
      ]
    },
    {
      "parameters": {
        "resource": "message",
        "operation": "getAll",
        "returnAll": true,
        "additionalFields": {
          "format": "resolved",
          "q": "=has:attachment newer_than:1d"
        }
      },
      "name": "Get Last Emails",
      "type": "n8n-nodes-base.gmail",
      "typeVersion": 1,
      "position": [
        560,
        410
      ],
      "credentials": {
        "gmailOAuth2": "oauth-gmail"
      }
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          {
            "node": "Get Last Emails",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Cron": {
      "main": [
        [
          {
            "node": "Get Last Emails",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get Last Emails": {
      "main": [
        [
          {
            "node": "FunctionItem",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "FunctionItem": {
      "main": [
        [
          {
            "node": "AWS S3",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "active": false,
  "settings": {},
  "id": "20"
}
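For readability, here is the FunctionItem code from the workflow above, unescaped and lightly commented (the same logic as the functionCode string):

// Build a zero-padded YYYY-MM-DD string from the email's date.
var d = new Date(item.date);
var year = d.getFullYear();
var month = d.getMonth() + 1; // getMonth() is zero-based
var day = d.getDate();

month = month < 10 ? "0" + month : month;
day = day < 10 ? "0" + day : day;
item.formatted_date = year + "-" + month + "-" + day;

// Strip the angle brackets from the Message-ID to get a plain id.
item.id = item.messageId.replace("<", "").replace(">", "");

return item;

As a side note, the Cron node's expression 0 15 8 * * 1-5 includes a seconds field, so the workflow fires at 08:15:00, Monday through Friday.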

FINALLY was able to replicate the issue and fix it. Just sent a PR with the fix. We will let you know when it is released. Thanks for making us aware of this issue.

https://github.com/n8n-io/n8n/pull/1011

Great! Thank you @RicardoE105!
This was blocking some tasks at our company :slight_smile:

Hi @RicardoE105,

I am not sure if this is related, but when I try this scenario

and change the last step to an S3 node with the same config (upload, binary data enabled, and the data property), the flow freezes and only the first file is uploaded to the specified folder.

Could you confirm this, please?

Thanks!
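In case it helps while this is looked into, here is a hypothetical Function node that fans every attachment_N binary out into its own item, exposing each one under the data property the S3 node is configured to read (a sketch under those assumptions, not a confirmed fix):

// One output item per attachment, so the S3 node uploads each file on its own.
var out = [];
for (var i = 0; i < items.length; i++) {
  var binary = items[i].binary || {};
  for (var key in binary) {
    if (key.indexOf('attachment_') === 0) {
      out.push({
        json: { fileName: binary[key].fileName },
        binary: { data: binary[key] }, // the S3 node reads the "data" property
      });
    }
  }
}
return out;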

After checking with the new version 0.86.0, the error is fixed.

Thanks!

Ah great! Just wanted to update here that the version got released.


Yeah. I saw it this morning, deployed, tested and everything is perfect :slight_smile:

Thanks a lot guys! It allowed me to leave Integromat for this task :wink:
