Error uploading to S3

Hi,

I get an error when I try to upload a zip file to an S3 account (Scaleway).

Do you have an idea?

Here is the error:

{
  "message": "Request failed with status code 500",
  "name": "Error",
  "stack": "Error: Request failed with status code 500 at createError (/usr/local/lib/node_modules/n8n/node_modules/axios/lib/core/createError.js:16:15) at settle (/usr/local/lib/node_modules/n8n/node_modules/axios/lib/core/settle.js:17:12) at IncomingMessage.handleStreamEnd (/usr/local/lib/node_modules/n8n/node_modules/axios/lib/adapters/http.js:269:11) at IncomingMessage.emit (events.js:327:22) at endReadableNT (internal/streams/readable.js:1327:12) at processTicksAndRejections (internal/process/task_queues.js:80:21)"
}

I just tested it with AWS S3 and it works fine for me. I wonder if the Scaleway API is different? Are you using the S3 node or the HTTP Request node?

Hi,

I managed to upload a file but it is still empty.

What I am trying to do:
Google Firestore → JSON → binary data → zip compression → S3
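The "JSON → binary data" step of that pipeline can be sketched outside n8n like this: serialize the item's JSON and store it as base64-encoded binary data, roughly the item shape the Move Binary Data node (mode `jsonToBinary`) produces. The field names mirror the n8n item structure, but this is an illustrative sketch, not the node's actual source code:

```javascript
// A single incoming n8n-style item (illustrative sample data).
const item = { json: { room: "lobby", players: 4 } };

// Serialize the JSON payload and wrap it as base64 binary data,
// approximately what "Move Binary Data" (jsonToBinary) outputs.
const buffer = Buffer.from(JSON.stringify(item.json), "utf8");
const converted = {
  json: {},
  binary: {
    data: {
      data: buffer.toString("base64"),
      mimeType: "application/json",
      fileName: "data.json", // hypothetical file name
    },
  },
};

console.log(converted.binary.data.mimeType);
```

Downstream nodes (Compression, S3) then read from the `binary.data` property, which is why the S3 node's file name expression references `$binary.data`.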

Can you share your workflow?

{
  "name": "SAVE Firestore to staging",
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [
        250,
        300
      ]
    },
    {
      "parameters": {
        "resource": "collection",
        "projectId": "XXXX",
        "returnAll": true
      },
      "name": "Google Cloud Firestore",
      "type": "n8n-nodes-base.googleFirebaseCloudFirestore",
      "typeVersion": 1,
      "position": [
        600,
        300
      ],
      "credentials": {
        "googleFirebaseCloudFirestoreOAuth2Api": {
          "id": "13",
          "name": "Google Firebase Cloud Firestore account"
        }
      }
    },
    {
      "parameters": {
        "operation": "getAll",
        "projectId": "XXXX",
        "collection": "=room",
        "returnAll": true
      },
      "name": "GCP Firestore room check",
      "type": "n8n-nodes-base.googleFirebaseCloudFirestore",
      "typeVersion": 1,
      "position": [
        910,
        300
      ],
      "credentials": {
        "googleFirebaseCloudFirestoreOAuth2Api": {
          "id": "13",
          "name": "Google Firebase Cloud Firestore account"
        }
      }
    },
    {
      "parameters": {
        "projectId": "XXXX",
        "collection": "conversations-space",
        "documentId": "={{$json[\"conversationsSpaceId\"]}}"
      },
      "name": "GCP conversation-space check",
      "type": "n8n-nodes-base.googleFirebaseCloudFirestore",
      "typeVersion": 1,
      "position": [
        1230,
        300
      ],
      "credentials": {
        "googleFirebaseCloudFirestoreOAuth2Api": {
          "id": "13",
          "name": "Google Firebase Cloud Firestore account"
        }
      }
    },
    {
      "parameters": {
        "operation": "upload",
        "bucketName": "XXXX",
        "fileName": "={{$binary.data.fileName}}.zip",
        "additionalFields": {},
        "tagsUi": {
          "tagsValues": []
        }
      },
      "name": "S3",
      "type": "n8n-nodes-base.s3",
      "typeVersion": 1,
      "position": [
        1990,
        290
      ],
      "credentials": {
        "s3": {
          "id": "14",
          "name": "S3 account"
        }
      }
    },
    {
      "parameters": {
        "mode": "jsonToBinary",
        "options": {}
      },
      "name": "Move Binary Data",
      "type": "n8n-nodes-base.moveBinaryData",
      "typeVersion": 1,
      "position": [
        1240,
        70
      ]
    },
    {
      "parameters": {
        "operation": "compress",
        "outputFormat": "zip",
        "fileName": "={{$node[\"GCP Firestore room check\"].json[\"tournament\"][\"id\"]}}"
      },
      "name": "Compression",
      "type": "n8n-nodes-base.compression",
      "typeVersion": 1,
      "position": [
        1840,
        70
      ]
    },
    {
      "parameters": {
        "mode": "jsonToBinary",
        "options": {}
      },
      "name": "Move Binary Data1",
      "type": "n8n-nodes-base.moveBinaryData",
      "typeVersion": 1,
      "position": [
        1490,
        300
      ]
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          {
            "node": "Google Cloud Firestore",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Google Cloud Firestore": {
      "main": [
        [
          {
            "node": "GCP Firestore room check",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "GCP Firestore room check": {
      "main": [
        [
          {
            "node": "GCP conversation-space check",
            "type": "main",
            "index": 0
          },
          {
            "node": "Move Binary Data",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "GCP conversation-space check": {
      "main": [
        [
          {
            "node": "Move Binary Data1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Move Binary Data": {
      "main": [
        [
          {
            "node": "Compression",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Compression": {
      "main": [
        [
          {
            "node": "S3",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Move Binary Data1": {
      "main": [
        [
          {
            "node": "Compression",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "active": false,
  "settings": {},
  "id": 27
}

I tested it with the AWS S3 node (it shares the same code as the S3 node), and it worked fine.

So you can see the zip file in the bucket, but when you unzip it, there is no data there?

I only see one line from one of the two GCP Firestore JSON outputs, nothing more :confused:

Ahh, that is because the zip file is overwritten. The one that you see contains only the last record Firebase returned. You have to aggregate all items into a single array before compressing and finally uploading the file to S3. Check the example below. To adapt it to your workflow, connect the "Aggregate data" Function node to the Move Binary Data node. Keep me posted.

{
  "nodes": [
    {
      "parameters": {},
      "name": "Start",
      "type": "n8n-nodes-base.start",
      "typeVersion": 1,
      "position": [
        470,
        810
      ]
    },
    {
      "parameters": {
        "functionCode": "return [\n  {\n    json: {\n      age: 12\n    }\n  },\n    {\n    json: {\n      age: 13\n    }\n  }\n]"
      },
      "name": "Function",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        710,
        810
      ]
    },
    {
      "parameters": {
        "mode": "jsonToBinary",
        "options": {}
      },
      "name": "Move Binary Data2",
      "type": "n8n-nodes-base.moveBinaryData",
      "typeVersion": 1,
      "position": [
        1110,
        810
      ]
    },
    {
      "parameters": {
        "operation": "compress",
        "outputFormat": "zip",
        "fileName": "data.zip"
      },
      "name": "Compression1",
      "type": "n8n-nodes-base.compression",
      "typeVersion": 1,
      "position": [
        1340,
        810
      ]
    },
    {
      "parameters": {
        "functionCode": "let response = [];\n\nfor (const item of items) {\n  response.push(item.json)\n}\n\nreturn [\n  {\n    json: {\n      data: response\n    }\n  }\n]"
      },
      "name": "Function1",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        910,
        810
      ],
      "notesInFlow": true,
      "notes": "Aggregate data"
    },
    {
      "parameters": {
        "operation": "upload",
        "bucketName": "holapapa",
        "fileName": "={{$binary.data.fileName}}.zip",
        "additionalFields": {},
        "tagsUi": {
          "tagsValues": []
        }
      },
      "name": "S4",
      "type": "n8n-nodes-base.s3",
      "typeVersion": 1,
      "position": [
        1560,
        810
      ],
      "credentials": {
        "s3": {
          "id": "391",
          "name": "S3 account"
        }
      }
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          {
            "node": "Function",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function": {
      "main": [
        [
          {
            "node": "Function1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Move Binary Data2": {
      "main": [
        [
          {
            "node": "Compression1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Compression1": {
      "main": [
        [
          {
            "node": "S4",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function1": {
      "main": [
        [
          {
            "node": "Move Binary Data2",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}
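For readability, here is the code inside the "Aggregate data" Function node from the workflow above, unescaped and with sample input in place of the `items` array that n8n provides automatically:

```javascript
// Sample input standing in for the items the Function node receives
// from the upstream node (in n8n, `items` is supplied by the runtime).
const items = [
  { json: { age: 12 } },
  { json: { age: 13 } },
];

// Collect every item's JSON into one array and return a single item,
// so the downstream Move Binary Data and Compression nodes produce
// one zip containing all records instead of overwriting it per record.
let response = [];
for (const item of items) {
  response.push(item.json);
}

const result = [
  {
    json: {
      data: response,
    },
  },
];

console.log(JSON.stringify(result[0].json));
```

Because a Function node that returns one item yields one binary file, the Compression and S3 nodes run once and the zip ends up holding all records.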