Error uploading to S3

Hi,

I get an error when I try to upload a zip file to an S3 account (Scaleway).

Do you have any idea what is wrong?

Here is the error:

{
  "message": "Request failed with status code 500",
  "name": "Error",
  "stack": "Error: Request failed with status code 500
    at createError (/usr/local/lib/node_modules/n8n/node_modules/axios/lib/core/createError.js:16:15)
    at settle (/usr/local/lib/node_modules/n8n/node_modules/axios/lib/core/settle.js:17:12)
    at IncomingMessage.handleStreamEnd (/usr/local/lib/node_modules/n8n/node_modules/axios/lib/adapters/http.js:269:11)
    at IncomingMessage.emit (events.js:327:22)
    at endReadableNT (internal/streams/readable.js:1327:12)
    at processTicksAndRejections (internal/process/task_queues.js:80:21)"
}

Just tested it with AWS S3 and it works fine for me. I wonder if the Scaleway API behaves differently? Are you using the S3 node or the HTTP Request node?

Hi,

I managed to upload a file but it is still empty.

What I am trying to do:
Google Firestore → JSON → binary data → zip (compressed) → S3
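
For reference, the JSON → binary step does roughly the equivalent of this Function node (a simplified sketch, assuming the legacy Function node where `items` and `Buffer` are available; `export.json` and the `data` property name are just illustrative defaults):

```javascript
// Function node: serialize the item's JSON and attach it as binary data,
// so the Compression node can zip it in the next step.
const payload = JSON.stringify(items[0].json);

return [
  {
    json: {},
    binary: {
      data: {
        data: Buffer.from(payload).toString('base64'),
        mimeType: 'application/json',
        fileName: 'export.json',
      },
    },
  },
];
```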

Can you share your workflow?

I tested it with the AWS S3 node (it shares the same code as the S3 node), and it worked fine.

So you can see the zip file in the bucket, but when you unzip it, there is no data there?

I only see one line from one of the two GCP Firestore JSON documents, nothing more :confused:

Ahh, that is because the zip file is overwritten. The one that you see contains only the last record Firebase returned. You have to aggregate all items into a single array before compressing and finally uploading the file to S3. Check the example below. To adapt it to your workflow, connect the Aggregate Data Function node to the Move Binary Data node. Keep me posted.
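
Here is a minimal sketch of that Aggregate Data Function node (assuming the legacy Function node, where all incoming items are available as `items`; the `records` key is just an illustrative name):

```javascript
// Function node "Aggregate Data": collapse every incoming item into one
// item, so a single file with all records gets compressed and uploaded
// instead of each item overwriting the previous one.
return [
  {
    json: {
      records: items.map(item => item.json),
    },
  },
];
```

With a single item coming out of this node, Move Binary Data produces one JSON file containing every record, and the Compression and S3 steps run only once.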