We have a custom Shopware 6 endpoint that accepts a gzipped CSV and does a bulk import.
Using curl it works, and using n8n it works for smaller files (for example 1 KB of gzipped content), but when I try to upload the full 3,650,077-byte (~3.5 MB) gzip, the Shopware endpoint reports that the multipart field "file" is not found.
Is there a size limit on what can be transmitted as form data?
Interesting: the node output contains maxDataSize = 2097152. Where does this come from, and does it mean the upload is limited to 2 MB?
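For reference, 2097152 is exactly 2 MiB, and the surrounding field names in the node output (dataSize, pauseStreams, _streams) look like the internals of the form-data / combined-stream npm packages, whose default maxDataSize is 2 MiB. That attribution is an assumption based on the output shape, not confirmed against the n8n source:

```javascript
// 2097152 bytes = 2 * 1024 * 1024, i.e. exactly 2 MiB.
// Assumption: this is the default maxDataSize of the form-data/combined-stream
// packages, which the node output appears to be a dump of.
const maxDataSize = 2 * 1024 * 1024;
console.log(maxDataSize); // 2097152
```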
On the PHP / server side, the file upload and POST size limits (upload_max_filesize, post_max_size) are both above 64 MB, so that should not be the problem.
It could be a problem, though, if n8n tries to unzip the .gz before posting it.
What is the error message (if any)?
Missing import file. Upload a gzipped CSV as multipart form-data field "file". (returned by the Shopware service, because the file is not transmitted)
Please share your workflow
Share the output returned by the last node
```json
{
  "headers": {
    "accept": "application/json,text/html,application/xhtml+xml,application/xml,text/*;q=0.9, image/*;q=0.8, */*;q=0.7",
    "Authorization": "hidden",
    "content-type": "multipart/form-data; boundary=--------------------------288f390314ab102b29643167"
  },
  "method": "POST",
  "uri": "https://example.com/api/custom-price-logic/prices/import-csv",
  "gzip": true,
  "rejectUnauthorized": true,
  "followRedirect": true,
  "resolveWithFullResponse": true,
  "followAllRedirects": true,
  "timeout": 600000,
  "formData": {
    "_overheadLength": 189,
    "_valueLength": 0,
    "_valuesToMeasure": [
      {
        "_events": {},
        "_readableState": {
          "highWaterMark": 16,
          "buffer": [],
          "bufferIndex": 0,
          "length": 0,
          "pipes": [],
          "awaitDrainWriters": null
        },
        "_eventsCount": 2
      }
    ],
    "writable": false,
    "readable": true,
    "dataSize": 0,
    "maxDataSize": 2097152,
    "pauseStreams": true,
    "_released": true,
    "_streams": [],
    "_currentStream": null,
    "_insideLoop": false,
    "_pendingNext": false,
    "_boundary": "--------------------------288f390314ab102b29643167",
    "_events": {},
    "_eventsCount": 3
  },
  "encoding": null,
  "json": false,
  "useStream": true
}
```
Information on your n8n setup
- n8n version: 2.3.5
- Database (default: SQLite): PostgreSQL
- Running n8n via: docker
- Operating system: Ubuntu

