Cannot upload a 3.6 MB gzipped binary file?

We have a custom Shopware 6 endpoint that accepts a gzipped CSV and does a bulk import.

Using curl it works. Using n8n it also works for smaller files (for example 1 KB of gzipped content), but when I try to upload the full 3,650,077-byte gzip (3.6 MB), the Shopware endpoint reports that the multipart field for the file is not found.

Is there a size limit on what can be transmitted as form data?

Interesting: the node output contains maxDataSize = 2097152. Where does this come from, and does it mean uploads are limited to 2 MB?

On the PHP/server side, the file upload and POST max sizes (upload_max_filesize, post_max_size) are > 64 MB, so that should not be the problem.

Could it be a problem, though, if n8n tries to unzip the .gz before posting?

What is the error message (if any)?

Missing import file. Upload a gzipped CSV as multipart form-data field “file”. (This comes from the Shopware service, because the file is not transmitted.)

Please share your workflow

Share the output returned by the last node

```json
{
  "headers": {
    "accept": "application/json,text/html,application/xhtml+xml,application/xml,text/*;q=0.9, image/*;q=0.8, */*;q=0.7",
    "Authorization": "hidden",
    "content-type": "multipart/form-data; boundary=--------------------------288f390314ab102b29643167"
  },
  "method": "POST",
  "uri": "https://example.com/api/custom-price-logic/prices/import-csv",
  "gzip": true,
  "rejectUnauthorized": true,
  "followRedirect": true,
  "resolveWithFullResponse": true,
  "followAllRedirects": true,
  "timeout": 600000,
  "formData": {
    "_overheadLength": 189,
    "_valueLength": 0,
    "_valuesToMeasure": [
      {
        "_events": {},
        "_readableState": {
          "highWaterMark": 16,
          "buffer": [],
          "bufferIndex": 0,
          "length": 0,
          "pipes": [],
          "awaitDrainWriters": null
        },
        "_eventsCount": 2
      }
    ],
    "writable": false,
    "readable": true,
    "dataSize": 0,
    "maxDataSize": 2097152,
    "pauseStreams": true,
    "_released": true,
    "_streams": [],
    "_currentStream": null,
    "_insideLoop": false,
    "_pendingNext": false,
    "_boundary": "--------------------------288f390314ab102b29643167",
    "_events": {},
    "_eventsCount": 3
  },
  "encoding": null,
  "json": false,
  "useStream": true
}
```

Information on your n8n setup

  • n8n version: 2.3.5
  • Database (default: SQLite): PostgreSQL
  • Running n8n via: docker
  • Operating system: Ubuntu

Hello @alexm ,

The underlying form-data library used by n8n has a default limit of 2MB for buffered data. Since your n8n instance is likely running in the default “memory” mode, it loads the 3.6MB file into a buffer, hits this library limit during the upload, and fails to send the body (causing the “Missing import file” error).

Try this:

• The best fix is to switch to filesystem mode. It forces n8n to handle binary data as streams (pointers to files on disk) rather than memory buffers, which bypasses the 2 MB limit.

  1. Open your Docker Compose or environment configuration.
  2. Add/Set N8N_DEFAULT_BINARY_DATA_MODE=filesystem.
  3. Restart the n8n container.
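
For a docker-compose setup, the change can look roughly like this (a sketch; the service name and image tag are placeholders for whatever you already run):

```yaml
services:
  n8n:
    image: n8nio/n8n        # your existing image/tag
    environment:
      - N8N_DEFAULT_BINARY_DATA_MODE=filesystem
```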

Re-run the import. The HTTP node should now stream the file regardless of size.

Let me know if that works.

Hi @alexm


The maxDataSize you’re seeing, 2097152, is exactly 2 MiB. That’s a hardcoded default from the combined-stream library that n8n uses internally when building outbound multipart requests.
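
To make the arithmetic explicit (an illustrative check only, not the library’s actual code):

```javascript
// combined-stream's default buffering limit, in bytes
const maxDataSize = 2 * 1024 * 1024;  // 2097152 = 2 MiB
const fileSize = 3650077;             // the 3.6 MB gzip from the question

console.log(maxDataSize);             // 2097152
console.log(fileSize > maxDataSize);  // true: the upload exceeds the default limit
```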

While @A_A4’s filesystem mode solution is valid and could help, it changes how n8n stores binary data across your entire instance — writing everything to disk, which you may not want. There’s also no guarantee it fixes your issue, since the HTTP Request node may still hit the 2 MB limit when it assembles the outgoing request body.

This is actually a known bug: GitHub issue #18271 shows that n8n’s httpRequest helper doesn’t properly handle FormData objects at all. Instead of streaming multipart data, it runs the whole thing through JSON.stringify and sends it as application/json. That’s exactly the maxDataSize: 2097152 JSON blob you’re seeing in your output: it’s not your file data, it’s the serialized FormData object.

If you’d rather not change how n8n stores data globally, you can use a Code node to bypass both the HTTP Request node and the FormData bug entirely. The trick is to manually build the multipart body as a raw Buffer — no external modules needed, no env vars to change:

```javascript
// Get the binary buffer from the previous node
const binaryData = await this.helpers.getBinaryDataBuffer(0, 'data');

// Manually build the multipart/form-data body
const boundary = '----n8nFormBoundary' + Math.random().toString(36).substring(2);
const header = Buffer.from(
  `--${boundary}\r\n` +
  `Content-Disposition: form-data; name="file"; filename="import.csv.gz"\r\n` +
  `Content-Type: application/gzip\r\n\r\n`
);
const footer = Buffer.from(`\r\n--${boundary}--\r\n`);
const body = Buffer.concat([header, binaryData, footer]);

// Send it: a raw Buffer bypasses the 2MB limit and the FormData bug
const response = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://example.com/api/custom-price-logic/prices/import-csv',
  body: body,
  headers: {
    'Content-Type': `multipart/form-data; boundary=${boundary}`,
    'Authorization': 'Bearer YOUR_TOKEN_HERE',
  },
  timeout: 600000,
});

return [{ json: response }];
```

A few things you’ll need to change:

  • Binary property name: Replace 'data' with whatever your binary property is actually named — check the output of the node before the Code node.

  • URL and auth: Swap in your real endpoint and auth token.

  • Filename/content type: Adjust 'import.csv.gz' and 'application/gzip' if your file is named differently or isn’t gzipped.

Why this works: this.helpers.httpRequest handles a raw Buffer body just fine (confirmed in the docs). By building the multipart body ourselves, we skip FormData entirely — no combined-stream, no 2 MB limit, no JSON serialization bug. Buffer and this.helpers.getBinaryDataBuffer are both natively available in the Code node, so you don’t need to mess with NODE_FUNCTION_ALLOW_EXTERNAL or install anything.

@A_A4’s suggestion is valid, of course, but if you don’t want to change settings, this approach works too.

@alexm that maxDataSize: 2097152 (2 MB) is indeed the problem; it’s a limit in the form-data library that n8n uses for multipart uploads.

You can solve this by creating a Code node (in place of your HTTP Request node) that handles the upload with a higher limit:

```javascript
// Note: require() of external modules in the Code node needs
// NODE_FUNCTION_ALLOW_EXTERNAL=form-data,axios set on the n8n instance
const FormData = require('form-data');
const axios = require('axios');

// Get the binary file data
const binaryData = items[0].binary.file;
const fileBuffer = Buffer.from(binaryData.data, 'base64');

// Create the form
const form = new FormData();
form.append('file', fileBuffer, {
  filename: 'import.csv.gz',
  contentType: 'application/gzip'
});

// Increase the maxDataSize limit (default is 2 MiB)
form.maxDataSize = 10 * 1024 * 1024; // 10MB

// Make the request
const response = await axios.post(
  $vars.shopware_url + 'api/custom-price-logic/prices/import-csv',
  form,
  {
    headers: {
      ...form.getHeaders(),
      'Authorization': 'Bearer YOUR_TOKEN' // Use your OAuth token
    },
    maxBodyLength: Infinity,
    maxContentLength: Infinity,
    timeout: 600000
  }
);

return [{
  json: response.data
}];
```

Hope this helps!


Thanks for the answers.

  • Does setting N8N_DEFAULT_BINARY_DATA_MODE=filesystem have any drawbacks other than possible performance implications?
  • @Orionpax is there a “simple” way to get the OAuth2 Bearer token, or do I have to fetch the token manually, too?

Yeah, the main non-performance drawback is disk usage accumulation.

If a workflow crashes hard or the instance restarts unexpectedly, n8n might fail to clean up the temporary files it created. Over time, these orphaned files can fill up your storage, especially if you’re processing large media.

Also, if you’re running in Docker, make sure the temp directory isn’t writing to the container’s writable layer. If it is, your container size will balloon until it crashes the daemon.

Actually, since v2 there is no memory mode anymore, but you may still have it set via the env variable.
n8n v2.0 breaking changes | n8n Docs

And the HTTP node is properly uploading big files via form data (just checked with a 40MB file) in file system mode.

That is interesting, so my problem must come from somewhere else and the maxDataSize is just confusing me :slight_smile:

You may try setting the filesystem mode explicitly with
N8N_DEFAULT_BINARY_DATA_MODE=filesystem, as it seems your HTTP node was using memory mode (there are no filesystem signs in the request: the formData._valuesToMeasure property doesn’t have any path properties). Or maybe there is something with the API, and you need to specify some extra formData fields.

Off-Topic: Did you just flag your own AI analyzer post as AI generated?

:joy::joy: that’s an honest AI fr ;D


Well, sometimes I use AI to help format my replies, so I’m not surprised it flagged me.

I had to turn it off since bartv asked me to.

I did deeper debugging.

  • I tried with mockbin.io; even bigger files show up there, so it’s not an isolated n8n problem
  • I tried sending smaller files to PHP → the file appears in $_FILES .. most of the time
    • that’s where the fun starts: even with a very small file, the file sometimes does not arrive

I used a debug script and found out that in around 10% of the cases, n8n sends even a small file with Transfer-Encoding: chunked, which makes PHP choke, and the file is not received.

When the file is bigger, n8n always uses chunked transfer, so the file never arrives.

As I have control over the endpoint, I will probably change it so that it can accept / stream files from chunked transfers.


How did you try the curl?

Was it like curl -d "@path" or curl --data-binary "@path"?
The first option reads the whole file content into memory and then uploads it.
The second uses file streaming (the more general approach for uploading files).

So yeah, the endpoint should be able to work with chunks

I tried different things in curl :slight_smile: like form data. Yeah, it works now: I changed the endpoint to accept the gzip as the request body, no more form data.

Thanks everybody for your help!


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.