PayloadTooLargeError: request entity too large for 100kb request

Hello, I am using a webhook as a trigger, but whenever I POST something larger than 100kb to the endpoint it returns the error message below with HTTP status code 413 PayloadTooLarge. I'm not sure if anything is wrong with my settings or configuration.

PayloadTooLargeError: request entity too large
at readStream (/usr/local/lib/node_modules/n8n/node_modules/raw-body/index.js:155:17)
at getRawBody (/usr/local/lib/node_modules/n8n/node_modules/raw-body/index.js:108:12)
at read (/usr/local/lib/node_modules/n8n/node_modules/body-parser/lib/read.js:77:3)
at urlencodedParser (/usr/local/lib/node_modules/n8n/node_modules/body-parser/lib/types/urlencoded.js:116:5)
at Layer.handle [as handle_request] (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/layer.js:95:5)
at trim_prefix (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:317:13)
at /usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:284:7
at Function.process_params (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:335:12)
at next (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:275:10)
at /usr/local/lib/node_modules/n8n/node_modules/connect-history-api-fallback/lib/index.js:18:14
at Layer.handle [as handle_request] (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/layer.js:95:5)
at trim_prefix (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:317:13)
at /usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:284:7
at Function.process_params (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:335:12)
at next (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:275:10)
at textParser (/usr/local/lib/node_modules/n8n/node_modules/body-parser/lib/types/text.js:78:7)
at Layer.handle [as handle_request] (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/layer.js:95:5)
at trim_prefix (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:317:13)
at /usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:284:7
at Function.process_params (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:335:12)
at next (/usr/local/lib/node_modules/n8n/node_modules/express/lib/router/index.js:275:10)
at /usr/local/lib/node_modules/n8n/node_modules/body-parser-xml/index.js:27:51

Hi @KM19,

How do you have n8n setup?

I ran into something similar (although the limit was 1MB) when I first set up n8n behind an nginx proxy; I had to set the client_max_body_size option.
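For anyone reading along, a minimal sketch of that nginx setting. The server name, port, and surrounding blocks are placeholders; only the client_max_body_size line is the actual fix, and it must sit in (or above) the block that proxies to n8n:

```nginx
server {
    listen 80;
    server_name n8n.example.com;   # placeholder domain

    # Allow request bodies up to 16MB (nginx's default is 1MB,
    # which produces a 413 for anything larger).
    client_max_body_size 16m;

    location / {
        proxy_pass http://localhost:5678;   # n8n's default port
    }
}
```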

I installed it using Docker, following the steps from the URL below.

I also tried running it locally using Docker. I can't POST anything >= 100kb to the webhook endpoint.

Hm, that is very strange, because n8n is by default set up to allow payloads up to 16MB, and that can even be increased further by setting the environment variable N8N_PAYLOAD_SIZE_MAX to a value larger than 16.
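For reference, a sketch of passing that variable to a standard Docker setup (the volume path and image tag follow n8n's usual Docker instructions; adjust to your own deployment). The value is in MB:

```shell
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -e N8N_PAYLOAD_SIZE_MAX=64 \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n
```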

So do I understand correctly that this happens even if you just start the n8n container locally and send data directly to it (so to port 5678)? Looking at the error message above, that does seem to be the case.

Yes, I'm aware of N8N_PAYLOAD_SIZE_MAX from my research. I tried entering a larger number, but it still didn't solve the problem.

I tried 2 ways

  1. Deployed the Docker container on an AWS EC2 instance with a domain mapped to it, and hit the error when POSTing a request >100kb to the webhook endpoint using Postman.
  2. Ran the Docker container locally on my MacBook on port 5678; same error.
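The local reproduction above can be sketched with curl instead of Postman. The webhook path here is hypothetical; substitute your own workflow's test URL:

```shell
# Build a plain-text body of ~150kb, comfortably over the 100kb limit.
head -c 150000 /dev/zero | tr '\0' 'a' > /tmp/payload.txt
wc -c /tmp/payload.txt   # confirm the body is > 102400 bytes

# POST it to the (hypothetical) local webhook endpoint; a failing
# setup returns HTTP 413. "|| true" keeps the script going if no
# n8n instance is listening.
curl --max-time 5 -s -o /dev/null -w "%{http_code}\n" \
  -X POST "http://localhost:5678/webhook-test/my-webhook" \
  -H "Content-Type: text/plain" \
  --data-binary @/tmp/payload.txt || true
```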

Thanks for confirming. We will have a deeper look.

Thanks for making us aware of this issue. We were able to reproduce and fix it. The fix will be released with the next version in the coming days.

We will build a new nightly version very soon which will include that fix, and will then update here.


You should now be able to use the nightly Docker build n8nio/n8n:nightly, which includes the fix.


Got released with [email protected]


Hello,

I’m having a similar issue with Docker on Ubuntu using an S3 node. If I upload a smaller file it works fine.
When I try with a 13+ MB file it crashes, and it exhausts memory too. I was using a 1GB instance and upgraded to 2GB without success.

Hey @lero, I am sorry to hear you’re having trouble. Have you seen this post?

It contains some general pointers for dealing with larger files in n8n.

If these don’t help it might be worth opening a new topic with the exact steps required to reproduce your problem.
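One of the pointers for larger files is moving binary data out of memory and onto disk via N8N_DEFAULT_BINARY_DATA_MODE. As a hedged sketch for a Docker setup (volume path and image tag are the usual defaults; adjust to your deployment):

```shell
docker run -it --rm \
  -p 5678:5678 \
  -e N8N_DEFAULT_BINARY_DATA_MODE=filesystem \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n
```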

Thank you @MutedJam!
I had set N8N_PAYLOAD_SIZE_MAX but wasn’t aware of N8N_DEFAULT_BINARY_DATA_MODE.
In the end I built a workaround similar to what N8N_DEFAULT_BINARY_DATA_MODE does, but I think I’ve found a bug in an Edit Image node.
I’ll create another thread reporting it.

Hello everyone, same problem here. But how do I set N8N_PAYLOAD_SIZE_MAX if n8n is installed via npm?

Hi @FIRE_TIKTOK, this depends a bit on how you run n8n (or other applications), not so much on how you have installed it.

Assuming you are using Linux (or macOS, though I can’t test this myself), have installed n8n globally, and run n8n manually, simply putting the environment variable before your n8n command would do the job. For example:

N8N_PAYLOAD_SIZE_MAX=1024 n8n
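Alternatively, you can export the variable so it applies to every n8n start in the current shell session (this assumes a POSIX-style shell such as bash; the value is in MB):

```shell
# Set the payload limit for this shell session.
export N8N_PAYLOAD_SIZE_MAX=1024
echo "$N8N_PAYLOAD_SIZE_MAX"

# To make it permanent, you could append the export line to your
# shell profile (e.g. ~/.bashrc), then start n8n as usual:
#   n8n
```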


Hope this helps!