Request failed with status code 413 - too large payload

Describe the problem/error/question

When I run my workflow by pressing “test workflow”, it runs fine, but when I then press execute on an individual node, the workflow crashes.

I only get 769 items at the beginning, and the payload doesn’t seem large at all.

I recorded a Loom video of this problem.

I have the problem on the cloud instance and on my self-hosted Docker instance.

What am I doing wrong?

What is the error message (if any)?

Request failed with status code 413

Please share your workflow

I tried to add the workflow, but it’s too large, too! :sweat_smile:

See Loom video

Share the output returned by the last node

Information on your n8n setup

  • n8n version: n8n@ai-beta
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Cloud and my self-hosted Docker instance, with the same result on both
  • Operating system: Windows

Hey @LinkedUp_Online,

On your self-hosted install, can you try setting N8N_PAYLOAD_SIZE_MAX to something like 32 and see if that helps? On Cloud you may have to use smaller amounts of data, as the default there is 16.
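For a self-hosted Docker setup, the variable can be passed with `-e` when starting the container. A minimal sketch (the image name and port mapping below are the common defaults and may differ on your install; the value is interpreted as megabytes):

```shell
# Raise the maximum payload size n8n will accept to 32 MB.
# Adjust the image name/tag and port to match your existing setup.
docker run -it --rm \
  -p 5678:5678 \
  -e N8N_PAYLOAD_SIZE_MAX=32 \
  n8nio/n8n
```

If you use docker-compose instead, the same variable goes under the service’s `environment:` section.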

Hi @Jon Thanks, I made the changes and it works on my Docker instance now.

Is there a way to batch or filter at the beginning of the workflow to prevent this problem?

Or is it about the total payload in the workflow?

Last question: why can the workflow execute in one go, but crash with a “too large payload” error after executing a single node? That just doesn’t make sense to me.

Hey @LinkedUp_Online,

You could try using a loop to batch the data into smaller chunks. Don’t forget that 700 items that each contain one character are very different from 700 items with multiple columns and many characters per column, so basing a limit on the number of items can be tricky; it is the data size of those items that causes issues.

When you run a single node, we sometimes post all of the data along with it, which we don’t do when you run the entire workflow, as they are different operations. I think this may change in the future, though.

Ok, good to know! Thanks for the explanation. :+1:

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.