Request failed with status code 413

Hi,

I'm trying to do some ETL processing in a workflow. My first run with 1,000 rows went through smoothly.
After that I increased the LIMIT in the SQL statement and ran the same workflow with 46,000 rows, and it now generates the following error:

Problem running workflow

There was a problem running the workflow:
Request failed with status code 413

My workflow is only a SQL statement and a Function Item node with one JavaScript regex function.

Does anyone know how to increase the upload limit?

Welcome to the community @Basti!

Status code 413 means "Payload Too Large", so 46k rows are apparently simply too much data. In this case you would have to split the job into multiple runs with fewer rows per execution.
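For illustration, that splitting could be done with paginated SQL. A minimal sketch, assuming a hypothetical table `source_table` with a sortable `id` column (both names are placeholders, not from your workflow):

```sql
-- Batch 1: first 8000 rows (batch size is illustrative)
SELECT * FROM source_table ORDER BY id LIMIT 8000 OFFSET 0;

-- Batch 2: the next 8000 rows, and so on until the table is exhausted
SELECT * FROM source_table ORDER BY id LIMIT 8000 OFFSET 8000;
```

Each batch then runs as its own workflow execution, keeping every single payload below the limit.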

Hi Jan,

thank you for your fast reply!
I limited the statement to 10k rows and the workflow now generates the following error:

There was a problem executing the workflow:
"Workflow execution process did crash for an unknown reason!"

If the row count is under 8k it works, thanks :slight_smile:

So is the 413 error a webserver error? Any ideas how to increase the limit? For nginx, a client_max_body_size parameter exists to change that.
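For reference, in nginx that directive would sit in the server (or location) block that proxies to n8n; the size here is just an illustrative example, not a recommendation:

```nginx
server {
    # allow request bodies up to 32 MB (example value)
    client_max_body_size 32m;
}
```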

Sorry, I actually do not know where that error comes from, whether from n8n (so Node.js) or from the reverse proxy. But considering that n8n apparently crashes even with 10k rows, probably because it runs out of memory, increasing the request limit would not really help anyway.

Okay thank you very much for your effort :slight_smile:

Sure, have fun!

Hey, I already have 3 items per row. The thing is that some contain binary data (a PDF attachment) that may be a couple of KB in size. But I'm executing a node that is not reading those fields; do they still count? Do I need to remove those fields in the pipeline?

Hi @danielo515, I am sorry you're having trouble. This is a rather old topic that's already marked as solved. Going forward you might want to open a new thread in such cases, as it's easy to miss follow-up comments here (and also because n8n changes a lot over the course of multiple years).

That said, the answer to your question is (most likely, seeing as I don't know your workflow) yes. Even when not reading (or otherwise processing) a specific field, n8n would keep this data (and send it to the UI when executing your workflow manually). If you are currently seeing 413 responses, you would want to adjust the N8N_PAYLOAD_SIZE_MAX environment variable in n8n accordingly, and double-check that your reverse proxy (if you are using one) handles the respective payload sizes as well, like Jan suggested above.
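For illustration only (the value is an arbitrary example, not a recommendation): N8N_PAYLOAD_SIZE_MAX is specified in MiB, so raising it could look like

```
export N8N_PAYLOAD_SIZE_MAX=64
```

And if you would rather strip the binary fields inside the pipeline, a minimal Function node sketch using the standard `items` array could be:

```javascript
// Drop binary data (e.g. PDF attachments) from every item so it
// no longer counts toward the payload sent between nodes and the UI.
for (const item of items) {
  delete item.binary;
}
return items;
```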

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.