Existing execution data is too large

Describe the problem/error/question

My workflow with a Chat Trigger node works fine when I attach 2-3 files, but when I attached 40 files at once, it threw the error "Existing execution data is too large".

Is it possible to solve this problem?

n8n version = 2.2.3

regards,

alan

I think the error happens because n8n's default 16 MiB payload limit is too small for the total size of all files in a single execution. You can raise the limit by adding N8N_PAYLOAD_SIZE_MAX to the environment variables in your docker-compose.yml and restarting; note that the value is specified in MiB (e.g. N8N_PAYLOAD_SIZE_MAX=256 for 256 MiB), not in bytes.

Make sure that N8N_FORMDATA_FILE_SIZE_MAX is less than or equal to N8N_PAYLOAD_SIZE_MAX, since multipart file uploads are part of the same HTTP payload.
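A minimal docker-compose sketch of these two settings (values are in MiB; the image name and limits are just examples, adjust them to your setup):

```yaml
services:
  n8n:
    image: n8nio/n8n
    environment:
      # Maximum HTTP payload size in MiB (default: 16)
      - N8N_PAYLOAD_SIZE_MAX=256
      # Maximum size per uploaded file in MiB;
      # keep this <= N8N_PAYLOAD_SIZE_MAX
      - N8N_FORMDATA_FILE_SIZE_MAX=256
```

After editing the file, recreate the container (e.g. `docker compose up -d`) so the new environment variables take effect.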

You can also split the files early with the SplitInBatches (Loop Over Items) node and process a few files per iteration instead of all at once. For file-heavy workflows, consider offloading the binaries to external storage such as S3, MinIO, or Ainoflow Files, and pass only references through n8n to keep execution data lean.


@Alan_Cheung1
Clear the session and start the workflow again.

Hey @Alan_Cheung1, I recommend first opening your n8n instance in a private window (incognito mode) to rule out a stale session. If it still fails, your data is exceeding n8n's payload limit, so instead of sending everything at once, fetch the files in batches of 10-20 per iteration. That should let the operation run smoothly without issues. Hope this helps!
