Problem running workflow
> Please execute the whole workflow, rather than just the node. (Existing execution data is too large.)
Hey!
Are you self-hosted? If yes:
Add this environment variable to your Docker Compose file (or `.env`) to raise the allowed payload size. Note that n8n documents `N8N_PAYLOAD_SIZE_MAX` in MiB, not bytes (the default is 16):

```yaml
environment:
  - N8N_PAYLOAD_SIZE_MAX=256 # 256 MiB (recommended)
  # Or adjust based on your needs:
  # - N8N_PAYLOAD_SIZE_MAX=64  # 64 MiB
  # - N8N_PAYLOAD_SIZE_MAX=512 # 512 MiB
```
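For context, here is a minimal `docker-compose.yml` sketch showing where the variable goes. The service name, image tag, port, and volume are assumptions for illustration; only the `environment` entry is the actual fix:

```yaml
services:
  n8n:
    image: n8nio/n8n        # assumed image; pin a specific version in production
    ports:
      - "5678:5678"
    environment:
      - N8N_PAYLOAD_SIZE_MAX=256   # payload limit, in MiB
    volumes:
      - n8n_data:/home/node/.n8n   # persist workflows and credentials

volumes:
  n8n_data:
```

After editing, run `docker compose up -d` again so the container picks up the new environment variable.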
Alternatively:

- **Execute the full workflow:** click "Execute Workflow" instead of testing individual nodes.
- **Reduce data size:**
  - Split large workflows:
    - Break complex workflows into sub-workflows
    - Process data in smaller batches
    - Use the Split In Batches node for large datasets
  - Optimize data flow:
    - Remove unnecessary data between nodes
    - Use a Set node to select only the required fields
    - Process files externally when possible
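As a sketch of the "select only required fields" tip: a Code node can trim each item's JSON down to a whitelist before handing it to the next node, so far less execution data gets stored. The `KEEP` list and field names below are placeholders, not anything from your workflow:

```javascript
// Keep only the fields downstream nodes actually use; everything else
// (large blobs, raw API responses, etc.) is dropped from the item.
const KEEP = ['id', 'email', 'status']; // placeholder field names

function trimItem(json, keep = KEEP) {
  // Copy only the whitelisted keys that actually exist on the item.
  return Object.fromEntries(
    keep.filter((k) => k in json).map((k) => [k, json[k]])
  );
}

// Inside an n8n Code node ("Run Once for All Items") this would be:
//   return $input.all().map((item) => ({ json: trimItem(item.json) }));

console.log(trimItem({ id: 1, status: 'ok', bigBlob: 'x'.repeat(1000) }));
// → { id: 1, status: 'ok' }
```

The same effect is available without code via a Set node with "Keep Only Set" enabled; the snippet just makes the idea explicit.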
Please mark this as the solution if any of it helped!
Thanks for your effort,
but I already solved it.
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.