Issues with handling MB‑sized data in self‑hosted n8n CE

I’m running a self‑hosted n8n Community Edition instance and hitting problems when workflows process MB‑sized data (e.g., JSON or CSV files). Executions often time out, fail, or become unresponsive.

Setup:

  • n8n CE (self‑hosted)

Looking for advice on:

  1. Best practices for handling larger payloads in n8n CE

  2. Whether streaming/chunking is recommended over passing full data through nodes

  3. Any configuration tweaks (memory, timeout, etc.) that can help

Thanks for your guidance!

hello @cyberengg_automation

Add more resources (RAM, DISK), or optimize your workflow :slight_smile:
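On the configuration side, a few environment variables are worth looking at. This is an illustrative sketch for a self‑hosted deployment; the specific values are examples, not recommendations from this thread:

```shell
# Store binary data on disk instead of holding it in memory
N8N_DEFAULT_BINARY_DATA_MODE=filesystem

# Maximum payload size n8n accepts, in MB (default is 16)
N8N_PAYLOAD_SIZE_MAX=64

# Hard timeout for executions, in seconds (-1 disables it)
EXECUTIONS_TIMEOUT=3600

# Give the Node.js process a larger heap (value in MB)
NODE_OPTIONS="--max-old-space-size=4096"
```

Set these in your Docker Compose file, systemd unit, or shell environment, depending on how you run n8n.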

Avoid Loop nodes, reduce the number of Code nodes, and split the workflow into smaller ones. Use pagination whenever possible.

Keep in mind that n8n keeps all execution data in memory while a workflow is running and releases it only when the execution finishes. The same applies to sub‑workflows; however, each sub‑workflow is processed as a separate execution, so its memory is freed as soon as it completes.

It’s better to handle all the “heavy” work in sub‑workflows so that the execution data of the master workflow stays small.
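The master/sub‑workflow split above can be sketched with a Code node that slices a large dataset into fixed‑size batches, so an Execute Sub‑workflow node can process each batch as its own (smaller) execution. The batch size of 500 and the helper name are assumptions for illustration, not values from this thread:

```javascript
// Hypothetical sketch of an n8n Code node in the master workflow.
// It splits a large array of rows into fixed-size batches; each
// returned item can then be passed to an Execute Sub-workflow node,
// which runs as a separate execution and releases its memory on finish.

const BATCH_SIZE = 500; // illustrative value, tune to your data

function toBatches(rows, size) {
  const batches = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

// Inside an n8n Code node you would typically read the incoming items
// and emit one item per batch, e.g.:
// return toBatches($input.all(), BATCH_SIZE).map(batch => ({ json: { batch } }));
```

This way the master workflow only fans out lightweight batch items, while the memory‑hungry processing happens inside each sub‑workflow execution.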