Recommendations to use less memory during a workflow

Hello,

My workflow processes a lot of data (parsing web pages of about 4 MB each) and always crashes with “FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory”.
I’m self-hosting n8n on my Synology with 2 GB of RAM.

I’ve seen somewhere that using the splitInBatches node can free some RAM during processing.
Considering my workflow, do you have any recommendations to avoid this error?

This is the first part of the workflow (the rest is not written yet, but I’ll put the data in a Supabase database and a Qdrant vector store).

Information on your n8n setup

  • n8n version: 1.95.2
  • Database (default: SQLite): SQLite
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Linux/DSM

Hello @Eaglewatch,

Move the Loop node functionality into a sub-workflow, as it is what causes the high memory drain.
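On top of the sub-workflow change, a self-hosted 2 GB box usually benefits from tuning the container itself. As a sketch (not from the original reply, and the values are illustrative assumptions for a 2 GB host), you can raise the Node.js heap ceiling and keep binary payloads on disk instead of in memory via environment variables when starting the Docker container:

```shell
# Sketch: run n8n with a larger V8 heap and binary data stored on disk.
# NODE_OPTIONS raises the Node.js heap limit (1536 MB is an example value
# chosen for a 2 GB host; adjust to what your NAS can spare).
# N8N_DEFAULT_BINARY_DATA_MODE=filesystem makes n8n write binary payloads
# to disk instead of holding them in RAM during an execution.
docker run -d --name n8n \
  -p 5678:5678 \
  -v n8n_data:/home/node/.n8n \
  -e NODE_OPTIONS="--max-old-space-size=1536" \
  -e N8N_DEFAULT_BINARY_DATA_MODE=filesystem \
  docker.n8n.io/n8nio/n8n
```

If you use docker-compose on DSM, the same two variables go under the service’s `environment:` section.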


Hello,

Thank you, this works like a charm.
