Increase memory, decrease memory usage, and where to find usage/available memory for tasks

Describe the problem/error/question

I have 4 similar workflows that transfer data between Google Sheets. The two with smaller input files work fine; the two with larger input files run into memory errors. What can I do to increase memory and/or decrease memory usage? I have already moved the memory-intensive step into a sub-workflow, but that single step inside the sub-workflow is still too memory-intensive. Any suggestions?

What is the error message (if any)?

Out-of-memory errors.

Please share your workflow

Schedule Trigger → Execute Workflow (Google Sheets: Read Sheet) → Split in Batches, 5000 rows (Wait 3 sec → Google Sheets: Append Rows) → DONE

Share the output returned by the last node

I have similar workflows with smaller Google Sheets inputs; the output is as expected with those.

Information on your n8n setup

  • **n8n version:** 0.231.2
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
  • Operating system:

Hi @eaferstl, looks like you have missed this.

Not sure if I follow. But if you simply have a Trigger node and then an Execute Workflow node that calls a workflow which does all the work, it will not make a difference at all. After all, all the work is still done within a single workflow execution. You actually have to distribute the work across different workflow executions.

For example:

  • have an outer loop with the primary information (for example, 1000 IDs)
  • loop over them with a SplitInBatches node that always outputs just 10 of those IDs (see the first sketch below)
  • provide those IDs to a separate sub-workflow via the Execute Workflow node
  • that sub-workflow then takes the IDs and does the actual work (for example, queries the data behind the IDs, does some transformation, and writes it somewhere else)
  • make sure the sub-workflow returns no data (only an empty JSON item; see the second sketch below), or the data will leak back into the main workflow and the advantage is lost again
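
A minimal sketch of the batching side, assuming the IDs arrive as a single item containing an array (the field name `ids` and the shape of the data are assumptions, not from the original workflow). A Code node placed just before SplitInBatches expands that array into one item per ID:

```javascript
// Code node ("Run Once for All Items") placed before SplitInBatches.
// Assumption: the previous node returned a single item shaped like
// { json: { ids: [ ... ] } } — the field name `ids` is hypothetical.
const ids = $input.first().json.ids ?? [];

// Emit one item per ID so a SplitInBatches node (batch size 10) can
// hand small chunks to the Execute Workflow node on each loop pass.
return ids.map((id) => ({ json: { id } }));
```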

Only if you do it like that will sub-workflows actually help to reduce the memory footprint.
I hope that makes sense.
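
For the last point, a Code node at the very end of the sub-workflow can discard the processed data and return a single empty item, so nothing flows back to the parent. A minimal sketch (the older Function node behaves the same way):

```javascript
// Final node of the sub-workflow: throw away everything processed
// here and return one empty item, so the parent execution receives
// only an empty JSON object per batch instead of all of the data.
return [{ json: {} }];
```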

Hi Jan,
That is how I have it set up, and the sub-workflow does the heavy lifting. However, the heavy lifting in this case does return a JSON file. So it sounds like there is no solution, then? I simply have a file that is too large for n8n? How can I proceed?
