Google Sheets Update uses too much memory

Describe the problem/error/question

Hey!

I’m building a simple workflow to update a large Google Sheet, but the Google Sheets node seems to load all the data on every update, which crashes n8n even when working one row at a time.

For now I will increase my memory limit, but I suppose this is not the intended behaviour.

It also crashes when I use “Return first row only”; I need to handle it manually with data ranges.

What is the error message (if any)?

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 1.102.3
  • Database (default: SQLite): postgres
  • n8n EXECUTIONS_PROCESS setting (default: own, main): main
  • Running n8n via (Docker, npm, n8n cloud, desktop app): docker

Hey @AntoineDsh, hope you are doing well.
These issues are addressed in the documentation:
Causes: Memory-related errors | n8n Docs
Solutions: Memory-related errors | n8n Docs

Hi Moosa, thank you, but that’s not relevant to my point: as mentioned, I’ve already increased the memory. The Google Sheets node is still sub-optimal by itself; we shouldn’t need to load the whole document into memory to update a single row.

So the problem is that when you add a record to Google Sheets, it loads the whole sheet instead of just adding that record?

Exactly, and that’s a waste of memory and probably of execution time as well.

The problem is in the Get row(s) in sheet3 node.
You have defined the range as A1:AL{{ ($json.i+1)*$json.batch_size+$json.offset }}, which loads every record starting from A1.
You need to narrow that range down so it covers only the specific rows you want.
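To illustrate the fix, here is a minimal sketch of how the range expression could be narrowed so each batch reads only its own rows instead of everything from A1. It reuses the same variables that appear in the workflow’s expression (`$json.i`, `$json.batch_size`, `$json.offset`); the `batchRange` helper name and the `AL` end column are just illustrative assumptions.

```javascript
// Sketch: compute a per-batch A1-notation range instead of reading from A1.
// Assumes the batch variables used in the workflow above:
//   i          - zero-based batch index
//   batchSize  - rows per batch
//   offset     - rows to skip at the top (e.g. the header row)
function batchRange(i, batchSize, offset) {
  const start = i * batchSize + offset + 1;  // first row of this batch
  const end = (i + 1) * batchSize + offset;  // last row of this batch
  return `A${start}:AL${end}`;
}

console.log(batchRange(0, 100, 1)); // "A2:AL101"
console.log(batchRange(1, 100, 1)); // "A102:AL201"
```

In the node itself, the equivalent expression would look something like `A{{ $json.i*$json.batch_size+$json.offset+1 }}:AL{{ ($json.i+1)*$json.batch_size+$json.offset }}`, so the window slides with each batch rather than growing from A1.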

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.