I’m building a simple workflow to update a large Google Sheet, but the Google Sheets node seems to load all the data on every update, which crashes n8n even when I work one row at a time.
For now I will increase my memory limit, but I assume this behavior is not intended.
It also crashes when I use “Return first row only”; I have to handle it manually with data ranges.
What is the error message (if any)?
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
Please share your workflow
Share the output returned by the last node
Information on your n8n setup
n8n version: 1.102.3
Database (default: SQLite): postgres
n8n EXECUTIONS_PROCESS setting (default: own, main): main
Running n8n via (Docker, npm, n8n cloud, desktop app): docker
Hi Moosa, thank you, but that’s not relevant to my point; I’ve already increased the memory limit as mentioned. The Google Sheets node is still sub-optimal by itself: we shouldn’t need to load the whole document into memory to update a single row.
The problem is in the “Get row(s) in sheet3” node.
You have defined the range as A1:AL{{ ($json.i+1)*$json.batch_size+$json.offset }}, which loads all the records starting from A1.
You need to narrow that range down to your specific record.
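As a sketch of that fix (assuming `i` is a zero-based batch index, `batch_size` is the number of rows per batch, and `offset` accounts for header rows, based on the field names in the workflow's expression), the range's start row can track the current batch instead of always being A1:

```javascript
// Sketch: compute a per-batch range instead of always reading from A1.
// Assumes i = zero-based batch index, batch_size = rows per batch,
// offset = header rows to skip (field names taken from the workflow).
function batchRange(i, batchSize, offset) {
  const startRow = i * batchSize + offset + 1;  // first row of this batch
  const endRow = (i + 1) * batchSize + offset;  // last row of this batch
  return `A${startRow}:AL${endRow}`;
}

// The original expression's end row was (i+1)*batch_size+offset, same as
// endRow above, but its start row was fixed at 1, so every batch re-read
// the sheet from the top.
```

In an n8n expression this would look like `A{{ $json.i*$json.batch_size+$json.offset+1 }}:AL{{ ($json.i+1)*$json.batch_size+$json.offset }}`, so each execution only fetches the rows for the current batch.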