Out of Memory

Actually, I am not sure it is the way to go, I am just saying that is the way I went when I created the first version, and no matter if it was a good or bad decision we all are kind of stuck with it now :wink:

That is very strange. I cannot think of a reason why it should crash for memory reasons when there is actually still memory left. The only thing I can imagine is that it sits at around 2GB of memory usage most of the time, but then there is a spike (no matter how short) which makes it run out of memory.

There are currently multiple known temporary memory spikes like that. A small one happens, for example, after every executed node when running a workflow manually: it creates a copy of the data (as it stringifies it) to send it to the frontend. Another, much larger one happens at the end of the execution, when it stringifies the whole execution data to save it in the database.
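As a rough illustration of why stringifying large execution data causes a spike (a minimal Node.js sketch, not n8n's actual code): `JSON.stringify` has to materialise the entire serialized string in memory at once, so for a moment the data effectively exists twice.

```javascript
// Simulate a large in-memory dataset, roughly like execution data
// held by a workflow (hypothetical shape, for illustration only).
const items = Array.from({ length: 100_000 }, (_, i) => ({
  id: i,
  payload: 'x'.repeat(100),
}));

const before = process.memoryUsage().heapUsed;

// This allocates one big string copy of the whole dataset on top of
// the objects themselves -- a short-lived spike, even if the result
// is written out and dropped right away.
const serialized = JSON.stringify(items);

const after = process.memoryUsage().heapUsed;

console.log(`serialized size: ${(serialized.length / 1024 / 1024).toFixed(1)} MB`);
console.log(`heap grew by roughly ${((after - before) / 1024 / 1024).toFixed(1)} MB`);
```

So even a workflow that averages well under the limit can briefly exceed it at serialization points, which would match an OOM crash with "memory left".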

So I am not sure how you run the workflow and where it crashes, but it could be worth a try to trigger it via a Webhook node instead of manually, and to disable saving successful workflow executions.
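For reference, saving of execution data can be controlled via environment variables (variable names as found in the n8n docs; worth double-checking against the version you run):

```shell
# Skip writing successful production executions to the database,
# avoiding the large stringify-and-save spike at the end of a run.
export EXECUTIONS_DATA_SAVE_ON_SUCCESS=none

# Keep failed executions around for debugging.
export EXECUTIONS_DATA_SAVE_ON_ERROR=all

# Also skip saving manually triggered executions.
export EXECUTIONS_DATA_SAVE_MANUAL_EXECUTIONS=false
```

These settings trade execution history for a smaller peak memory footprint, which is exactly the trade-off that matters here.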

Totally on board there! I also hope we are able to improve that soon. @netroy is currently spending a lot of his time on exactly that and has already made a lot of progress.

Hi @jan, yes, I am running all workflows via webhook or cron trigger; manual execution of over 20 nodes always sends me to Valhalla :sweat_smile:. All my tests are run on one instance with only a single test workflow running. I also thought that there could be spikes, but system and instance logs do not show that. The debug-memory image from @netroy was very helpful, I suppose, because it showed exactly how much memory was taken and how much could be cleaned. Let me know if I can help to solve it.


I only get this when handling massive amounts of HTML data, like stupid MBs' worth, in a flow. Other than that I don't get the issue often. It gets to about 6GB of memory, then dies in the browser.

So I am using ChatGPT to refine my flows. Slowly working :stuck_out_tongue_closed_eyes::grin:

It has actually optimised some of my puppeteer scripts pretty well, tbh.

Yeah, ChatGPT can do some work on that if it gets it right :wink:

Executing anything via manual execution in the browser only works on smaller workflows, and there is a major delay between the status visible in the UI and where the workflow actually is. Sometimes the animations on my side only finish 10-15 minutes after the workflow has already executed completely.

I suppose there is little the n8n team can do about the browser part, but executing stupid MB amounts of HTML data should work without any crash like this

of course only limited by your machine's resources, and only without manual execution.


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.