Hi, I built a workflow that fetches all posts on a forum via an open API and inserts them into a PostgreSQL database. The workflow quickly throws a JavaScript out-of-memory error.
The API does not require any credentials in case you want to test it.
The main workflow only uses the endpoint cursors as data.
The subworkflow gets / processes / stores all the data (and should clear memory for the next batch).
The subworkflow handles data in batches of 25.
For information, our server has 10 GB of RAM allocated; the workflow seems to crash after around 7,300 rows processed.
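To make the setup above concrete, here is a rough, self-contained sketch of the cursor-driven batching pattern described (the API shape, `makeMockApi`, and `collectBatches` are hypothetical stand-ins for illustration, not the real forum API or the actual n8n nodes):

```javascript
// Hypothetical stand-in for a cursor-paginated open API.
// Each call returns a page of posts plus a cursor for the next page,
// or a null cursor when the data is exhausted.
function makeMockApi(totalPosts, pageSize) {
  return function fetchPage(cursor = 0) {
    const posts = [];
    for (let i = cursor; i < Math.min(cursor + pageSize, totalPosts); i++) {
      posts.push({ id: i, body: `post ${i}` });
    }
    const nextCursor = cursor + pageSize < totalPosts ? cursor + pageSize : null;
    return { posts, nextCursor };
  };
}

// Walk the cursors (the main workflow's job) and split each page into
// batches of 25 (the sub-workflow's job) until the cursor runs out.
function collectBatches(fetchPage, batchSize = 25) {
  const batches = [];
  let cursor = 0;
  while (cursor !== null) {
    const { posts, nextCursor } = fetchPage(cursor);
    for (let i = 0; i < posts.length; i += batchSize) {
      batches.push(posts.slice(i, i + batchSize));
    }
    cursor = nextCursor;
  }
  return batches;
}
```

In this pattern each batch would be handed to the sub-workflow for storage; the memory question is whether the parent execution keeps references to already-processed batches alive.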
Hi @Eliott_Audry, I am so sorry for this. I tried manually running your workflow and was indeed able to see memory consumption creeping up continuously, more than expected, even when not using queue mode. I suspect this could be related to the Code node. Can you try adding a Set node at the end of your sub-workflow and verify whether this reduces the memory consumption for you?
Something like this:
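In plain JavaScript terms, the idea behind the trailing Set node is roughly this (a hypothetical sketch, not the actual node configuration): have the sub-workflow return only a tiny summary instead of the full processed items, so the parent workflow never accumulates the heavy post bodies.

```javascript
// Hypothetical illustration: instead of passing all processed items
// back to the parent workflow, return a single small summary item.
// The full post data then becomes garbage-collectable once the batch
// has been written to the database.
function summarizeBatch(items) {
  // Mirrors n8n's item shape ({ json: ... }) but keeps only a count.
  return [{ json: { processed: items.length } }];
}
```

The design point is that whatever the last node of a sub-workflow outputs is what the parent execution holds on to, so shrinking that output should shrink the parent's working set.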
I am currently on iteration 328 with the memory usage being fairly consistent at around 480 MB:
To be honest, I don’t fully understand this one either.
Glad to see it’s a “not just me” situation though. I shall add this possible memory leak to our bug tracker for a closer look and fix by the engineering team.