I'm trying to update some information via an API request; after the request I get about 12.5k items in the node.
To reduce the amount of data held in memory, I've started using MariaDB to store it.
So when I need the data, I run a SELECT to retrieve those items, loop over them to send each one via another HTTP request, and then store a log entry in a MariaDB table.
But my workflow stops: during the SQL executions it sometimes shows a memory error and sometimes a heap out-of-memory error.
I tried splitting it into multiple workflows, one flow after another. To my surprise, if I start the flows manually one after another, it sometimes works.
With a Schedule node or any other trigger, it doesn't work.
How can I give more memory to my Node process using npm + pm2?
Is it possible to use multiple threads of my server for a single worker?
Is there any chance the MariaDB (MySQL) node has a bug?
Without seeing the workflows, it's hard to say where the bottleneck is.
Generally, you want a design like this:
workflow1 fetches the API and puts the results in the DB
workflow2 sets the paging properties for the sub-workflow (workflow3): start page, next page, and so on
workflow3 (the sub-workflow called by workflow2) receives the page properties from the parent workflow2, selects the corresponding page from the DB (e.g. retrieves 20 items with a single query), and works with those results. At the end, it passes the next-page properties back to the parent workflow (sketched below).
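A minimal sketch of the paging step inside workflow3, written as an n8n Code node. The table name `items`, the column names, and the `page` property are assumptions about your schema, not something from your actual workflow — rename them to match:

```javascript
// Paging sketch for workflow3 (n8n Code node).
// `items`, `payload` and the `page` property are hypothetical names.
const pageSize = 20;
const page = $input.first().json.page ?? 0; // passed in by workflow2

// Query for the MariaDB/MySQL node: fetch exactly one page of rows.
const query = `SELECT id, payload
               FROM items
               ORDER BY id
               LIMIT ${pageSize}
               OFFSET ${page * pageSize}`;

// ...the next nodes send each row via HTTP Request and write the log entry...

// Hand the next-page pointer back to the parent (workflow2),
// so it can trigger the next execution of this sub-workflow.
return [{ json: { query, nextPage: page + 1 } }];
```

Because each page runs as its own sub-workflow execution, a 20-row batch is the most the execution ever has to hold in memory at once.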
This may look overly complicated, and you may wonder why you need it instead of plain loops. The goal is to perform the “hard” operations for each page (or batch) in a separate execution, so that n8n can release the memory after each execution finishes (i.e. after each portion of the results has been processed).
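As for your pm2 question: the usual way to raise the heap limit is V8's `--max-old-space-size` flag. A minimal sketch of a pm2 ecosystem file, assuming n8n was installed globally via npm; the 4096 MB value is only an example, size it to your server's RAM:

```javascript
// ecosystem.config.js — sketch for running n8n under pm2 with a larger heap.
// 4096 MB is an example value, not a recommendation.
module.exports = {
  apps: [
    {
      name: 'n8n',
      script: 'n8n', // assumes a global npm install of n8n
      env: {
        // V8 heap limit in MB, picked up by the Node process that runs n8n.
        NODE_OPTIONS: '--max-old-space-size=4096',
      },
    },
  ],
};
```

Start it with `pm2 start ecosystem.config.js`. Keep in mind that a bigger heap only delays the crash if the workflow still loads all 12.5k items at once; the batching design above is the real fix.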