Hello everyone,
I discovered n8n a short time ago and immediately saw how useful it could be for me.
I regularly have recurring automated tasks to perform, such as syncing data from various sources into multiple databases.
So far I have built quite complex workflows (with code) to do this work.
But now I’m stuck! Not because of the workflow logic, but because of the memory it uses.
Let me explain: every day I have to read a 500,000-line CSV file and import it into a database.
Doing it in a single pass did not work well.
My plan “B” was to split this file into several smaller files. The list of file names is then fed into the SplitInBatches node, which passes each file name to a sub-workflow that performs the actual data import.
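For context, the splitting step can be sketched in Python like this (a minimal illustration, not my actual workflow code; the paths, function name, and chunk size are placeholders):

```python
import csv
import os

def split_csv(src_path, out_dir, lines_per_chunk=10_000):
    """Split a large CSV into smaller files, repeating the header in each chunk
    so every chunk is a valid standalone CSV. Returns the list of chunk paths."""
    os.makedirs(out_dir, exist_ok=True)
    chunk_paths = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header to copy into each chunk
        out_file = None
        writer = None
        count = 0
        for row in reader:
            # Start a new chunk file when none is open or the current one is full.
            if writer is None or count == lines_per_chunk:
                if out_file:
                    out_file.close()
                path = os.path.join(out_dir, f"chunk_{len(chunk_paths) + 1:04d}.csv")
                out_file = open(path, "w", newline="")
                writer = csv.writer(out_file)
                writer.writerow(header)
                chunk_paths.append(path)
                count = 0
            writer.writerow(row)
            count += 1
        if out_file:
            out_file.close()
    return chunk_paths
```

Because the rows are streamed one at a time, this step itself only ever holds a single row in memory; it is the looping over the resulting files inside n8n where the memory growth appears.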
Logically everything works, except that as the loop runs, memory usage keeps increasing. I expected the RAM to be freed as each batch completes, but it is not.
Has anyone handled a similar case? That would be my plan “C”.
Thank you all for reading.