I have a workflow that should loop about 170 times over the following nodes:
Get HTTP Request Results > Convert Results into desired format > Insert Results into MSSQL Server
The reason it loops 170 times is that the API only returns 100 records per request, and the dataset contains about 17,000 records.
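For clarity, the loop count simply comes from the page size, roughly like this (the names are mine, not from the workflow):

```javascript
// Pagination math behind the loop: ~17,000 records fetched 100 at a time.
const TOTAL_RECORDS = 17000; // approximate size of the dataset
const PAGE_SIZE = 100;       // API limit per request

// Number of HTTP requests (loop iterations) needed to fetch everything.
const numPages = Math.ceil(TOTAL_RECORDS / PAGE_SIZE);
console.log(numPages); // → 170
```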
The workflow runs fine up until iteration 80-85. Then it keeps loading forever:
So I ran the workflow again while monitoring memory usage with `docker stats`.
This is the status at loop 85, when the workflow stops responding:
As you can see, the memory usage is very high: 2+ GB for `n8n_n8n_1`, which normally sits around 140 MB when idle.
The database table takes up about 5 MB once loaded into SQL Server, so it is not a huge table with lots of rows and columns. What could I do to solve this problem?