Memory consumption in loop

Hello!

I have to process a lot of records from a Postgres table in a loop, step by step in quite long sequences. I found that while the flow processes this loop, memory consumption grows dramatically in a linear progression.
I moved the nodes from the loop into a different flow and called it via an HTTP request. This helped me get lower memory consumption.
This is the test flow; it gets records one by one from one table and inserts them into another table.

This is the memory consumption:

Is this normal behavior for loop processing?

Table size: 1888 KB
Record count: 18409

n8n version: 1.41.0
database: postgres
running: docker
os: Debian GNU/Linux 9.6 (stretch)

It looks like your topic is missing some important information. Could you provide the following, if applicable:

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

The best way to solve this is to process batches in a sub-workflow. You can read more tips on memory consumption here.


@bartv Thanks for your answer. I created a simpler example consisting of two workflows: an HTTP request is called in a loop and returns the same JSON.

I got this memory consumption graph:

Is this normal? Do I really need to create a subflow instead of an HTTP request? Can I optimize memory consumption some other way?
