Memory usage issue

I have a workflow that pulls data from a URL, and since the data is paginated, the HTTP request is made more than once. The workflow was working fine, but today I ran into extreme memory usage while trying to run it; the n8n URL doesn’t even load, and each time I have to stop and start the service.
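
Roughly, the loop looks like this (a simplified sketch, not the actual node configuration; the endpoint, the `page` parameter, and the response shape are just placeholders):

```typescript
// Simplified sketch of the pagination loop described above, not the real workflow.
// The endpoint, `page` parameter, and response shape are placeholders.
async function pullAllPages(baseUrl: string): Promise<unknown[]> {
  const allItems: unknown[] = [];
  let page = 1;

  while (true) {
    const response = await fetch(`${baseUrl}?page=${page}`);
    const batch: unknown[] = await response.json();
    if (batch.length === 0) break; // no more pages

    // Every page stays in memory until the whole pull finishes,
    // so memory use grows with each request.
    allItems.push(...batch);
    page += 1;
  }

  return allItems;
}
```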

How can I solve this? Please help.


Information on your n8n setup

  • **n8n version:** 0.205
  • **Database (default: SQLite):** Postgres
  • **n8n EXECUTIONS_PROCESS setting (default: own, main):** main
  • **Running n8n via (Docker, npm, n8n cloud, desktop app):** Docker
  • **Operating system:** Windows

Hi @Meghna_jose :wave: Sorry to hear you’re running into this!

How much data are you pulling in through the HTTP request node, and is it making a lot of requests in a loop?

Could you move the HTTP request to its own sub-workflow? There are also a few other tips here that you could take a peek at: Memory-related errors | n8n Docs
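
One of the general ideas from that page is to pass less data between nodes. As a rough sketch (assuming a Code node in "Run Once for All Items" mode, with placeholder field names), you could trim each item down to just the fields the rest of the workflow needs before it gets passed along:

```typescript
// Rough sketch for an n8n Code node ("Run Once for All Items" mode).
// Keeps only the fields downstream nodes need, so less data is held in
// memory between nodes. The field names (id, name, value) are placeholders.
const slimmed = $input.all().map((item) => ({
  json: {
    id: item.json.id,
    name: item.json.name,
    value: item.json.value,
  },
}));

return slimmed;
```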

I’ve also noticed you’re on a much older version of n8n - there have likely been a lot of changes since 0.205 that may also help with this.

Hi @EmeraldHerald ,
The HTTP request doesn’t pull a large amount of data; it’s just an array of objects with 50 data points, and it loops a maximum of 3 times.
As for the version update, I can’t update to the latest version because when I tried it before, there were multiple issues with the Code node: scripts that had been working fine suddenly had syntax issues and some functions were invalid. There are also around 100 workflows, which makes it difficult to redo them all.
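
To illustrate the kind of change involved, here is a rough before/after sketch of moving from the older Function-node-style globals to the newer Code node syntax (placeholder field names; the exact breakages will depend on the individual scripts):

```typescript
// Rough before/after sketch of the Function-node-to-Code-node change.
// Field names (price, qty, total) are placeholders.

// Older Function node style: input arrives as a global `items` array.
// return items.map((item) => {
//   item.json.total = item.json.price * item.json.qty;
//   return item;
// });

// Newer Code node style ("Run Once for All Items"): read input via $input.all().
const updated = $input.all();

for (const item of updated) {
  item.json.total = item.json.price * item.json.qty;
}

return updated;
```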
