MaxListenersExceededWarning in queue mode

Describe the problem/error/question

Hello, I have noticed this warning showing up in my “main” instance, and it happened a few times today:

During that time, for a few minutes, I wasn’t able to edit a workflow in the editor (and who knows if my “main” process, which handles webhook reception, was also unreachable…).

It seems the default listener limit for a Node.js EventEmitter is 10… I’m not sure whether I have to change that value manually or n8n adjusts it automatically… My current setup is 1 main instance + 5 workers (concurrency 10 each).

About the workers, I have to say everything has been working great. I can see each worker’s logs and they seem to be processing data, but I’m not actually sure whether I ever have more than 10 executions running at once because of that EventEmitter listener limit…?
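From what I read, that limit of 10 is Node.js’s generic per-emitter listener cap (EventEmitter.defaultMaxListeners), not an n8n concurrency setting. A minimal plain-Node sketch of what triggers the warning (the event name here is just illustrative, not n8n internals):

```ts
// Plain Node.js sketch of what the warning means -- not n8n internals.
import { EventEmitter } from 'node:events';

console.log(EventEmitter.defaultMaxListeners); // 10 -- the default cap per event name

const queue = new EventEmitter();

// Attaching an 11th 'global:completed' listener crosses the cap and prints:
//   MaxListenersExceededWarning: Possible EventEmitter memory leak detected. ...
for (let i = 0; i < 11; i++) {
  queue.on('global:completed', () => {});
}

// The cap is only a leak-detection heuristic: the listeners are still added and
// nothing is blocked. Raising it (as the warning suggests) silences the notice:
queue.setMaxListeners(20);
```

So, if I understand correctly, exceeding the cap only prints the warning; the listeners are still registered, which is why I’m unsure whether it actually limits anything in my setup.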

What is the error message (if any)?

(node:1) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 global:completed listeners added to [Queue]. Use emitter.setMaxListeners() to increase limit

(node:1) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 message listeners added to [WebSocketPush]. Use emitter.setMaxListeners() to increase limit

Information on your n8n setup

  • n8n version: 1.18.0
  • Database (default: SQLite): PostgreSQL
  • n8n EXECUTIONS_PROCESS setting (default: own, main): Queue
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker image
  • Operating system: Railway

Still seeing this error in the logs:

Not sure what to do…

I checked the following posts, but they don’t seem to help with my problem:

What exactly does “memory leak detected” mean? My instance has 32 GB of RAM… Could this error cause main to drop webhooks? Or is it just a warning I can ignore?

Hi @yukyo, I am sorry you are having trouble. I’ve seen a few of these warnings myself but never noticed any actual side effects, so I didn’t pay much attention. I definitely didn’t run into any of the editor problems you described:

During that time, for a few minutes, I wasn’t able to edit a workflow in the editor (and who knows if my “main” process, which handles webhook reception, was also unreachable…).

Can you confirm when exactly this happens? As in, was your main instance under high load at the time (with CPU and/or memory consumption reaching 100%)?

With regard to n8n being unable to process incoming webhooks, you might want to review your HTTP access logs for response times and status codes. If you notice n8n struggling to process your webhooks, consider adding additional webhook processors behind a load balancer. This is documented at Configuring queue mode | n8n Docs.

Hi @MutedJam, in the last few days I haven’t seen the error, and there was no impact on performance or usability. My instance is not under high load; I have allocated 64 GB of RAM and 32 CPUs to the instance where n8n is running, and it only went up to 5% of capacity…

Interestingly, on the same day I had this issue I updated to 1.18.0, which seems really buggy, and that is what caused the editor issues: “Connection lost” messages and not being able to save changes to workflows.

To confirm, I returned to 1.16.0; the editor worked smoothly, with no network errors. Everything was fine.

I updated again to 1.18.0, and suddenly got connection timeouts in the editor, network errors, couldn’t save, etc.

I downgraded back to 1.16.0 again, which is what I’m running now…

It would be interesting to find the root cause of this. It doesn’t seem to be causing any issues right now, but… under higher load it could start behaving strangely.

One last thing: it’s hard for me to track errors in my logs, because 99.99999% of the errors are this one:

I already reported it, but it seems it still hasn’t been fixed. I know it doesn’t affect n8n itself, but it’s tough to debug or search the logs when that message shows up every millisecond, up to 20 times.

Thanks.

A new n8n version has been released which includes GitHub PR 10077.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.