Scaling webhook requests

I’m currently studying how to use n8n as a backend for a SaaS application.

I’ve built a RESTful API using n8n, and I’m receiving incoming requests through a webhook.

My current setup receives all requests in a single webhook, which then directs each request to the appropriate endpoint.
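Conceptually, that single webhook acts as a dispatcher. Here's a rough TypeScript sketch of the pattern (the endpoint names and handlers are made up for illustration, not my actual workflow):

```ts
// One entry point inspects the request and hands it to the right branch,
// much like a Switch node fanning out inside the workflow.
type Handler = (body: unknown) => Promise<unknown>;

const routes: Record<string, Handler> = {
  "users.create": async (body) => ({ created: true, body }),
  "orders.list": async () => [],
};

async function dispatch(endpoint: string, body: unknown): Promise<unknown> {
  const handler = routes[endpoint];
  if (!handler) throw new Error(`unknown endpoint: ${endpoint}`);
  return handler(body);
}
```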

One potential issue with this approach is the risk of a webhook bottleneck since all incoming requests hit the same webhook initially.

I think that using queue mode with separate instances for main, workers, and webhooks, and Redis managing the queue, would solve this potential bottleneck, right? Should I still be concerned about a bottleneck at the webhook level?
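For context, my understanding is that queue mode boils down to a Redis-backed job queue: the webhook process only enqueues executions, and workers consume them. A minimal sketch of that pattern, using BullMQ as a stand-in for n8n's internals (the queue and job names are illustrative):

```ts
import { Queue, Worker } from "bullmq";

const connection = { host: "localhost", port: 6379 }; // Redis

// Webhook process: accept the request, enqueue the execution, respond fast.
const executions = new Queue("executions", { connection });
await executions.add("run-workflow", { workflowId: "my-api", payload: {} });

// Worker process (a separate container): pulls jobs off Redis and does
// the heavy lifting, so the webhook listener stays responsive.
new Worker(
  "executions",
  async (job) => {
    console.log(`executing workflow ${job.data.workflowId}`);
  },
  { connection },
);
```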

Would it be more efficient to use a separate webhook for each endpoint instead?


Information on your n8n setup

  • n8n version: 1.84.1
  • Database (default: SQLite): Postgres
  • n8n EXECUTIONS_PROCESS setting (default: own, main): default
  • Running n8n via (Docker, npm, n8n cloud, desktop app): docker
  • Operating system: Ubuntu 24.04

Remember that Node.js handles web requests (and everything else) on a single thread, in a non-blocking way. The way n8n handles and queues workflow triggers may also be relevant, but that single thread is already a kind of bottleneck, and Node.js already has mitigation for it (promises/async I/O).
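For example, here's a tiny sketch of why a single thread can still serve many concurrent requests (the 100 ms delay is a stand-in for a DB call or a downstream HTTP request):

```ts
import { createServer } from "node:http";

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// While one request awaits the simulated I/O below, the event loop is
// free to accept and start handling other requests; nothing blocks.
createServer(async (req, res) => {
  await sleep(100);
  res.end("done\n");
}).listen(3000);
```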

I would only worry about it after you run a test at the concurrency/load level you expect, and observe an actual problem.
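Something as simple as this is enough for a first smoke test (the URL, payload, and concurrency level are placeholders; adjust them to the load you actually expect):

```ts
// Fire N concurrent requests at the webhook and report rough latency percentiles.
const url = "https://your-n8n-host/webhook/api"; // placeholder
const concurrency = 50;

async function hit(): Promise<number> {
  const start = Date.now();
  const res = await fetch(url, { method: "POST", body: JSON.stringify({ ping: true }) });
  await res.text(); // drain the body so the timing covers the full response
  return Date.now() - start;
}

const timings = (await Promise.all(Array.from({ length: concurrency }, hit))).sort((a, b) => a - b);
console.log(`p50: ${timings[Math.floor(concurrency * 0.5)]} ms`);
console.log(`p95: ${timings[Math.floor(concurrency * 0.95)]} ms`);
```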

If you aren’t already familiar with it, this gives a reasonably good explanation of the Node.js “event loop.”

Awesome. So this might not even be an issue.
Scaling n8n by using multiple workers and webhook instances would probably be enough, right?

It all depends.
Scaling with webhook and worker instances should do the trick in most cases. But of course, whether everything gets processed in time depends on your servers and the resources available.
Best is to offload work to a queue like RabbitMQ, which lets you free up those resources faster.
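For example, a rough sketch of that offloading pattern with amqplib (the queue name and payload are made up):

```ts
import amqp from "amqplib";

const conn = await amqp.connect("amqp://localhost");
const ch = await conn.createChannel();
await ch.assertQueue("work");

// Producer side (e.g. the webhook handler): acknowledge the caller
// immediately and push the real work onto the queue.
ch.sendToQueue("work", Buffer.from(JSON.stringify({ task: "process-order" })));

// Consumer side (a separate process): drain the queue at its own pace.
await ch.consume("work", (msg) => {
  if (msg) {
    console.log("processing", msg.content.toString());
    ch.ack(msg);
  }
});
```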

