Worker processes question

Hello!

I have a workflow that receives a sheet with a lot of lines (each line corresponds to a lead, with their information). Each line gets processed one at a time, but it takes a long time (1.5s per line). Is there a solution to receive the sheet and create some workers to run multiple batches at a time?

PS: The sheet is uploaded to Amazon S3, and the Amazon SNS Trigger is used to listen for the upload.

Edit: If I create 2 instances of n8n, use the first to collect the sheet and split it, then send the batches via API without waiting for a response, and receive the data in the second instance running in queue mode (to limit the number of simultaneous executions), might that work?

Thanks!

Information on your n8n setup

  • n8n version: 0.210.2
  • Database you’re using: PostgreSQL
  • Running n8n via Docker

Welcome to the community @maycon! Great to have you with us!

Yes, that sounds about right. You can configure the workers (via the --concurrency flag on the n8n worker command) to run multiple workflow executions in parallel; by default it is set to 10. How many a worker can run without issues depends on the workflows you are running and the specs of the machine the worker is running on.

But there should be no need for two totally separate n8n instances for that; one instance in queue mode should be enough. After all, you can also start multiple workflows on the same instance by calling a Webhook node via the HTTP Request node.
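To illustrate the fan-out pattern that the HTTP Request → Webhook approach relies on (this is a minimal standalone sketch, not n8n node code; the webhook URL, batch size, and data shape are assumptions):

```ts
type Lead = Record<string, string>;

async function dispatchBatches(leads: Lead[], batchSize = 50): Promise<void> {
  // Hypothetical webhook URL of the workflow that processes one batch.
  const webhookUrl = "https://your-n8n-host/webhook/process-batch";

  const requests: Promise<unknown>[] = [];
  for (let i = 0; i < leads.length; i += batchSize) {
    const batch = leads.slice(i, i + batchSize);
    // Kick off the request and collect the promise instead of awaiting each one,
    // so every batch is handed off before any of them finishes processing.
    requests.push(
      fetch(webhookUrl, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ batch }),
      }),
    );
  }

  // Wait only for the HTTP handoff, not for the workflow executions themselves.
  await Promise.all(requests);
}
```

The Webhook workflow should be set to respond as soon as it receives the request rather than when it finishes, so the handoff stays fast and queue mode decides how many batch executions actually run at the same time.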

But if you are not in a rush (i.e. it does not matter whether they are processed within 5 minutes or 5 hours), there are also simpler ways. For example, send them all to RabbitMQ and configure the RabbitMQ Trigger node to only process X messages in parallel.
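For the publishing side, you can use n8n's own RabbitMQ node, or a small script outside n8n. As a rough sketch using the amqplib package (queue name, connection URL, and lead shape are placeholders, not part of the original setup):

```ts
import amqplib from "amqplib";

type Lead = Record<string, string>;

async function publishLeads(leads: Lead[]): Promise<void> {
  const conn = await amqplib.connect("amqp://localhost");
  const channel = await conn.createChannel();
  await channel.assertQueue("leads", { durable: true });

  for (const lead of leads) {
    // One message per lead; the RabbitMQ Trigger workflow then pulls them
    // at whatever parallelism you configure on the trigger node.
    channel.sendToQueue("leads", Buffer.from(JSON.stringify(lead)), { persistent: true });
  }

  await channel.close();
  await conn.close();
}
```

The "only process X in parallel" part then lives in the trigger node's settings rather than in code.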
