How to avoid a race condition with parallel jobs started by a callback?

Hi there,

I’m dealing with a situation where I have an overactive callback being fired twice within milliseconds. In my workflow, I read from a DB to make sure I haven’t already performed the job, I complete the job, and then write to the DB that the job is done, which takes about a second. However, occasionally the second callback call reads the DB before the first has had a chance to write to the DB, causing the process to run twice.
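For what it's worth, this is a classic check-then-act race: the read and the write are not atomic, so two near-simultaneous callbacks can both pass the "already done?" check. A minimal Python sketch (all names hypothetical, with a set standing in for the DB flag) of the usual fix, funneling every callback through a single-consumer queue so the check and the write can never interleave:

```python
import queue
import threading

done = set()       # stands in for the "job is done" flag in the DB
processed = []     # records how many times the job actually ran

def handle(job_id):
    # check-then-act: read the flag, do the work, then write the flag
    if job_id in done:
        return
    processed.append(job_id)   # the ~1 second job would happen here
    done.add(job_id)

# All callbacks go through one queue with a single worker thread,
# so jobs run strictly one after another.
jobs = queue.Queue()

def worker():
    while True:
        job_id = jobs.get()
        if job_id is None:     # sentinel: shut the worker down
            break
        handle(job_id)
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()

# The overactive callback fires twice within milliseconds for the same job:
jobs.put("job-42")
jobs.put("job-42")
jobs.put(None)
t.join()

# processed now contains the job exactly once
```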

I’ve added a Wait that delays the read for a random number of seconds, but obviously this isn’t a 100% solution. The ideal solution would be to force the job to run serially to other instances of that workflow. Any idea how to do that? (This feels like it should be a Workflow setting.)



I had the same problem. I ended up using RabbitMQ. I don’t know the ins and outs of RabbitMQ, but it is very simple to use for this.

Set up a RabbitMQ Docker container.
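If you don’t already have a broker running, something like this is a common way to start one locally (the image tag and container name are just one choice):

```shell
# Official RabbitMQ image with the management UI on port 15672;
# n8n connects to the AMQP port 5672.
docker run -d --name rabbitmq \
  -p 5672:5672 -p 15672:15672 \
  rabbitmq:3-management
```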

Create a workflow that sends data to RabbitMQ, and add a RabbitMQ Trigger node to the workflow you want to run sequentially.

Here is an example.

This is the workflow to send data to RabbitMQ

This is the workflow with the RabbitMQ trigger

Activate the trigger in n8n, and keep in mind that in the RabbitMQ Trigger node you have to be careful to set the Parallel Message Processing Limit option to 1.

The way it works is: when you start a job, you send a message to RabbitMQ, and the RabbitMQ trigger reads the queue sequentially. If the job fires again, the duplicate message simply waits on the RabbitMQ server. The n8n RabbitMQ trigger node finishes processing the first message from the queue before it picks up the next one.


Brilliant. The RabbitMQ solution is working like a charm. Thanks a lot :100:


For anyone who is having trouble finding this option: it’s at the bottom of the Options list, often hidden, and there is no scroll bar indicating the menu has more options. I felt pretty dumb when I finally found it (thanks to this post). Now RabbitMQ is doing exactly what I need it to: queueing dozens of incoming webhooks and processing them one at a time. It is trivial to implement — just start up a plain RabbitMQ container and use the same queue name in the sender and receiver, and it will “just work”.

