Execute workflow instances one at a time (queue) vs concurrently

I have a workflow that forwards messages from WhatsApp to a Slack channel and groups discussions with different customers using Slack threads.

The problem I often see is that when messages come in quickly one after another, order is not guaranteed: I can post Msg1, Msg2 to WhatsApp and get Msg2, Msg1 in Slack.

The logic in the workflow is that, while posting Msg2, it searches for existing messages from the same person and creates a thread in Slack if needed. This also often fails because Slack search does not return updated data quickly enough, so when Msg1 and Msg2 are posted at the same time concurrently, it doesn't work.
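To make it concrete, the per-message logic is roughly equivalent to this sketch (the channel ID, token handling, search query and the use of Slack's search.messages here are just illustrative, not my exact nodes):

```
// Illustrative only: roughly what the workflow does per incoming message.
// Assumes @slack/web-api, a user token with search:read, and a placeholder channel ID.
import { WebClient } from '@slack/web-api';

const slack = new WebClient(process.env.SLACK_TOKEN);
const CHANNEL = 'C0123456789'; // placeholder channel ID

async function forwardToSlack(phone: string, text: string): Promise<void> {
  // Look for an earlier message from the same customer so we can reuse its thread.
  const search = await slack.search.messages({ query: `in:#whatsapp-inbox "${phone}"` });
  const existing = search.messages?.matches?.[0];

  await slack.chat.postMessage({
    channel: CHANNEL,
    text,
    // Reply in the existing thread if one was found, otherwise start a new top-level message.
    thread_ts: existing?.ts,
  });
}
```

When Msg1 and Msg2 run concurrently, both searches come back empty (or stale), so Msg2 either starts a second top-level message or misses the thread it should have replied to.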

So the question is: can I force all executions of the same workflow to run sequentially rather than concurrently? Ideally I would even like to throttle these executions and have a guaranteed 1-second delay between them.

n8n version: latest cloud beta


It looks like your topic is missing some important information. Could you provide the following if applicable.

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

  • n8n version: Cloud 1.64
  • n8n EXECUTIONS_PROCESS setting (default: own, main): N/A

Hello @Igor_77,

n8n doesn't have a built-in method for implementing a queue mechanism. You can use a third-party message broker like RabbitMQ, or create custom logic with workflow static data.
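As a rough illustration of the static-data idea, something like this in a Code node could track when the last message from a given customer was handled ($getWorkflowStaticData is the documented n8n helper; the per-phone bookkeeping and the `phone` field are just assumptions about your data):

```
// Minimal sketch of the static-data idea inside an n8n Code node.
const staticData = $getWorkflowStaticData('global');
const item = $input.first().json;

// Remember when we last handled a message from this phone number.
staticData.lastRunByPhone = staticData.lastRunByPhone || {};
const last = staticData.lastRunByPhone[item.phone] || 0;
const tooSoon = Date.now() - last < 1000;
staticData.lastRunByPhone[item.phone] = Date.now();

// A later IF node could route "tooSoon" items through a Wait node before posting to Slack.
return [{ json: { ...item, tooSoon } }];
```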

Thanks for the suggestion. I've looked into static data, but since it is only saved when the workflow completes, not instantly, it won't solve my issue.
I think I need a kind of mutex here: I will mark that a specific phone number is already chatting right at workflow start, check this flag later in the workflow, and probably just redo the previous-message search if nothing is found, or something like that. It's a pity some simple storage, e.g. Redis, is not part of the app, but I can probably find something free in the cloud for this.
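For reference, the per-customer mutex I have in mind would be roughly this (a sketch only, assuming a hosted Redis reachable via REDIS_URL and ioredis as the client; the key names and TTL are made up):

```
// Sketch of a per-customer lock. SET ... NX EX is plain Redis, nothing n8n-specific.
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');

// Take a short-lived lock for one phone number; true means we got it.
async function acquireLock(phone: string, ttlSeconds = 30): Promise<boolean> {
  const result = await redis.set(`lock:whatsapp:${phone}`, '1', 'EX', ttlSeconds, 'NX');
  return result === 'OK';
}

async function releaseLock(phone: string): Promise<void> {
  await redis.del(`lock:whatsapp:${phone}`);
}

// Serialize executions per customer: wait until the previous one releases the lock,
// then run the Slack search + post, then release.
async function withCustomerLock(phone: string, fn: () => Promise<void>): Promise<void> {
  while (!(await acquireLock(phone))) {
    await new Promise((resolve) => setTimeout(resolve, 1000)); // ~1 s between retries
  }
  try {
    await fn();
  } finally {
    await releaseLock(phone);
  }
}
```

The 1-second retry loop would also give me roughly the spacing between executions I was asking about.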