Is it possible to run a part of the workflow in parallel?

Describe the problem/error/question

I have a workflow that receives two pieces of information from a trigger form and then sends each one in a separate HTTP request. It would be great if the two requests could happen in parallel. Is that possible, and if so, how can I run these two nodes in parallel?

Please share your workflow

*This is just the main idea of the workflow described above. In the real workflow, “Process Info1” and “Process Info2” would be HTTP requests that should run in parallel.

Information on your n8n setup

  • n8n version: 1.63.4
  • Running n8n via n8n cloud

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Welcome to the community @POC_Most!

Tip for sharing information

Pasting your n8n workflow


Make sure to copy your n8n workflow and paste it inside a code block, that is, between a pair of triple backticks. You can also do this by clicking </> (preformatted text) in the editor and pasting in your workflow.

```
<your workflow>
```

The same applies to any JSON output you would like to share with us.

Make sure that you have removed any sensitive information from your workflow and include dummy or pinned data with it!


Switch the workflow execution order to the legacy v0 in the workflow settings.

The difference is

  • v0 (legacy) executes the first node of each branch, then the second node of each branch, and so on.
  • v1 (recommended) executes each branch in turn, completing one branch before starting another. n8n orders the branches based on their position on the canvas, from topmost to bottommost. If two branches are at the same height, the leftmost branch executes first.
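The two orderings can be sketched in plain JavaScript like this. Note this only simulates the scheduling to show the difference, it isn’t n8n code, and the node names are made up:

```javascript
// Hypothetical two-branch workflow: each branch has two nodes.
const branches = [
  ["Process Info1", "Save Info1"],
  ["Process Info2", "Save Info2"],
];

// v0 (legacy): breadth-first -- the first node of each branch,
// then the second node of each branch, and so on.
function v0Order(branches) {
  const order = [];
  const depth = Math.max(...branches.map((b) => b.length));
  for (let step = 0; step < depth; step++) {
    for (const branch of branches) {
      if (branch[step]) order.push(branch[step]);
    }
  }
  return order;
}

// v1 (recommended): depth-first -- complete one branch before
// starting the next (topmost branch on the canvas first).
function v1Order(branches) {
  return branches.flat();
}

console.log(v0Order(branches));
// [ 'Process Info1', 'Process Info2', 'Save Info1', 'Save Info2' ]
console.log(v1Order(branches));
// [ 'Process Info1', 'Save Info1', 'Process Info2', 'Save Info2' ]
```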

I managed to get parallel behavior using sub-workflows, and then wait for everything to finish before proceeding. It’s a little more involved to set it up because you have to use HTTP Request nodes, Webhooks, and wait with callback. In other words, to make it worth the trouble, you have to REALLY NEED things to run concurrently.
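To give a feel for the mechanism, here is a rough sketch in plain Node (not n8n code, all names hypothetical): the parent registers a resolver per sub-workflow, which plays the role of the Wait node’s resume URL, and each sub-workflow later “calls back” to resolve it.

```javascript
// Map of pending callbacks, keyed by sub-workflow id.
const callbacks = new Map();

// Parent side: start a sub-workflow and get a promise that
// resolves when its callback arrives.
function startSubWorkflow(id) {
  const done = new Promise((resolve) => callbacks.set(id, resolve));
  // Simulate the sub-workflow doing work, then hitting the resume webhook.
  setTimeout(() => callback(id, `${id}: ok`), 100);
  return done;
}

// Sub-workflow side: the final HTTP Request to the parent's resume URL.
function callback(id, result) {
  callbacks.get(id)(result);
  callbacks.delete(id);
}

(async () => {
  // Fan out: all sub-workflows start immediately...
  const pending = ["sub-a", "sub-b", "sub-c"].map(startSubWorkflow);
  // ...fan in: proceed only after every callback has arrived.
  console.log(await Promise.all(pending));
  // [ 'sub-a: ok', 'sub-b: ok', 'sub-c: ok' ]
})();
```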

There’s a template here showing how everything fits together if you want to give it a shot. I would like to know whether this approach works in contexts outside of what I have. If you try it, please send me some feedback.


Hey @hubschrauber. Why complicate things when you can go with the simple solution I suggested? Here’s how “parallel” processing works with v0 ordering.

Note how Code 2 starts together with Code 1 and they both run in parallel. Of course, “parallel” should not be taken literally here.

Compare the outcome with v1, where the 1st branch completes before the 2nd starts.

Here’s the demo workflow itself. You just need to switch between v0 and v1 ordering in the workflow settings and have the Dev Tools Console open.

First, I understand that there is a setting to switch to the “Legacy” (v0) execution order instead of the default “Recommended” (v1) option. Also, as I understand it, that changes the way the entire workflow behaves for all forks in the flow.

I may be reading between the lines regarding the reason for the forked path in the original question, where one path is “Process Info 1”, and the other is “Process Info 2.” I guess I assumed there could be an Info 3, Info 4, etc. in the form input, but that the example had been simplified to focus on whether the paths actually run at the same time. So, yes, switching to the old/previous execution order does fix that simplified case.

Regarding how literally to interpret the term “parallel”: I think it depends on the use case whether it means truly concurrent, or merely “during the same section of the overall flow, but single-threaded/one-at-a-time, without changing the execution order setting.” IMO, most people interpret “parallel” the first way: everything runs at once.

So, if you actually want an arbitrary number of sub-workflows to run at the same time, you need to do something different (and, unfortunately, more complicated). Your workflow demo/example (in v0 execution order mode) is fine as long as you know ahead of time exactly how many forks (logic branches) you need. You also have to be OK with the semi-hidden, non-default behavior change in the workflow settings.

Like I said before, the complexity is only warranted if you actually need things to run in parallel, AND you need to apply intentional logic regarding how many of, or when all-of, the sub-workflows have completed.

One example of where in-series wouldn’t work is if you needed at least 3 approvals from a (dynamically sized) list of “on-duty managers” to proceed. Sending the approval requests out one at a time would prolong, or likely completely block, the process. Without truly parallel operation in the workflow, you’d have to implement some kind of external service to handle the “first 3 of ?? approval responses” logic.
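As a sketch of what that “first k of n” completion logic has to do (plain JavaScript, with made-up manager names and timings; in n8n you’d need an external service or the sub-workflow/callback pattern to get the same effect):

```javascript
// Resolve as soon as k of the given promises have resolved;
// reject if so many fail that k successes become impossible.
function firstK(promises, k) {
  return new Promise((resolve, reject) => {
    const approvals = [];
    let failed = 0;
    for (const p of promises) {
      p.then((v) => {
        approvals.push(v);
        if (approvals.length === k) resolve(approvals);
      }).catch(() => {
        failed += 1;
        if (promises.length - failed < k) reject(new Error("not enough approvals"));
      });
    }
  });
}

// Five hypothetical approval requests that respond at different speeds.
const managers = ["ana", "bo", "chi", "dee", "ed"];
const requests = managers.map(
  (m, i) => new Promise((res) => setTimeout(() => res(m), (i + 1) * 50))
);

firstK(requests, 3).then((approved) => console.log("proceed with:", approved));
// Logs after the 3rd-fastest response, without waiting for the remaining two.
```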

Another example is making multiple calls out to a service that takes a significant amount of time to complete. The use case that inspired this approach for me was the creation of multiple compute resources (virtual machines), but not always the same count. Waiting for each one to be provisioned and configured (sometimes 10+ minutes) was unacceptable when they were requested one by one (time stacked). Nothing else we tried would make all of the “create vm” requests at once and still wait for all of them to finish before continuing.
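To illustrate the “time stacked” difference, here is a rough timing sketch in plain Node, where the hypothetical provision() stands in for a slow “create vm” call:

```javascript
// Stand-in for a slow provisioning call (real one: 10+ minutes).
const provision = (name) =>
  new Promise((res) => setTimeout(() => res(name), 200));

// Time-stacked: each request waits for the previous one to finish.
async function serial(names) {
  const out = [];
  for (const n of names) out.push(await provision(n));
  return out;
}

// Parallel: all requests in flight at once, then wait for all of them.
async function parallel(names) {
  return Promise.all(names.map((n) => provision(n)));
}

(async () => {
  let t = Date.now();
  await serial(["vm1", "vm2", "vm3"]);
  console.log("serial   ~", Date.now() - t, "ms"); // roughly 3 x 200ms
  t = Date.now();
  await parallel(["vm1", "vm2", "vm3"]);
  console.log("parallel ~", Date.now() - t, "ms"); // roughly 1 x 200ms
})();
```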

It would be really nice if n8n had something like a Join Group node and a corresponding option in the Wait node to Wait for Group but as far as I can tell, it does not yet have such a thing.

I also have some scenarios where changing to the v0 execution order would probably be a simpler approach. However, it does make me nervous that something marked “Legacy” next to something marked “Recommended” might disappear in an upcoming release.


Good, very good. Thanks for your explanation.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.