Parallel execution semantics

The idea is:

These days, all major languages support parallel execution.
Java, for example, has had this since Java 5, as I recall.
(Of course, behind the scenes CPU power is distributed between different cores and different threads, which makes it in reality more or less "really" parallel — but this does not matter to the user.)

I do believe it can matter for the user to have, in n8n, a construct to simulate such parallelism.
I believe this could be done with just two Nodes and two variables:

  • A splitting Node (from which parallel paths start simultaneously)
  • A merging Node (where parallel paths come together)
  • A boolean variable added to Nodes in a parallel path to signal whether or not to execute this Node atomically (though I would advocate defaulting this to "Yes", to avoid a Node execution being broken up).
  • A boolean variable at the Merge Node signaling whether to wait for all paths to come together before proceeding (i.e. is a simultaneous exit required? Sometimes all results need to come in before proceeding, not just partial results).
    What I'm not clear about yet is what to do on error (e.g. one Node keeps its thread in a wait state, blocking the entire setup → ignore, or throw an error, …?). Maybe this (and a timeout) could also be set in the Merge/Split Node.
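To make the two proposed variables concrete, here is a minimal TypeScript sketch (n8n itself runs on Node.js). The `split` and `merge` functions and the `waitForAll`/`timeoutMs` options are hypothetical names for illustration, not n8n's actual API:

```typescript
// Hypothetical sketch of the proposed Split/Merge semantics (not n8n's API).
// `split` starts every branch at once; `merge` either waits for all branches
// (waitForAll = true) or proceeds with the first result that arrives.

type Branch<T> = () => Promise<T>;

interface MergeOptions {
  waitForAll: boolean; // the boolean proposed for the Merge Node
  timeoutMs?: number;  // the timeout discussed for stuck branches
}

function split<T>(branches: Branch<T>[]): Promise<T>[] {
  // Invoking the branches is the fork: they all run concurrently from here.
  return branches.map((branch) => branch());
}

async function merge<T>(
  running: Promise<T>[],
  opts: MergeOptions
): Promise<PromiseSettledResult<Awaited<T>>[]> {
  const joined: Promise<PromiseSettledResult<Awaited<T>>[]> = opts.waitForAll
    ? Promise.allSettled(running) // simultaneous exit: all results collected
    : Promise.race(running).then((value) => [
        { status: "fulfilled" as const, value },
      ]); // a partial result is enough to proceed

  if (opts.timeoutMs === undefined) return joined;

  // Without a timeout, one stuck branch would block the whole setup forever.
  const timeout = new Promise<never>((_, reject) =>
    setTimeout(() => reject(new Error("merge timed out")), opts.timeoutMs)
  );
  return Promise.race([joined, timeout]);
}
```

With `waitForAll: false` the workflow can continue on the first branch that finishes, and with a `timeoutMs` the open question about a blocked thread becomes "throw after the timeout" rather than "hang forever".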

Use case:

Cf. the many questions on this here.

That’s very interesting.

Right now n8n has a fixed execution order for multi-branch workflows, based on the position each branch has on the canvas.

But you can also have parallel executions if you use sub-workflows.

When setting up your "Execute Workflow" node, you disable the option "Wait for Sub-workflow completion",

meaning that the sub-workflow will execute in parallel.
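In TypeScript terms, what that option toggles can be pictured as awaiting a call versus firing and forgetting it. The `callSubWorkflow` function below is a stand-in for illustration, not n8n's real API:

```typescript
// Stand-in for triggering a sub-workflow; NOT n8n's real API.
async function callSubWorkflow(name: string): Promise<string> {
  await new Promise<void>((resolve) => setTimeout(resolve, 10)); // simulated work
  return `${name} done`;
}

async function parentWorkflow(): Promise<string[]> {
  const log: string[] = [];

  // "Wait for Sub-workflow completion" enabled: the parent blocks
  // until the sub-workflow returns, and can use its result.
  const result = await callSubWorkflow("blocking-sub");
  log.push(result);

  // Option disabled: fire and forget. The parent moves on immediately
  // and never sees the sub-workflow's result.
  void callSubWorkflow("parallel-sub");
  log.push("parent continued without waiting");

  return log;
}
```

This buys concurrency between the parent and its sub-workflows, but the branches inside any single workflow still execute one after another.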

Yeah — I know this is what we currently have. But within a single workflow that is still serial execution, following a graphical criterion. The proposed approach would:

  • be parallel execution
  • not be based on graphical position (even if, for parallel execution too, such a visual order could be kept to repeatedly cycle through the threads)
  • leverage the underlying language concurrency semantics for its execution

Because, no matter how cool n8n is, this omission dates n8n a bit in our era of multi-core, multi-thread — and a universe in which even men multitask, haha.
