What I expected is that starting the workflow would run workflow 1 and workflow 2 in parallel.
But it actually runs them in sequential order, i.e. workflow 1 → workflow 2, which is kind of unexpected. What am I missing?
The quick answer to this would be that workflows, at the moment, only execute nodes in sequence - they won’t execute in parallel. In your example, “Execute Workflow 2” will only run once “Execute Workflow 1” finishes.
A workaround for this is to have those sub-workflows triggered by Webhook nodes (set to respond immediately). From the main workflow, you'd call the sub-workflows using the HTTP Request node, and that way you can trigger several sub-workflows in parallel.
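To illustrate the idea (not an official n8n API), here's a minimal sketch of firing several webhook-triggered sub-workflows at once. The URLs are placeholders, and `doFetch` is a stand-in you could replace with whatever HTTP client your environment provides:

```javascript
// Sketch: trigger several sub-workflows in parallel by POSTing to their
// Webhook URLs. With the webhooks set to "respond immediately", all
// sub-workflows start at roughly the same time.
async function triggerAll(urls, doFetch = fetch) {
  // Fire every request at once and wait until all responses arrive.
  return Promise.all(urls.map((url) => doFetch(url, { method: 'POST' })));
}
```

Inside n8n itself you'd normally model this with HTTP Request nodes fanning out from one node rather than a Code node, but the concurrency pattern is the same.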
@EmeraldHerald
Would it be possible to trigger sub-workflows from within a Code node?
Why: because I primarily want to execute a batch of sub-workflows in a Promise.all() manner.
Currently, the only two options seem to be:
Execute sub-workflows sequentially (trigger the next one when the last one is over)
Use the Webhook approach to execute a batch of sub-workflows (but then, if I want to execute the next batch, I'd need an ad-hoc Wait node [which causes further issues of its own], or I'd have to trigger all possible executions at once, which is usually not feasible)
Hence, i’m trying to figure out a way to trigger a batch of (subworkflow) executions, say 10 at a time, and then wait for all of them to finish before triggering the next batch.
Hi @shrey-42 While you can't trigger workflows with a Code node, it sounds like maybe something like Hookdeck could help you with the webhook trigger timing?
I have tried to add an option to a custom node to allow it to not wait for the sub-workflow before continuing in the main flow.
Sadly, I wasn't able to. Maybe it is possible, but we would need the help of someone who understands the core of n8n.
Hey @EmeraldHerald, if I'm not wrong, Hookdeck (similar to Convoy, which I already use) is an external tool that can't be operated natively within n8n. In that case, having to manage sub-workflow executions through yet another external tool really wouldn't be an ideal solution. (Note: we are already employing one level of workaround by using Webhook/HTTP instead of the Execute Workflow node.)
Also, I think the problem would still remain: the completion status of the entire batch ( Promise.all() ) of currently executing sub-workflows would still need to be saved and retrieved somehow.
PS: for now, I resort to breaking down the 'chain-of-workflows' into 'async sub-workflows', and use a datastore to manage their status/triggers.
Using RabbitMQ is also a good way to get things running in parallel.
Simply use a workflow that reads from the RabbitMQ queue and have it start the workflow whose ID was stored in the queue entry body.
So → insert the ID of the workflow to run, along with the data you need, into RabbitMQ.
Trigger with the RabbitMQ Trigger → it starts the stored workflow and passes the data into it.
The RabbitMQ Trigger can then be set to however many parallel executions you like.
Of course, you do need to store the result somewhere after a run completes, so you can continue a main flow, for example.
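As a small sketch of the queue entry described above: the message body just needs to carry the ID of the workflow to start plus the data to feed into it. The field names here are illustrative, not an n8n or RabbitMQ convention:

```javascript
// Producer side: build the queue entry body that gets published to RabbitMQ.
// The consuming workflow will read these fields back out.
function buildQueueEntry(workflowId, data) {
  return JSON.stringify({ workflowId, data });
}

// Consumer side: the RabbitMQ Trigger workflow parses the body and hands
// `workflowId` / `data` on (e.g. to an Execute Workflow node).
function parseQueueEntry(body) {
  const { workflowId, data } = JSON.parse(body);
  return { workflowId, data };
}
```

The actual publish/consume would go through the RabbitMQ nodes in n8n (or a client library such as amqplib outside it); only the body format is shown here.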