Support for Parallel Execution in AI Agent Workflows on n8n

This is a challenge in n8n. As an orchestration tool, it is probably designed on the assumption that whatever actions it calls will either complete quickly or offer an asynchronous callback option.

For the latter, there is an expression variable named $execution.resumeUrl whose value you can read at any point while the workflow is running. Grab it when you call a service that supports a callback URL and pass it along as that callback. Then, to make the workflow stop and listen for the callback, add a Wait node and set it to Resume: On Webhook Call.
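
To make that concrete, here's a rough sketch of the external service's side of the handshake. The /start-task endpoint, the payload shape, and the runLongTask helper are all invented for illustration; the real pieces are passing `{{ $execution.resumeUrl }}` as the callback URL and POSTing back to it when the work is done.

```typescript
// Hypothetical sketch of the external service side (Express, Node 18+ for
// global fetch). The /start-task path, payload shape, and runLongTask helper
// are invented; the real pieces are the callbackUrl filled in from
// {{ $execution.resumeUrl }} and the POST back to it once the work finishes.
import express from "express";

const app = express();
app.use(express.json());

app.post("/start-task", async (req, res) => {
  const { task, callbackUrl } = req.body;
  res.status(202).json({ accepted: true }); // acknowledge right away

  const result = await runLongTask(task); // stand-in for the real long-running work

  // Hitting the resume URL wakes the Wait node ("Resume: On Webhook Call")
  // and hands this body to the rest of the workflow.
  await fetch(callbackUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ task, result }),
  });
});

async function runLongTask(task: unknown): Promise<string> {
  return `done: ${JSON.stringify(task)}`;
}

app.listen(3000);
```

Whatever body the callback sends should show up as the Wait node's output, so downstream nodes can pick up the result from there.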

Add multiple, parallel callouts to the picture and it gets complicated quickly. Here's one approach to handling that, but even that has some issues related to the timing of the callbacks received.

The BEST way to do this sort of thing, IMO, is with an external service that can accept and track multiple concurrent tasks, wait for a specified “completion” condition, and then, only once, return control to the workflow. However, implementing that type of service might be more trouble than it’s worth.
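
If you do go that route, the service doesn't have to be elaborate. Here's a hedged sketch of the fan-in idea, again with invented endpoints and payloads (nothing below is an n8n API): the workflow registers a batch along with its resume URL before the Wait node, each parallel task reports back individually, and only when the whole batch is complete does the service call the resume URL, exactly once.

```typescript
// Hypothetical fan-in aggregator (Express, Node 18+). The workflow registers
// a batch with its $execution.resumeUrl before hitting the Wait node;
// parallel tasks report back one by one; only when the whole batch is in
// does the service call the resume URL, exactly once.
import express from "express";
import { randomUUID } from "crypto";

type Batch = { resumeUrl: string; expected: number; results: unknown[] };
const batches = new Map<string, Batch>();

const app = express();
app.use(express.json());

// Called once by the workflow, with how many parallel tasks it kicked off.
app.post("/batches", (req, res) => {
  const { resumeUrl, expected } = req.body;
  const id = randomUUID();
  batches.set(id, { resumeUrl, expected, results: [] });
  res.json({ batchId: id });
});

// Each parallel task (or its own callback) reports completion here.
app.post("/batches/:id/complete", async (req, res) => {
  const batch = batches.get(req.params.id);
  if (!batch) {
    res.status(404).end();
    return;
  }
  batch.results.push(req.body);
  res.status(202).end();

  // "Completion" condition: every expected task has reported in.
  if (batch.results.length >= batch.expected) {
    batches.delete(req.params.id);
    // Resume the waiting workflow once, with all the collected results.
    await fetch(batch.resumeUrl, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ results: batch.results }),
    });
  }
});

app.listen(3001);
```

The nice part of this shape is that the Wait node only ever sees a single resume call, which sidesteps the callback-timing issues mentioned above.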

Good luck finding something that suits your particular “parallel execution” scenario.
