How to have ChatGPT nodes run in parallel?

I want the ChatGPT nodes in my cold email workflow to write the batch of emails in parallel. I have a Code node sending each email in the batch individually to a separate output of a Switch node, which routes them to their own ChatGPT nodes. Unfortunately, they are processed sequentially, even though all the wires light up green at the same time coming out of the Switch node's ports. Is this simply not possible in n8n right now, or, hopefully, can someone tell me how to achieve parallelism?

Describe the problem/error/question

What is the error message (if any)?

Please share your workflow

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

The way it is progressing through the workflow is intentional.

There are two modes of execution:

  • executing each branch in full, branch by branch from the top (default), and
  • executing one node from each branch at a time, cycling through the branches sequentially.

Neither is parallel. You can achieve a degree of concurrency by using subworkflows, if the architecture of your workflow allows it. For that, you would fire off the subworkflows from Execute Workflow nodes with Wait For Sub-Workflow Completion disabled, so the parent workflow doesn't block on each one.
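As a rough illustration, an Execute Workflow node set up for fire-and-forget might serialize to something like the snippet below. This is a hedged sketch, not copied from a real export: the internal option key (`waitForSubWorkflow`), the `typeVersion`, and the `workflowId` placeholder are assumptions about how the node stores the Wait For Sub-Workflow Completion toggle — in practice you would just untick that option in the node's settings panel rather than edit JSON.

```json
{
  "name": "Write Email (sub)",
  "type": "n8n-nodes-base.executeWorkflow",
  "typeVersion": 1,
  "parameters": {
    "workflowId": "YOUR_SUBWORKFLOW_ID",
    "options": {
      "waitForSubWorkflow": false
    }
  },
  "position": [460, 300]
}
```

With the wait disabled, each item that reaches this node kicks off its own sub-workflow execution and the parent moves on immediately; the trade-off is that the parent can no longer use the sub-workflows' output downstream.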
