Problem: My workflow needs to produce a large number of pictures with DALL·E (up to 1000).
I was thinking about splitting it into batches and letting the batches be executed by different OpenAI accounts.
Is there any way or trick to have parallel executions?
If you have any other ideas on how to speed things up, let me know.
Hi @ManyQuestions, n8n runs nodes in sequential order, so using multiple nodes wouldn't help here; they would run one at a time, I'm afraid.
Processing a large number of items will also require a large amount of memory in n8n, so rather than running all of these operations at once, processing smaller batches in sub-workflows seems like a more feasible approach. You might want to read Memory-related errors | n8n Docs for more information on the memory topic.
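To illustrate the batching arithmetic, here is a minimal Python sketch (outside n8n, purely for illustration) of splitting 1000 items into smaller batches, similar to what n8n's Split in Batches node does before handing each batch to a sub-workflow. The batch size of 50 and the placeholder prompts are arbitrary assumptions, not values from this thread:

```python
def split_into_batches(items, batch_size):
    """Partition a list into consecutive batches of at most batch_size items."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

# Stand-in for your 1000 image prompts (hypothetical data).
prompts = [f"prompt {n}" for n in range(1000)]
batches = split_into_batches(prompts, 50)

print(len(batches))     # number of sub-workflow executions needed -> 20
print(len(batches[0]))  # items handled per execution -> 50
```

Each sub-workflow then only holds one batch in memory at a time, which is where the memory saving comes from.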
Hi again @MutedJam. Thanks for the answer. So would this look something like that? If yes, I don't understand the advantage of running things in sub-workflows. Or are they just for the memory issues and not for speeding things up?
Oh yes, that’s what I meant. They can help with reducing memory consumption as shown over here for example.
If you really want to start workflow executions in parallel, you can use the HTTP Request node (in your main workflow) to start other workflows that use the Webhook trigger.
Uh, yes, that sounds really interesting. How do I do that?
This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.