I’ve created a simple loop used to paginate a GraphQL API call.
My challenge is that the subsequent nodes will run each time the loop runs instead of only running when the loop is done.
Any idea how to restructure to achieve this result instead?
Information on n8n setup:
n8n version: 0.191.1
Running n8n via Docker
You will need to check whether the SplitInBatches node is done processing with an IF node, and then either follow the loop back or continue with the workflow. The SplitInBatches node has a context field that you can check in the IF node.
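As a rough sketch (assuming your loop node is named "SplitInBatches" — adjust to your node's actual name), the IF node can use a Boolean condition with an expression like this, which is true once the node has handed out its last batch:

```
{{ $node["SplitInBatches"].context["noItemsLeft"] }}
```

Route the false output back into SplitInBatches to continue the loop, and the true output onward in the workflow.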
There is also a Community node that makes your life easier:
It offers the same functionality as the SplitInBatches node, with an added Done output path to continue on after the loop is finished, including an option to combine run items.
You can add an IF node between Get Order IDs and Data that checks if the Split in Batches node is done. You didn’t share your workflow, so I have roughly mocked it up for you below; the IF node has everything you need in it.
Thanks for the reply, Jon!
I didn’t quite manage to get this to work, maybe you can point out what I’m missing here.
What happens is:
SplitInBatches delivers 2 items
the GraphQL node makes a POST request for each of the two items
the IF node checks if SplitInBatches is done
if FALSE, SplitInBatches delivers another 2 items
if TRUE, SplitInBatches delivers the last 2 items on the TRUE branch
What I need is that when SplitInBatches is done, the combined output from all delivered items is passed to a node that can then process the entire output once.
Thanks for your patience
You would need to use a Function node to merge all the items, so something like this… This will merge all the items from the GraphQL node; if you need more data it will need a different function.
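Here is a runnable sketch of the merge logic. In n8n, a Function node can call the built-in `$items(nodeName, outputIndex, runIndex)` to fetch the items a node produced on a given run; it throws once `runIndex` is past the last run, which ends the loop. The node name "GraphQL" and the mocked `$items` (three fake runs of two items each, so the snippet runs outside n8n) are assumptions — inside a real Function node, delete the mock and use the built-in:

```javascript
// --- Mock of n8n's $items built-in, for running this sketch outside n8n. ---
// Delete this block inside a real Function node; n8n provides $items itself.
const fakeRuns = [
  [{ json: { id: 1 } }, { json: { id: 2 } }],
  [{ json: { id: 3 } }, { json: { id: 4 } }],
  [{ json: { id: 5 } }, { json: { id: 6 } }],
];
const $items = (nodeName, outputIndex, runIndex) => {
  if (runIndex >= fakeRuns.length) throw new Error("no such run");
  return fakeRuns[runIndex];
};

// --- Function node body: collect items from every run of the upstream node. ---
const allData = [];
let runIndex = 0;
while (true) {
  try {
    // "GraphQL" is a placeholder — use your upstream node's exact name.
    allData.push(...$items("GraphQL", 0, runIndex).map((item) => item.json));
    runIndex++;
  } catch (e) {
    break; // past the last run — stop collecting
  }
}
const result = [{ json: { allData } }];
console.log(JSON.stringify(result));
```

Place this in a Function node on the IF node's done branch and downstream nodes will receive a single item whose `allData` array holds every item from every loop iteration.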
Thanks a bunch Jon, that worked wonders!
Ran into an issue with the Execute Workflow node, which just keeps running and never executes the next workflow, even though each workflow runs fine when I run it manually.
Will make a dedicated post.