Loop Over Items can't handle multiple incoming lists?

Should we convert this thread into an issue?

Hi @mriioos, welcome!

I’m listening :saluting_face:
I just saw you mention this topic in another category, so I’ll reply here.

Yes, I’m able to reproduce this,
but IMO this is an incorrect design, especially when working with loop nodes:

You’ve already closed the first loop, so you should link the next node to the done branch, like this:

IMO when working with loops, you need to be fully aware of how they work and behave, and design your workflow accordingly.

First of all, thank you very much for the response, but I still have a couple of questions:

About nesting loops:
Doesn’t this bloat the memory if the first loop produces large files? How would I process large files if I also require a loop to process them and I can’t use it inside the first loop?

I had to use this subworkflow-nesting trick in other workflows when I ran into this nested-loop behaviour (I found the subworkflow trick suggested online as a workaround).

About subworkflow behaviour:
Shouldn’t the behaviour of a subworkflow execution be exactly the same as not using the subworkflow at all? At least in terms of item processing. Similar to how a function would behave in a code flow. Isn’t that the point of subworkflows?

About my use case:
I get your proposed solution, but I don’t see how to apply it to my use case, where I use a default loop (with no Loop Over Items node to produce the batches):

In this case, I was forced not to use a Loop Over Items node because I needed to feed the result of ‘Split out directories’ back into the loop, to read the directories recursively. If I am not mistaken, this is not possible with the Loop Over Items node.

As a note: my workflow is meant to list all directories, recursively, and process all the files inside; that is why the result of ‘Split out directories’ must be fed back into the loop (where the stop criterion is that no more directories are left).
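The recursion I mean could be sketched like this (assuming a hypothetical async `listDir(path)` that returns the files and sub-directories of one directory; in the real workflow that role is played by the remote-listing node):

```javascript
// Sketch of the recursion the workflow emulates. `listDir` is a
// hypothetical async function returning { files, directories } for
// one path -- it stands in for the remote directory-listing step.
async function crawlAll(listDir, root) {
  const allFiles = [];
  const pending = [root];          // directories still to visit
  while (pending.length > 0) {     // stop criterion: no directories left
    const dir = pending.pop();
    const { files, directories } = await listDir(dir);
    allFiles.push(...files);       // collect files found at this level
    pending.push(...directories);  // feed sub-directories back into the loop
  }
  return allFiles;
}
```

The `pending` array is exactly the feedback edge in my workflow: each pass may add new directories, and the loop only ends once it is empty.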

I tried to explain myself as best I can, but I might have failed in some places, as English is not my mother language. If anything is not clear, feel free to ask.

Again, thank you very much.

I have made a second example that illustrates the isolated issue with Loop Over Items in my use case:

In this example, you can see that Loop Over Items can’t be used because it doesn’t accept a feedback loop.

What about using a loop here:

Isn’t this what you’re looking for?


Not really. My goal is to extract all ‘leaf items’ (items that have no subitems) and then loop over each of them, one by one, afterwards. In this case that means I must process the true branch of the if statement.

We can think of this the other way around: how would I extract all leaf items from the pinned data? (With the condition that the workflow must use recursion; as in the original use case, an async recursion chain is the only way to crawl all subdirectories.)

Anyway, I still don’t understand why logic encapsulated inside a sub-workflow doesn’t behave the same way as not encapsulating it at all (at least in the logical aspect of it).


Hmm, I think at this point you shouldn’t use loop nodes.

It’s not that they can’t do it (I still believe they can, it just requires some over-engineering), but the logic is getting a bit complex with all the conditions, feedbacks, nesting, and aggregation.

I think I’d go for JS or Python code; it’s much easier and more appropriate here.

I rarely use Code nodes, as I prefer visual nodes I can see right in front of me, but here I’d definitely use one to manipulate and extract everything I need from the JSON.
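For example, something along these lines could collect the leaf items for you (the `children` field name is just an assumption, adapt it to whatever shape your JSON actually has):

```javascript
// Sketch: collect every "leaf" entry (one with no sub-items) from a
// nested listing. The `children` field name is an assumption -- rename
// it to match the JSON your workflow produces.
function collectLeaves(node, leaves = []) {
  const children = node.children || [];
  if (children.length === 0) {
    leaves.push(node);                              // no sub-items: a leaf
  } else {
    children.forEach(child => collectLeaves(child, leaves));
  }
  return leaves;
}

// Hypothetical example shape:
const tree = {
  name: 'root',
  children: [
    { name: 'docs', children: [{ name: 'readme.txt', children: [] }] },
    { name: 'empty-dir', children: [] },
  ],
};

console.log(collectLeaves(tree).map(n => n.name));
// -> [ 'readme.txt', 'empty-dir' ]
```

In a Code node you could then return the leaves as items (wrapping each one as `{ json }`) and loop over them afterwards.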


Yeah, the only problem is the naturally async behaviour of recursively accessing a remote directory. I am not sure how this could be done. I feel like Loop Over Items nodes should not work like this, but I guess that is just an opinion. I surrender; I may just use my workaround and pray haha.

Anyway, thanks for the help, both to you and @Wouter_Nigrini.
