Loop really needs two separate inputs

The idea is:

Loop should have two inputs: one for the array of data to be batched, and one for the trigger to process the next batch. Having both be the same input causes confusion and problems.
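For illustration, here is a minimal sketch of that idea in plain JavaScript. The `Batcher` class and its `load`/`next` names are hypothetical, not part of n8n; the point is only that loading data and advancing the loop are separate operations, so new data can never be mistaken for a "next batch" trigger.

```javascript
// Hypothetical sketch: a batcher whose "load" input and "next batch"
// trigger are separate operations.
class Batcher {
  constructor(batchSize) {
    this.batchSize = batchSize;
    this.queue = [];
  }

  // Input 1: push a fresh array of items into the queue.
  load(items) {
    this.queue.push(...items);
  }

  // Input 2: trigger the next batch; null means the "done" output.
  next() {
    if (this.queue.length === 0) return null;     // "done" branch
    return this.queue.splice(0, this.batchSize);  // "loop" branch
  }
}

const b = new Batcher(20);
b.load([...Array(50).keys()]); // 50 records
let batches = 0;
while (b.next() !== null) batches++;
console.log(batches); // prints 3 (20 + 20 + 10)
```

With this separation, a second `load` call simply grows the queue instead of being misread as a signal that the loop is finished.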

My use case:

This flow should “pump” a batch of 20 records at a time through a process that then operates on them one at a time. But because Loop only has one input, when it receives the second batch from the Airtable module it automatically sends it to the “done” output. I’m not sure whether that’s the expected behavior, but it’s definitely not what I want.

More broadly, it’s impossible to have a loop within a loop unless the input and the next trigger are separated.

I think it would be beneficial to add this because:

It will make flows visually easier to understand and also make it possible to have loops within loops.

hello @Lee_S

Can’t test it with Airtable, but with a Code node everything works.


Here is an example where I tried to have a loop within a loop. I have a few dozen large CSV files that are cut up into smaller chunks for processing. This is the “Cleanup” workflow.

1) List the files that are ready to be cleaned up
2) List all of the chunks that go with that file
3) Delete those chunks from S3 and from the Postgres chunks table
4) Move the original file to an archive folder and delete it from the Postgres files table
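The steps above amount to a nested iteration. As a plain-JavaScript sketch of the intended logic (every helper name here is a hypothetical placeholder for the real S3/Postgres operations, passed in so the sketch stays self-contained):

```javascript
// Hypothetical sketch of the cleanup flow as plain nested loops.
// listReadyFiles, listChunks, deleteChunk and archiveFile stand in
// for the real S3/Postgres operations.
async function cleanup({ listReadyFiles, listChunks, deleteChunk, archiveFile }) {
  const files = await listReadyFiles();      // step 1
  for (const file of files) {                // outer loop: files
    const chunks = await listChunks(file);   // step 2
    for (const chunk of chunks) {            // inner loop: chunks
      await deleteChunk(chunk);              // step 3
    }
    await archiveFile(file);                 // step 4
  }
  return files.length;
}
```

In plain code the inner loop trivially restarts for every file; the question in this thread is why the same shape is hard to express with two Loop nodes.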

The outer (furthest left) loop iterates over the files; the inner (furthest right) loop iterates over the chunks.

**The problem here is exactly what I ran into in my original post.** That is, once the inner loop thinks it’s “done”, then when the next batch of input arrives (i.e. the next list of chunks to delete), it just outputs those items to the “done” branch instead of processing them on the loop branch.

As you can see in my screenshots, the items from run 1 went to the loop branch; later runs all went to the done branch. I’m not sure whether the totals in these 7 runs are cumulative or whether I just happen to be processing the files in order from smallest to largest, but the average file has 800-1600 chunks, so it’s probably cumulative.

Lee

Well… Nested loops can behave weirdly and might not work at all. Better to divide the workflow into smaller ones and call the sub-workflow inside the loop. Your sub-workflow may have its own Loop node, and that is totally fine.
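The reason the sub-workflow pattern works can be sketched like this: each sub-workflow execution gets its own fresh loop state instead of sharing indexes with the parent loop. A rough JavaScript illustration (all names here are mine, not n8n's):

```javascript
// Sketch of the "sub-workflow per outer item" pattern: each call to
// processFile is an independent execution, so its inner loop starts
// with fresh state instead of sharing a run index with the parent.
function makeLoop(items, batchSize) {
  let cursor = 0; // loop state local to this execution
  return () => {
    if (cursor >= items.length) return null; // "done"
    const batch = items.slice(cursor, cursor + batchSize);
    cursor += batchSize;
    return batch;
  };
}

// "Sub-workflow": owns its own loop over chunks.
function processFile(chunks) {
  const nextBatch = makeLoop(chunks, 2);
  let processed = 0;
  for (let b = nextBatch(); b !== null; b = nextBatch()) processed += b.length;
  return processed;
}

// "Parent workflow": loops over files, calling the sub-workflow.
const files = [[1, 2, 3], [4, 5, 6, 7, 8]];
console.log(files.map(processFile)); // prints [ 3, 5 ]
```

Because `cursor` is created anew on every `processFile` call, the inner loop can never "remember" that a previous file's chunks were already finished.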

> Better to divide the workflow into smaller ones and call the sub-workflow inside the loop.

OR, give the loop module two separate inputs so it doesn’t have to figure out whether you’re sending new data to go into the queue or triggering the next batch to pull from the queue. :slight_smile:

That one wouldn’t work with nested Loops because of how the Loop node works :slight_smile:

It modifies the $runIndex of the input along with $inputIndex. When there is a nested loop, the $runIndex has already been modified by the outer loop, so the inner loop may think it has already finished its work. And if the inner loop modifies it in turn, the outer loop will mess up the items.
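A deliberately simplified model of that failure mode (this is not n8n's actual implementation, just an illustration of a loop deciding "loop" vs "done" from one shared run index):

```javascript
// Simplified model (NOT the real n8n code) of a loop node that picks
// its branch and batch purely from a single shared run index.
function loopNode(state, items, batchSize) {
  const start = state.runIndex * batchSize;
  if (start >= items.length) return { branch: 'done', batch: [] };
  state.runIndex += 1;
  return { branch: 'loop', batch: items.slice(start, start + batchSize) };
}

// Outer and inner loops sharing one run index:
const shared = { runIndex: 0 };
const outer = loopNode(shared, ['fileA', 'fileB'], 1); // emits fileA, index -> 1
const inner = loopNode(shared, ['c1', 'c2'], 1);       // sees index 1, skips c1!
console.log(outer.batch, inner.batch); // prints [ 'fileA' ] [ 'c2' ]
```

Because the outer loop already advanced the shared index, the inner loop starts mid-list (and would report "done" early on short lists) — the same symptom described above.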

It’s not an issue with the input; it’s an issue with the Loop logic itself. And I think it would be too difficult to change, as there are only two types of indexes: InputIndex and RunIndex. So how would a nested loop be described? As SubRunIndexY of RunIndexX? :slight_smile:

Maybe @Jon can add more details to the Loop logic