I’m having an odd problem that I have tried to figure out, and I’m not sure if I have just stared at it so long that I can’t see something obvious. On some runs, the data I need gets transferred into BigQuery twice. It doesn’t happen every time. I assumed it was something to do with a retry somewhere, but I am fairly sure I have turned them all off. I have two flows: a main one and a sub-flow that cycles through the list produced by the first flow. The list looks correct, with one item per line and no duplicates, and when I run the sub-flow manually it runs smoothly every time.
Unfortunately I am not able to reproduce the problem based on the data you have provided. What does the execution data of your sub-workflow look like for an affected dataset? Are there duplicate items ending up at one of your BigQuery nodes?
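If it helps to confirm which side the duplication happens on, here is a minimal sketch of checking the destination table itself for duplicate rows. The table name `your-project.your_dataset.your_table` and the key column `id` are placeholders, not anything from your setup, so substitute your actual table and whichever column(s) should be unique:

```python
# Sketch: list keys that appear more than once in the destination table.
# Assumes the google-cloud-bigquery package and default credentials;
# the table and column names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT id, COUNT(*) AS copies
    FROM `your-project.your_dataset.your_table`
    GROUP BY id
    HAVING COUNT(*) > 1
"""

for row in client.query(query).result():
    print(f"id={row.id} appears {row.copies} times")
```

If that returns rows, the duplicates really are landing in BigQuery, and the next step would be to open the execution data of the run that wrote them and see how many items reached the BigQuery node.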