Duplication of Data Upload

Hi Everyone,

I’m having an odd problem that I have tried to figure out, and I’m not sure whether I have just looked at it so much that I can’t see something obvious. On some runs, the data I need is transferred into BigQuery twice. It doesn’t happen every time. I assumed it had something to do with a retry somewhere, but I am fairly sure I have turned them all off. I have two flows: one main flow, and a sub-flow that cycles through the list from the first flow. The list is showing correctly, with one item per line and no duplicates, and when I run the sub-flow manually, it runs smoothly every time.
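As a side note for anyone hitting intermittent double-inserts: if the upload uses BigQuery's streaming insert API, you can make retried uploads deduplicate (best effort) by sending a stable `insertId` per row. A minimal sketch, assuming row content itself can serve as the identity (the function name and the example rows are illustrative, not from the workflow above):

```python
import hashlib
import json

def row_insert_id(row):
    """Derive a stable ID from row content, so the same row retried
    twice produces the same insertId and BigQuery can deduplicate it."""
    payload = json.dumps(row, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

rows = [
    {"name": "alpha", "value": 1},
    {"name": "beta", "value": 2},
]
ids = [row_insert_id(r) for r in rows]

# The same row always yields the same ID, regardless of key order.
assert row_insert_id({"value": 1, "name": "alpha"}) == ids[0]
```

With the `google-cloud-bigquery` client these IDs can be passed as the `row_ids` argument to `insert_rows_json`; note that BigQuery's `insertId` deduplication is best-effort, not a guarantee.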

Any help would be greatly appreciated!

This is the main flow

And this is the sub-flow:

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:
  • n8n version: 1.20.0
  • Database (default: SQLite): PostgresDB
  • n8n EXECUTIONS_PROCESS setting (default: own, main): Queue
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker on Railway

Hi @Damien_Kelly, I am sorry you’re having trouble.

Unfortunately, I am not able to reproduce the problem based on the data you have provided. What does the execution data of your sub-workflow look like for an affected dataset? Are there duplicate items ending up at one of your BigQuery nodes?
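To answer that last question, it can help to count items by a stable key at the node just before the BigQuery upload. A small sketch of the idea (the `id` field and the sample batch are hypothetical; substitute whatever uniquely identifies your items):

```python
from collections import Counter

def find_duplicates(items, key="id"):
    """Return the key values that appear more than once in a batch."""
    counts = Counter(item[key] for item in items)
    return [k for k, n in counts.items() if n > 1]

# A sample batch shaped like n8n item JSON (made-up data)
batch = [{"id": "a"}, {"id": "b"}, {"id": "a"}]
print(find_duplicates(batch))  # -> ['a']
```

If this reports duplicates already at the input of the BigQuery node, the doubling happens upstream in the workflow; if the input is clean but the table still gets double rows, the node (or a retry around it) is running twice.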

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.