Clean Duplicates Database Not Working

Problem

I’m using a webhook processor with queues, with the worker node replicated 4 times.
I’m using the Remove Duplicates node. It has run out of space, so now it just returns this message:

[
  {
    "error": "The number of items to be processed exceeds the maximum history size. Please increase the history size or reduce the number of items to be processed."
  }
]

However, when I go to clean the database, nothing happens. There are no output values, and I continue to get that error.

How do I clear the database?



instance information

Debug info

core

  • n8nVersion: 1.95.3
  • platform: docker (self-hosted)
  • nodeJsVersion: 20.19.2
  • database: postgres
  • executionMode: scaling (single-main)
  • concurrency: -1
  • license: enterprise (production)

storage

  • success: all
  • error: all
  • progress: false
  • manual: true
  • binaryMode: memory

pruning

  • enabled: true
  • maxAge: 336 hours
  • maxCount: 10000 executions

client

  • userAgent: mozilla/5.0 (macintosh; intel mac os x 10_15_7) applewebkit/537.36 (khtml, like gecko) chrome/137.0.0.0 safari/537.36
  • isTouchDevice: false

Generated at: 2025-06-13T03:54:20.376Z

Clear Deduplication History with the Clean Database mode should delete all history associated with the selected scope (node or workflow). The default scope is Node, which means it cleans only the database for that instance of Remove Duplicates. If you have multiple workers running the same node on different instances, each maintains its own history. In a scaling environment (multiple replicas), each replica has an independent history, so as long as the other replicas don’t run the node in clear mode, their histories will remain intact.

If you want shared history between replicas, change the scope to Workflow. If you want each replica to be independent, keep Scope: Node.

With Workflow scope: place a node in Clear Deduplication History – Clean Database mode and run it once; this deletes the shared history. With Node scope: you must run this option on each replica, so that every one of them cleans its local database.
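Conceptually, the deduplication history behaves like a keyed store: Node scope keys the history by workflow *and* node, while Workflow scope shares one key, so clearing one node’s history never touches another’s. A minimal TypeScript sketch of that model (illustrative names only, not n8n’s actual API):

```typescript
// Illustrative model of scoped deduplication history (not n8n's real API).
// Each scope maps to its own key, so clearing one key leaves others intact.
type Scope = "node" | "workflow";

class DedupStore {
  private histories = new Map<string, Set<string>>();

  private key(workflowId: string, nodeId: string, scope: Scope): string {
    // Node scope keys include the node id; Workflow scope is shared.
    return scope === "node" ? `${workflowId}:${nodeId}` : workflowId;
  }

  // Returns true if the value is new, false if it is a duplicate.
  record(workflowId: string, nodeId: string, scope: Scope, value: string): boolean {
    const k = this.key(workflowId, nodeId, scope);
    if (!this.histories.has(k)) this.histories.set(k, new Set());
    const seen = this.histories.get(k)!;
    if (seen.has(value)) return false; // duplicate
    seen.add(value);
    return true; // new item
  }

  // "Clean Database": drop only the history behind the selected key.
  clear(workflowId: string, nodeId: string, scope: Scope): void {
    this.histories.delete(this.key(workflowId, nodeId, scope));
  }
}

// With Node scope, clearing one node's history leaves the other untouched:
const store = new DedupStore();
store.record("wf1", "nodeA", "node", "item-1");
store.record("wf1", "nodeB", "node", "item-1"); // separate history, still "new"
store.clear("wf1", "nodeA", "node");
// nodeB still remembers "item-1"; only nodeA's history was wiped.
```

This is why, with Node scope, running the clear operation once is not enough when several replicas each hold their own history.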

If you need to handle more than 10,000 items, increase the History Size parameter on your node: Remove Duplicates → Operation: Remove Items Processed in Previous Executions → Keep Items Where: Value Is New → History Size: (for example) 50,000.
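As a rough model of why the error appears: the node refuses a batch larger than its configured history. A sketch of that guard (illustrative only, not n8n’s source code):

```typescript
// Illustrative guard reproducing the error from the question
// (a sketch of the limit check, not n8n's actual implementation).
function assertWithinHistorySize(itemCount: number, historySize: number): void {
  if (itemCount > historySize) {
    throw new Error(
      "The number of items to be processed exceeds the maximum history size. " +
        "Please increase the history size or reduce the number of items to be processed.",
    );
  }
}

// With a 10,000-item history, an 18,000-item run would throw the error above;
// raising History Size to 50,000 lets the same run through.
assertWithinHistorySize(9000, 10000); // ok
assertWithinHistorySize(18000, 50000); // ok after increasing History Size
```

So either the incoming batch must shrink or History Size must grow; clearing the history alone does not change this limit.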

If you submit many items at once, consider splitting the input into smaller batches (using SplitInBatches), processing each batch, and then cleaning up, to avoid memory or limit errors.
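The batching idea can be sketched outside n8n as a simple chunking helper (illustrative, not the SplitInBatches implementation):

```typescript
// Split a large item array into fixed-size batches so each pass through
// the dedup node stays under the configured history size.
function splitInBatches<T>(items: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// e.g. 25,000 items with a 10,000-item history size -> 3 batches
const batches = splitInBatches(Array.from({ length: 25000 }, (_, i) => i), 10000);
// batches[0] and batches[1] hold 10,000 items each; batches[2] holds 5,000.
```

In a workflow this corresponds to looping the batches through the Remove Duplicates node one chunk at a time instead of submitting everything in a single execution.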

But I still cannot clear the databases of stored values. Nothing happens.