Duplicate Keys from PG node

I have a Postgres Insert node that inserts data into a staging table - this works 100%. This node is followed by a Postgres Execute Query node that executes the following SQL:
INSERT INTO productline SELECT DISTINCT productline_id, title FROM _raw_catalogue;

But it seems that the second node is executed once for every record inserted by the first node, causing “duplicate keys” exceptions. Is there another way to do this type of workflow?

For now, I have got around this by using the following SQL:
INSERT INTO productline SELECT DISTINCT productline_id, title FROM _raw_catalogue ON CONFLICT (id) DO NOTHING;

But this feels like a bit of a hack?

Ah yes, nodes normally get executed once per item. In the future it is planned to add a toggle on the nodes to make it possible for them to execute only once.

Anyway, for now the easiest thing would be to remove all but the first item with a Function node, so that the following node runs only once. The code on it would look like this:

return [items[0]];
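If you would rather deduplicate the items themselves instead of collapsing everything to the first item, a Function node could also filter out repeated keys before the insert. A minimal sketch, assuming each item carries a `json.productline_id` field (the field name here is an assumption based on the SQL above):

```javascript
// Sketch: keep only the first item for each productline_id.
// In an n8n Function node, `items` is provided by the runtime;
// here it is mocked so the snippet is self-contained.
function dedupeByProductline(items) {
  const seen = new Set();
  return items.filter(({ json }) => {
    if (seen.has(json.productline_id)) return false; // drop duplicates
    seen.add(json.productline_id);
    return true;
  });
}

// Mock input resembling the staging-table rows (assumed shape).
const items = [
  { json: { productline_id: 1, title: "Widgets" } },
  { json: { productline_id: 1, title: "Widgets" } },
  { json: { productline_id: 2, title: "Gadgets" } },
];

// Inside the Function node this would simply be:
//   return dedupeByProductline(items);
console.log(dedupeByProductline(items).length); // → 2
```

This keeps one item per key, which may be useful if later nodes still need the per-row data, whereas `return [items[0]];` is the simpler choice when the next node should run exactly once.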

Here is an example workflow which demonstrates this:

{
  "nodes": [
    {
      "parameters": {
        "functionCode": "return [\n  {\n    json: {\n      id: 1,\n    }\n  },\n  {\n    json: {\n      id: 2,\n    }\n  }\n];"
      },
      "name": "Mock DB-Data",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        400,
        300
      ]
    },
    {
      "parameters": {
        "functionCode": "return [items[0]];"
      },
      "name": "Only First Item",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        550,
        300
      ]
    }
  ],
  "connections": {
    "Mock DB-Data": {
      "main": [
        [
          {
            "node": "Only First Item",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}