When Using Split in Batches Duplicate occurs

Hi everyone, I’m running into an issue with one of my n8n workflows and I’m not sure if it’s a logic mistake on my side or something about how the nodes execute.
I’m pulling data from an API, looping through the results using Split In Batches, and then inserting them into my database. The problem is that sometimes the workflow creates duplicate records even though I thought my logic would prevent that.

Describe the problem/error/question

Inside my Function node I’m trying to map the API response before sending it to the database:

const items = $input.all();
for (let i = 0; i <= items.length; i++) {
  return [{
    json: {
      user_id: items[i].json.id,
      email: items[i].json.email,
      created_at: new Date()
    }
  }];
}
In my Postgres node:
INSERT INTO users (user_id, email, created_at)
VALUES ({{$json.user_id}}, {{$json.email}}, {{$json.created_at}})
ON CONFLICT (user_id) DO UPDATE
SET email = {{$json.email}};

What is the error message (if any)?

• Sometimes records get inserted twice
• Sometimes the workflow stops after one batch
• And occasionally I get an error like “Cannot read property ‘json’ of undefined”

I suspect I might be doing something wrong in the Function node loop, but I’m not completely sure if that’s the cause or if it has something to do with Split In Batches behavior.

Please share your workflow

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @Keira_Becky

I think the issue is coming from the Function node code, not really from Split In Batches.

One thing I noticed is the loop condition:

for (let i = 0; i <= items.length; i++)

Using <= will make the loop go one step past the array, so when i equals items.length, items[i] becomes undefined. That’s probably why you sometimes see the error “Cannot read property ‘json’ of undefined”.
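You can see the off-by-one in isolation with a tiny standalone sketch (plain Node.js, made-up data, nothing n8n-specific):

```javascript
// Sketch: items has 2 elements, but <= lets i reach 2,
// and items[2] is undefined.
const items = [{ json: { id: 1 } }, { json: { id: 2 } }];

const seen = [];
for (let i = 0; i <= items.length; i++) {
  seen.push(items[i]); // last iteration pushes undefined
}

console.log(seen.length); // 3, not 2
console.log(seen[2]);     // undefined
// Accessing seen[2].json here would throw the error from the question.
```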

Another thing is the return inside the loop:

for (let i = 0; i <= items.length; i++) {
  return [{

Once JavaScript hits return, the function stops immediately, so it only processes the first item, which could explain why your workflow sometimes stops after one batch.
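A quick standalone sketch of that early-return behavior (hypothetical data, not your actual workflow items):

```javascript
// Sketch: a return inside the loop exits after the first iteration.
const items = [{ json: { id: 1 } }, { json: { id: 2 } }, { json: { id: 3 } }];

function mapWithEarlyReturn() {
  for (let i = 0; i < items.length; i++) {
    return [{ json: { user_id: items[i].json.id } }]; // exits when i === 0
  }
}

console.log(mapWithEarlyReturn().length); // 1 — only the first item survives
```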

Try this instead:

const items = $input.all();
const results = [];

for (let i = 0; i < items.length; i++) {
  results.push({
    json: {
      user_id: items[i].json.id,
      email: items[i].json.email,
      created_at: new Date().toISOString()
    }
  });
}

return results;

Also double-check that user_id actually has a UNIQUE constraint in your database; otherwise duplicates can still happen even with ON CONFLICT.

2 Likes

Hi, welcome! The main issue is that your return is inside the loop, so it exits on the very first item every time; that's why you get duplicates and it stops after one batch. Replace the whole loop with this:

return $input.all().map(item => ({
  json: {
    user_id: item.json.id,
    email: item.json.email,
    created_at: new Date()
  }
}));

That should fix all three problems at once.

Hey! I can see a couple of issues here that are likely causing your duplicates.

Bug in your Function node loop:

for (let i = 0; i <= items.length; i++)  // ❌ <= causes off-by-one

Should be i < items.length (not <=). The extra iteration reads items[items.length] which is undefined, causing the “Cannot read property ‘json’ of undefined” error you’re seeing. When it crashes mid-loop, n8n may retry the batch — which is where your duplicates come from.

Also, your Function node is returning inside the loop (only first item gets returned). Should be:

const items = $input.all();
return items.map(item => ({
  json: {
    user_id: item.json.id,
    email: item.json.email,
    created_at: new Date()
  }
}));

The bigger issue: Split in Batches re-runs all downstream nodes for each batch. If your workflow errors partway through (from that undefined crash), n8n may re-execute from the top depending on your error handling settings — running some records twice.

Fix checklist:

  1. Fix the loop (use map instead)
  2. Your ON CONFLICT DO UPDATE in Postgres is actually already your safety net — if the upsert is working, duplicates shouldn’t persist. Check if user_id actually has a UNIQUE constraint on it
  3. Consider using n8n’s native “Edit Fields” node instead of a Function node for simple field mapping — less surface area for bugs
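As an extra belt-and-braces step, you could also drop duplicates within a batch before the insert. A minimal sketch (plain JavaScript, assuming items shaped like your Function-node output; `dedupeByUserId` is a made-up helper name):

```javascript
// Sketch: keep only the first occurrence of each user_id in a batch.
function dedupeByUserId(items) {
  const seen = new Set();
  return items.filter(item => {
    if (seen.has(item.json.user_id)) return false;
    seen.add(item.json.user_id);
    return true;
  });
}

const batch = [
  { json: { user_id: 1, email: "a@example.com" } },
  { json: { user_id: 1, email: "a@example.com" } }, // duplicate
  { json: { user_id: 2, email: "b@example.com" } },
];
console.log(dedupeByUserId(batch).length); // 2
```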

Should clear it up!

1 Like

Hey @Keira_Becky, 2 things on top of what the others said about the loop:
Your raw SQL is interpolating string values without quotes, so {{$json.email}} will cause syntax errors on most real data. Skip the Execute Query approach entirely and switch your Postgres node to the built-in Upsert operation with user_id as the conflict column.
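To illustrate why raw interpolation breaks (a plain JavaScript sketch, not the n8n expression engine; the email value is made up), any apostrophe in the data produces invalid SQL:

```javascript
// Sketch: naive string interpolation of a value into SQL text.
const email = "o'brien@example.com"; // hypothetical value with an apostrophe

// Unquoted interpolation, like the original expression:
const bad = `INSERT INTO users (email) VALUES (${email});`;

// Even quoting by hand breaks on the embedded apostrophe:
const stillBad = `INSERT INTO users (email) VALUES ('${email}');`;

console.log(bad);      // VALUES (o'brien@example.com); -- not valid SQL
console.log(stillBad); // VALUES ('o'brien@example.com'); -- unbalanced quotes
// Parameterized queries (e.g. $1 placeholders) avoid this entirely.
```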

Also confirm that user_id has an actual UNIQUE or PRIMARY KEY constraint in your database, because ON CONFLICT does nothing without it.

1 Like

Thanks @Niffzy @achamm @OMGItsDerek @houda_ben for the replies

2 Likes

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.