Google Sheets “Append Row” drops rows under high webhook concurrency

I’m running into an issue with n8n + Google Sheets and I’m hoping someone here has solved this before.

Setup

  • Incoming webhook receives ~100 POST requests

  • Requests aren’t perfectly simultaneous, but they arrive very quickly and do overlap

  • Flow:

    1. Webhook receives request

    2. HMAC hash calculated and validated

    3. If valid → Google Sheets “Append row” node

    4. If invalid → return 403

From n8n’s execution history, all ~100 webhook executions complete successfully. No errors, no retries, no failures.

The problem
When I check the Google Sheet, I consistently end up with far fewer rows than executions:

  • Sometimes it stops at row ~56

  • Other times ~63

  • It’s not consistent, but it’s always well short of 100

No rows look corrupted — they’re just missing entirely.

What I think is happening
My hunch is that multiple executions are trying to append at the same time, and the Google Sheets node is effectively overwriting or colliding internally, even though it’s using “Append row”.

I assumed the whole point of append was to guarantee “add only, never overwrite”, but the behavior doesn’t match that expectation under load.

I’ve seen similar questions asked before, but I haven’t found a solid, confirmed solution.

What I’ve already considered / tried

  • There are no errors reported by n8n

  • Each execution reaches the Google Sheets node

  • I looked for a way to:

    • Limit the webhook to one active execution at a time, or

    • Queue incoming requests

  • As far as I can tell, n8n doesn’t currently support serializing webhook executions like that


Information on your n8n setup

  • n8n version: 2.4.6
  • Running n8n via: Cloud

Hi @ai-anythng, welcome!

This is most likely a Google Sheets concurrency/API-limit issue rather than an n8n limitation.

You can try enabling the Minimize API Calls option in the Google Sheets node.

If that doesn’t help much, then as you mentioned you’ll need to look into solutions like queuing requests, adding delays, or replacing Google Sheets altogether.


My first instinct would be to save the webhook requests to a data table as an intermediate store, and then have a second workflow running on a schedule to read the data table rows and append to the spreadsheet.

This assumes the data table is backed by a database with ACID properties, like PostgreSQL (meaning it shouldn’t drop inserts like this). I don’t know how n8n actually stores data tables, but it seems likely they use PostgreSQL, SQLite, or something similar.
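One nice side effect of this pattern: the scheduled workflow can flush all buffered rows in a single call to the Sheets API `spreadsheets.values.append` method, which also reduces API-quota pressure. A sketch of building that batched request (the spreadsheet ID and range here are placeholders):

```javascript
// Sketch: build ONE batched append request for the Google Sheets API
// (spreadsheets.values.append). Spreadsheet ID and range are placeholders.
function buildAppendRequest(rows, spreadsheetId, range = 'Sheet1!A1') {
  return {
    method: 'POST',
    url:
      `https://sheets.googleapis.com/v4/spreadsheets/${spreadsheetId}` +
      `/values/${encodeURIComponent(range)}:append?valueInputOption=RAW`,
    // "values" is a 2-D array: one inner array per row to append.
    body: JSON.stringify({ values: rows }),
  };
}
```

Sending all pending rows as one `values` array means one append per schedule tick instead of one per webhook hit.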


Solved! I figured out a solution by pre-populating rows with unique identifiers, then updating them instead of appending.

My scenario: I have multiple unique URLs (URL1, URL2, URL3) and need to capture content from each one that will ultimately be written to the Google Sheet.

Solution:

  1. First, establish all the rows by writing the URLs to column B with a single HTTP request.

This writes one URL per row in column B.
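That pre-population payload can be sketched like this (the sheet layout, blank column A, and example URLs are placeholders for illustration):

```javascript
// Sketch: pre-populate column B with one URL per row in a single
// append payload. Layout and URLs are placeholders.
const urls = ['URL1', 'URL2', 'URL3'];
const body = {
  // Each inner array is one row: column A left blank, column B the URL.
  values: urls.map(u => ['', u]),
};
```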

  2. Then, configure the Google Sheets node to UPDATE instead of append:

    • Set the matching column to column B

    • As each request comes in with the captured data, it will match against the URL in column B and update that specific row

This approach ensures each URL gets its own dedicated row that can be updated as data is retrieved, rather than creating duplicate rows with append operations.
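The match-and-update logic above amounts to something like this sketch, where `sheet` stands in for the sheet's cell values and the target column for the captured data is an assumption:

```javascript
// Sketch of "match on column B, then update that row".
// `sheet` is a 2-D array of cell values; index 1 is column B.
function updateRowByUrl(sheet, url, capturedData) {
  const rowIndex = sheet.findIndex(row => row[1] === url);
  if (rowIndex === -1) {
    throw new Error(`No pre-populated row for ${url}`);
  }
  // Write the captured content into column C of the matched row;
  // the target column is an assumption for illustration.
  sheet[rowIndex][2] = capturedData;
  return rowIndex;
}
```

Because each incoming request touches only its own pre-existing row, concurrent executions no longer race over where the "next" row is.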
