Webhook stops after ~50-100 requests

Hi community,
I have a problem with my webhook. If I send data to the n8n webhook manually through Postman, one request per second, it works perfectly and the workflow runs to the end.
But if I start transferring data from our ERP system with over 10,000 requests, the n8n webhook stops after 50-100 requests and nothing happens after that. If I only transfer about 10 requests from the ERP system, everything works fine with the workflow.
I also tried just storing the requests in a Supabase database without the rest of the workflow; then I received about 100 requests before it stopped.
I don't get any error messages in Executions. The system simply stops and I need to restart the server or the Docker Compose container. After a restart, the n8n webhook receives the next 50-100 requests and stops again.
How can I fix this? I already tried adding these parameters to docker-compose.yaml, but nothing changed:

  • EXECUTIONS_PROCESS=main
  • EXECUTIONS_MAX_CONCURRENT=10
  • EXECUTIONS_TIMEOUT=3600
  • N8N_CONCURRENCY_PRODUCTION_LIMIT=10

n8n version: 1.115.3, Docker Compose installation on a local/on-prem Ubuntu 24.04.3 LTS
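For reference, this is roughly where those variables would sit in the compose file. This is only a sketch of the placement; the service name, image tag, and port mapping are assumptions, not taken from the actual setup, and whether each variable is still honored by the n8n version in use should be checked against the n8n docs:

```yaml
# Sketch of a docker-compose.yaml service block carrying the variables above.
# Service name, image tag, and ports are illustrative assumptions.
services:
  n8n:
    image: n8nio/n8n:1.115.3
    ports:
      - "5678:5678"
    environment:
      - EXECUTIONS_PROCESS=main
      - EXECUTIONS_MAX_CONCURRENT=10
      - EXECUTIONS_TIMEOUT=3600
      - N8N_CONCURRENCY_PRODUCTION_LIMIT=10
```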

{
  "nodes": [
    {
      "parameters": {
        "httpMethod": "POST",
        "path": "LNItemMasterBOD",
        "authentication": "basicAuth",
        "options": {}
      },
      "type": "n8n-nodes-base.webhook",
      "typeVersion": 2.1,
      "position": [1040, -192],
      "id": "9ec7b8b1-f83f-443e-9b07-428df40f6ad1",
      "name": "Webhook ItemMasterBOD",
      "webhookId": "03bfcfdb-73f1-4a77-bc04-9805d56dbe1b",
      "alwaysOutputData": true,
      "credentials": {
        "httpBasicAuth": {
          "id": "XbomEHXiwndMjFXr",
          "name": "Infor IONAPI"
        }
      }
    },
    {
      "parameters": {
        "tableId": "LnItemMasterBOD",
        "fieldsUi": {
          "fieldValues": [
            {
              "fieldId": "content",
              "fieldValue": "={{ $json.body }}"
            }
          ]
        }
      },
      "type": "n8n-nodes-base.supabase",
      "typeVersion": 1,
      "position": [1328, -192],
      "id": "428241d3-7fc3-4082-bbb2-866c24bb2426",
      "name": "Create a row",
      "credentials": {
        "supabaseApi": {
          "id": "tfWblrTxBgvcFWH3",
          "name": "Supabase account"
        }
      }
    }
  ],
  "connections": {
    "Webhook ItemMasterBOD": {
      "main": [
        [
          {
            "node": "Create a row",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "pinData": {},
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "40f96293e3b984cbb67b59071200c7f03ee39e4c36e9741a9ac9ea582acda716"
  }
}

Hi @agriaIT,

Based on what I'm hearing, you are probably overwhelming your n8n instance with too many requests at once, causing it to become unresponsive and fall over. A better way to process a large dataset like this is to store the 10,000 records in a file or a database somewhere, and then run a scheduled workflow that picks the data up in batches and processes it. That way a single workflow execution deals with manageable chunks of data instead of spawning a new execution for every webhook call; starting a new execution per record is a lot more expensive on your server resources.
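To make the batching idea concrete, here is a minimal sketch in Python: instead of one webhook call per record, the staged records are pulled and handled in fixed-size chunks. The record source and `process_batch()` are placeholders, not part of any real n8n setup:

```python
# Minimal sketch of batch processing: handle the staged records in
# fixed-size chunks instead of firing one request per record.

def chunked(records, size):
    """Yield successive chunks of `size` records from a list."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

def process_batch(batch):
    # Placeholder: a real implementation would insert the batch into the
    # database or hand it to the next workflow step in one go.
    return len(batch)

# Stand-in for the 10,000 records exported from the ERP system.
records = [{"id": i} for i in range(10_000)]

processed = 0
for batch in chunked(records, 500):
    processed += process_batch(batch)

print(processed)  # → 10000
```

With a chunk size of 500 this turns 10,000 webhook calls into 20 workflow iterations, which is the resource saving described above.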


Hi @Wouter_Nigrini,

that could be a way, but it makes things more complex for me, and our ERP is not flexible enough to do that.

Is it a memory limit in n8n, or what else could cause the webhook to stop? Is there a system error log for n8n anywhere?

How does your current ERP work, i.e. how do you receive the 10k records?

In short, yes, but it is not a simple answer. No matter what system you use, if you want to handle large datasets efficiently you will always need to build it the right way. Unfortunately that does mean a bit more complexity, but it will be more performant and robust in the end.

How large is each record you're trying to process? This also has an impact, as n8n is built on top of Node.js, and Node.js/JavaScript are not the most performant technologies out there for heavy data processing.

Thank you for your idea. I solved it with XML files: I read them in batches of 500, store the content in Supabase, generate a single SQL statement out of each batch (because my provider blocks me if I send too many SQL statements per minute over SSH), and send it to my external MySQL database.
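The "one SQL statement per batch" step could be sketched like this: combining a batch of records into a single multi-row INSERT instead of sending one statement per record. The table and column names here are assumptions for illustration, not the poster's actual schema, and real code should prefer a driver's parameter binding over string interpolation:

```python
# Sketch: build one multi-row INSERT for a whole batch of records,
# so only a single statement goes over the wire per batch.

def build_insert(table, columns, rows):
    """Build a single multi-row INSERT statement for a batch of rows."""
    def quote(value):
        # Minimal escaping for the sketch; use parameterized queries
        # in production instead of manual quoting.
        return "'" + str(value).replace("'", "''") + "'"

    values = ",\n".join(
        "(" + ", ".join(quote(row[col]) for col in columns) + ")"
        for row in rows
    )
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES\n{values};"

# Hypothetical batch; field names are illustrative only.
batch = [
    {"item": "A-100", "description": "Widget"},
    {"item": "A-101", "description": "O'Ring"},
]
sql = build_insert("LnItemMasterBOD", ["item", "description"], batch)
print(sql)
```

With batches of 500 this keeps the statements-per-minute count low enough to stay under a provider's rate limit.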

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.