Microsoft SQL Node Insert Problem - Bug


I am running n8n version 1.14.2.

I am trying to insert an XLS file into a Microsoft SQL database after parsing it and using the Set node, but it throws the following error.

The operation I use in the Microsoft SQL node is Insert, and I just specify the columns below.

The file has 4376 rows and 4 columns. On the previous n8n version 1.9.3 it worked fine no matter how many rows there were.

I tried using a Loop Over Items node with the latest version of n8n, set to 1500 items per batch. The same error happens.

It only works if I set the batch size to 524 or below.

That is not the right behavior; it does not make sense.

ERROR: The incoming request has too many parameters. The server supports a maximum of 2100 parameters. Reduce the number of parameters and resend the request.

RequestError: The incoming request has too many parameters. The server supports a maximum of 2100 parameters. Reduce the number of parameters and resend the request.
    at handleError (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/node_modules/mssql/lib/tedious/request.js:384:15)
    at Connection.emit (node:events:517:28)
    at Connection.emit (node:domain:489:12)
    at Connection.emit (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/node_modules/tedious/src/connection.ts:1906:18)
    at RequestTokenHandler.onErrorMessage (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/node_modules/tedious/src/token/handler.ts:388:21)
    at Readable.<anonymous> (/usr/local/lib/node_modules/n8n/node_modules/n8n-nodes-base/node_modules/tedious/src/token/token-stream-parser.ts:23:7)
    at Readable.emit (node:events:517:28)
    at Readable.emit (node:domain:489:12)
    at addChunk (node:internal/streams/readable:335:12)
    at readableAddChunk (node:internal/streams/readable:308:9)
    at Readable.push (node:internal/streams/readable:245:10)
    at next (node:internal/streams/from:98:31)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
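For context on the numbers in this thread: a parameterized insert uses one SQL parameter per cell, so a batch consumes rows × columns parameters, and sp_executesql itself takes its own arguments out of the 2100 slots. A minimal sketch of the arithmetic, assuming two slots of overhead (that headroom is an assumption, not something confirmed in the thread):

```javascript
// Rough sketch of the parameter-count arithmetic (not n8n code).
// Assumption: sp_executesql's own @stmt/@params arguments consume
// two of SQL Server's 2100 parameter slots, leaving 2098 for values.
const USABLE_PARAMS = 2100 - 2;

// Largest number of rows per INSERT that stays under the limit,
// given that each row consumes one parameter per column.
function maxRowsPerBatch(columnCount) {
  return Math.floor(USABLE_PARAMS / columnCount);
}

// Split rows into batches of that size.
function chunkRows(rows, columnCount) {
  const size = maxRowsPerBatch(columnCount);
  const batches = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}
```

With 4 columns this gives 524 rows per batch, which lines up with the observation above that 524 works and 525 fails; with 5 columns it gives 419, matching the 419-vs-420 behavior reported later in the thread.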

Hey @Mulen,

It looks like this is actually a SQL Server limitation where you are only allowed to use 2100 parameters (Maximum capacity specifications for SQL Server - SQL Server | Microsoft Learn).

It looks like there could be ways to get around this by chunking the data, which is something we should look into, but for now your approach is the best method to take.

@Jon This is clearly a bug.

I spun up the older 1.9.3 version of n8n and did exactly the same thing. The Excel file with the 4376 rows was inserted without the Loop Over Items node.

Please check it.


Hey @Mulen,

Interesting. When I did a quick search on the error, it showed up as a documented SQL Server limit. I will check whether we made any changes to the MSSQL node, but it will likely be the same thing. We would need to look at chunking the data into smaller pieces to work around the limit, but for now the split-in-batches approach is the best workaround.

Thanks for the reply.

For the time being I will use the Loop Over Items node, but as you can see it works just fine with the older version 1.9.3. Something definitely must have changed in the latest version.

There have been a few changes I suspect it changed as part of this: fix(Microsoft SQL Node): Prevent SQL injection (#7467) · n8n-io/n8n@a739245 · GitHub

Thanks. So does that mean you will look at it internally and maybe provide a fix?

Hey @Mulen,

Yeah, I will get a ticket created so I don’t forget but I suspect this will be a low priority.


Hello Jon.

I don’t think it should be low priority. This issue is very counterproductive; MSSQL can handle many inserted rows at once.

Look at this example with the Debug Helper node.

It can insert 419 rows but not 420 rows.

The version of MSSQL Server I am running is Microsoft SQL Server 2016 (SP2).

With the older 1.9.3 version you could insert at least 100,000 rows in a single batch.

Please make it work like it used to.

Hey @Mulen,

The difference is likely the change to parameters to help prevent SQL injection. In theory you could also use the Execute Query option, which should handle it OK.

When it comes to parameter inputs, MSSQL has a limited number it can handle in one transaction. To fix this we may need to either revert the change we made or handle the batching inside the node instead of outside of it.
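For anyone curious what in-node chunking could look like: below is a minimal sketch (not the actual node code; the table and column names are illustrative) that builds one parameterized multi-row INSERT per chunk, so each statement's parameter count can be kept below the limit:

```javascript
// Hypothetical sketch of a chunked parameterized insert (not n8n's code).
// Builds the SQL text and a flat { name: value } parameter map for one
// chunk of rows; each cell gets its own @rXcY placeholder.
function buildInsert(table, columns, rows) {
  const params = {};
  const valueGroups = rows.map((row, r) => {
    const placeholders = columns.map((_col, c) => {
      const name = `r${r}c${c}`;
      params[name] = row[c];
      return `@${name}`;
    });
    return `(${placeholders.join(', ')})`;
  });
  const sql =
    `INSERT INTO ${table} (${columns.join(', ')}) ` +
    `VALUES ${valueGroups.join(', ')}`;
  return { sql, params };
}
```

Each chunk would then be sent with the mssql driver by calling `request.input(name, value)` for every entry in `params` and running `request.query(sql)`, with the chunk size chosen so rows × columns stays below the 2100-parameter limit.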

Our priority is set based on how popular the node is, how many users may be impacted by the issue based on telemetry data, and some other factors. In this case, when I created NODE-909 (our dev ticket) and entered the information for scoring, it came out as low priority at the moment, so we will fix this, but there is currently a workaround available.

If we start to see more reports of this error the priority will be raised. Just because something is marked as low doesn’t mean that someone won’t pick it up and put out a fix for it.


I'm experiencing exactly the same issue with the latest version, and can also confirm it works with version 1.9.3.

Please confirm a cause and fix.


Hope they do. I had to change many workflows because of this, underutilizing the abilities of n8n due to this “bug”.


I’m having the same problem with n8n version 1.17.1.

Yes, it’s affecting everyone who uses this node functionality with the latest versions of n8n. Hope they fix it.

Just to add… This doesn’t appear to be impacting everyone, but it has the potential to, depending on the number of items being sent.

I have checked the ticket and this is still in backlog to be picked up.

I’m having the same problem with n8n version 1.18.0


Until this is fixed, you have to rebuild the workflow and use the batch (Loop Over Items) node to do small inserts each time.

I applied your temporary solution, thanks for that! But the problem continues in version 1.18 as well.

Hey @Daniel_E,

That is sadly correct. We have not yet picked up the ticket to look at why we are sending more than SQL Server can handle.

I have just raised the priority of this ticket as well.

Hi all,

Any news on this issue?