ERROR: stdout maxBuffer length exceeded

Describe the issue/error/question

I am running a bash command in an Execute Command node. When executing the node, it keeps displaying "maxBuffer length exceeded".

Is there any way I can increase the maxBuffer in n8n/npm? I am running the n8n desktop client on macOS.

What is the error message (if any)?

ERROR: stdout maxBuffer length exceeded

Please share the workflow

Share the output returned by the last node

NodeOperationError: stdout maxBuffer length exceeded
    at Object.execute (/Applications/n8n.app/Contents/Resources/app/node_modules/n8n-nodes-base/dist/nodes/ExecuteCommand/ExecuteCommand.node.js:74:27)
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at async Workflow.runNode (/Applications/n8n.app/Contents/Resources/app/node_modules/n8n-workflow/dist/src/Workflow.js:594:28)
    at async /Applications/n8n.app/Contents/Resources/app/node_modules/n8n-core/dist/src/WorkflowExecute.js:537:49

Information on your n8n setup

  • n8n version: 1.6.0
  • Database you’re using (default: SQLite):
  • Running n8n with the execution process [own(default), main]:
  • Running n8n via [Docker, npm, n8n.cloud, desktop app]: desktop app

Hi @sectest1983,
welcome to the community :tada:

Hmmm sadly I don’t see us setting a maxBuffer size in our ExecuteCommand.ts, so I don’t have a solution for you here.
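
For context, this is roughly what happens at the Node.js level when exec() runs without a maxBuffer option; the shell command below is just a throwaway way to produce a couple of MiB of output:

import { exec } from "node:child_process";

// Without an explicit maxBuffer, exec() falls back to Node's default limit
// (1 MiB on current Node versions). Once the child writes more than that to
// stdout, Node terminates it and the callback receives the error seen above.
exec("head -c 2097152 /dev/zero | base64", (error, stdout) => {
  if (error) {
    console.error(error.message); // "stdout maxBuffer length exceeded"
    return;
  }
  console.log(`got ${stdout.length} characters`);
});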

@MutedJam Do you know anything about this?

Hello @sectest1983,

Maybe try to redirect the output of the command to a file and then read the file.
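
A rough sketch of that idea at the Node.js level, with a made-up tool name, switch, and file paths; in a workflow the redirect would go into the Execute Command node’s command and the read step would be a separate node:

import { exec } from "node:child_process";
import { readFile } from "node:fs/promises";

// The shell writes the tool's (potentially huge) output straight to a file, so
// exec() only buffers an essentially empty stdout and never hits maxBuffer.
exec("/usr/local/bin/tool1 --input /tmp/input.txt > /tmp/tool1-output.txt", async (error) => {
  if (error) {
    console.error(error.message);
    return;
  }
  // Read the result back from disk afterwards (in n8n, e.g. a Read Binary File node).
  const output = await readFile("/tmp/tool1-output.txt", "utf8");
  console.log(`read ${output.length} characters from disk`);
});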

Hi,

Same error.

It seems that whenever I run a command-line tool that reads from a file or stdin/stdout, it fails with this error. (It seems to be related to Go binaries; when executing the native cat command, it reads the file perfectly.)

e.g.

I can run the command /usr/local/bin/tool1 -h
and n8n displays the help output, but when I supply a switch to read from a file or stdin, it fails. Copy-pasting the same command into the host terminal works perfectly fine.

Hey guys, tbh I am not sure if this is an n8n limitation, a Node.js limitation or happening on another level. The error seems to suggest there’s too much being written to stdout though.

@sectest1983, are you getting the error when writing into a file? Or only when reading the file? If it’s the latter, are you trying to read the file using the Execute Command node? If so, could you try using the Read Binary File node instead (this should avoid using stdout)?

Hi @MutedJam

I did some more testing and also tried the Read Binary File node, with the same result. But I have additional details from that testing.

When I run the command against a file with only one entry/line, it does not throw the maxBuffer error. Instead, the node just hangs forever in “executing”, with no feedback.

So it seems I have two error symptoms:
  • When the text file input contains many lines, it throws the maxBuffer error (not sure about the limit).
  • When the text file input contains only one line, the n8n node seems to hang in “executing…”.

The above only seems to happen when running a binary compiled with Go.

It occurs when I try to execute certain Go binaries that read from input; when I just pass a --help switch to the Go-compiled binary, it prints the help message to stdout fine.

I need to run some Go utilities in a workflow that read from either an input file or stdin. I tried native curl on macOS and a basic python3 script, which seem to execute fine.

When I tried a python3 tool that has a lot of third-party Python modules, the node fails to execute because it could not locate the already-installed modules that are present in the host OS.

I am running the n8n desktop app and have replicated the same issue in the self-hosted npm version.

I also tried to debug with the n8n environment variables for logging, but when using the desktop app and reproducing the problem, it did not log anything, even though verbose level was configured.

It also seems that the n8n Execute Command node only has limited access to the host system's variables, modules, etc. Is there any way to give n8n full access, so that it can e.g. access PATH and modules, and I do not have to type in full file locations or run into missing Python modules?

Thank you for assisting…

Hey,
I’ve got the same issue here with Execute Command,
simply executing: powershell "Get-ADUser -Filter * -Properties * | ConvertTo-Json".

A few weeks ago it was working like a charm; since I updated to the latest version (n8n & npm) I’ve got this issue:

NodeOperationError: stdout maxBuffer length exceeded
    at Object.execute (C:\Users\jeremy_bepoix\AppData\Roaming\npm\node_modules\n8n\node_modules\n8n-nodes-base\dist\nodes\ExecuteCommand\ExecuteCommand.node.js:76:27)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at async Workflow.runNode (C:\Users\jeremy_bepoix\AppData\Roaming\npm\node_modules\n8n\node_modules\n8n-workflow\dist\src\Workflow.js:645:28)
    at async C:\Users\jeremy_bepoix\AppData\Roaming\npm\node_modules\n8n\node_modules\n8n-core\dist\src\WorkflowExecute.js:557:53

It seems your command returns more output than n8n can handle, as described by @marcus above. Any chance you could write the result to a file rather than to stdout?

Hi,
I reduced the output of the LDAP request with filters and it works.
Just selecting only active users reduces it a lot:
Get-ADUser -Filter {enabled -eq $true} -Properties pwdLastSet | ConvertTo-Json


I also have this issue. It appears n8n is using child_process.exec, which has limitations on buffer size, see here. I’m guessing n8n does not provide any means of passing arguments into these child process calls?

I have tried techniques like stdout redirection and unbuffer, with no effect. I think many people doing serious data processing in an n8n workflow will run into this.
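
For reference, the limit the error refers to is just the maxBuffer option of child_process.exec(), which, as noted above, the Execute Command node does not seem to expose. A minimal sketch of how it behaves when it can be set:

import { exec } from "node:child_process";

// Raising maxBuffer (here to 64 MiB) lets an oversized output through; the
// throwaway command below produces a few MiB on stdout. An alternative at this
// level would be spawn(), which streams stdout instead of buffering it and
// therefore has no such limit.
exec(
  "head -c 2097152 /dev/zero | base64",
  { maxBuffer: 64 * 1024 * 1024 },
  (error, stdout) => {
    if (error) {
      console.error(error.message);
      return;
    }
    console.log(`got ${stdout.length} characters without hitting the limit`);
  },
);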