Trimmed big console data output from Execute Command node > 1 MB overall size

Hi all!
I’m trying to run n8n in a Docker container.
I need to fetch a big JSON file (about 5 MB) from a shell command.
But after the script runs, I only get 1 084 KB at most.

So, how can I increase the maximum output size?

Welcome to the community @TFL!

Can you please tell me some more about the exact problem you are facing? I do not really understand what kind of limit you mean.
What exactly is happening? Do you get some kind of error message?

I tried to recreate your issue by generating a JSON file (about 4.5 MB in my example).
So I generate it, write it to the hard drive as a JSON file, and then read the same file back. Everything works perfectly, no matter which Docker container I use (I tried both the default Alpine-based and the Ubuntu-based one).

Here is my test workflow:

{
  "nodes": [
    {
      "parameters": {
        "fileName": "/tmp/test.json"
      },
      "name": "Write Binary File",
      "type": "n8n-nodes-base.writeBinaryFile",
      "typeVersion": 1,
      "position": [
        830,
        300
      ]
    },
    {
      "parameters": {
        "mode": "jsonToBinary",
        "options": {}
      },
      "name": "Move Binary Data",
      "type": "n8n-nodes-base.moveBinaryData",
      "typeVersion": 1,
      "position": [
        650,
        300
      ]
    },
    {
      "parameters": {
        "filePath": "/tmp/test.json"
      },
      "name": "Read Binary File",
      "type": "n8n-nodes-base.readBinaryFile",
      "typeVersion": 1,
      "position": [
        450,
        500
      ]
    },
    {
      "parameters": {
        "options": {}
      },
      "name": "Move Binary Data1",
      "type": "n8n-nodes-base.moveBinaryData",
      "typeVersion": 1,
      "position": [
        650,
        500
      ]
    },
    {
      "parameters": {
        "functionCode": "const returnItems = [];\n\nlet item = {};\nfor (let j=0;j<300000;j++) {\n  item[j] = j;\n}\n\nreturn [\n  {\n    json: item\n  }\n];\n"
      },
      "name": "Generate Test Data",
      "type": "n8n-nodes-base.function",
      "typeVersion": 1,
      "position": [
        450,
        300
      ]
    }
  ],
  "connections": {
    "Move Binary Data": {
      "main": [
        [
          {
            "node": "Write Binary File",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Read Binary File": {
      "main": [
        [
          {
            "node": "Move Binary Data1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Generate Test Data": {
      "main": [
        [
          {
            "node": "Move Binary Data",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}

The command in my Execute Command node:

ssh [email protected] "cat big-json-file.json"


In the preview data I get my trimmed JSON :disappointed_relieved:

/home/node # ls -lah /home/node/test.json 
-rwxr-xr-x    1 root     node        1.6M Jul 30 09:53 /home/node/test.json

A simple example.

Hmm, that issue looks unrelated to n8n. It seems more like it has to do with the terminal or with ssh. Sadly, I do not really know how to fix it.
What you could try instead is to copy the file locally with scp, read it, and then delete it again. That should work with any file size.


Yes, it works great with readBinaryFile.
Thank you!

So, if the stdout size is over 1 MB, we can’t use a pipe.
Maybe it’s a Node.js restriction?