Console output from the Execute Command node trimmed above ~1 MB total size

Hi all!
I'm trying to run n8n in a Docker container, and I need to get a big JSON data file from a shell command; it's about 5 MB.
But after the script runs, I get only 1,084 KB at most.

So, how can I increase the maximum output size?

Welcome to the community @TFL!

Can you please tell me some more about the exact problem you are facing? I do not really understand what kind of limit you mean.
What exactly is happening? Do you get some kind of error message?

I tried to recreate your issue by generating a JSON file (4.5 MB in my example).
I generate it, write it to disk as a JSON file, and then read the same file back. Everything works perfectly, no matter which Docker container I use (I tried the default Alpine-based one and also the Ubuntu-based one).
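
A rough shell equivalent of what I did (the file name, path, and row count here are illustrative, not my exact workflow):

```sh
# Generate a JSON file of a few MB...
node -e '
const rows = [];
for (let i = 0; i < 40000; i++) {
  rows.push({ id: i, payload: "x".repeat(100) });
}
require("fs").writeFileSync("/home/node/test.json", JSON.stringify(rows));
'
# ...then read it back; neither step trims the data
ls -lah /home/node/test.json
wc -c < /home/node/test.json
```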

Here is my test workflow:

The command in the Execute Command node:

ssh user@host "cat big-json-file.json"


In the data preview I get my trimmed JSON :disappointed_relieved:

/home/node # ls -lah /home/node/test.json 
-rwxr-xr-x    1 root     node        1.6M Jul 30 09:53 /home/node/test.json

A simple example.

Hm, that issue looks unrelated to n8n. It seems more like it has to do with the terminal or with ssh. Sadly, I do not really know how to fix it.
What you could try instead is to copy the file locally with scp, read it, and then delete it again. That should work with any file size.
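
A minimal sketch of that workaround, reusing the host and file name from your command above (both are placeholders):

```sh
# 1. Copy the file locally instead of piping it through stdout
scp user@host:big-json-file.json /home/node/big-json-file.json

# 2. Read /home/node/big-json-file.json in n8n with the Read Binary File node

# 3. Delete the local copy again
rm /home/node/big-json-file.json
```

Since scp writes straight to disk, nothing has to pass through the captured stdout, so the size limit never applies.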


Yes, it works great with readBinaryFile.
Thank you!

So if stdout is over 1 MB in size, we can't use a pipe.
Maybe it's a Node.js restriction?
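
Quite possibly: if n8n runs the command through Node's child_process.exec (an assumption on my part), its maxBuffer option, which defaults to 1 MiB since Node 12, would explain a cutoff right around 1 MB. A small sketch to demonstrate the behavior outside of n8n (the 2 MiB test command is just an illustration):

```sh
node -e '
const { exec } = require("child_process");

// ~2 MiB of output: exceeds the 1 MiB default maxBuffer,
// so exec() kills the child and reports an error
exec("head -c 2097152 /dev/zero", (err, stdout) => {
  if (err) console.error("default maxBuffer:", err.message);
  else console.log("default maxBuffer:", stdout.length, "bytes");
});

// Raising maxBuffer lets the full output through
exec("head -c 2097152 /dev/zero", { maxBuffer: 10 * 1024 * 1024 }, (err, stdout) => {
  if (err) console.error("raised maxBuffer:", err.message);
  else console.log("raised maxBuffer:", stdout.length, "bytes");
});
'
```

Exceeding maxBuffer makes exec() terminate the child process and return an error with stdout truncated at the limit, which matches the behavior you saw; raising the option is how Node itself gets around it.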