How do I prevent giant JSON blobs on stdout?

I have an n8n instance running under Docker. Workflows are run via a Jenkins job that essentially runs node n8n execute --file=/path/to/workflow.json in the container.

I have configured logging to send all logs to a file and to log errors only, but each run still produces up to 4.5GB of console logs in Jenkins, where n8n has written enormous blobs of JSON to stdout.

The pertinent config environment vars are:

EXECUTIONS_DATA_SAVE_MANUAL_EXECUTIONS: "false"
EXECUTIONS_DATA_SAVE_ON_ERROR: none
EXECUTIONS_DATA_SAVE_ON_PROGRESS: "false"
EXECUTIONS_DATA_SAVE_ON_SUCCESS: none
N8N_LOG_FILE_LOCATION: /var/log/app/n8n.log
N8N_LOG_FILE_MAXCOUNT: 10
N8N_LOG_FILE_MAXSIZE: 16
N8N_LOG_LEVEL: error
N8N_LOG_OUTPUT: file

How do I get rid of the JSON blobs in stdout, regardless of whether a workflow has errored or not?

This is n8n 0.227.1 in docker.

Never mind, it looks like {"settings": { "saveExecutionProgress": true }} being set on the workflow itself might be what was overriding my global config :slight_smile:
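For anyone hitting the same thing, here is a minimal pre-processing sketch that strips the per-workflow override before the Jenkins job hands the file to n8n execute, so the instance-wide EXECUTIONS_DATA_SAVE_* settings apply. The helper name clear_save_progress is my own; it assumes the workflow JSON carries a top-level "settings" object, which is how the override appeared in my case.

```python
import json


def clear_save_progress(workflow: dict) -> dict:
    """Drop a workflow-level saveExecutionProgress override, if present,
    so the global EXECUTIONS_DATA_SAVE_* env vars take effect again."""
    settings = workflow.get("settings") or {}
    settings.pop("saveExecutionProgress", None)
    workflow["settings"] = settings
    return workflow


# Example: a minimal workflow document carrying the override.
wf = json.loads('{"name": "demo", "settings": {"saveExecutionProgress": true}}')
wf = clear_save_progress(wf)
print(json.dumps(wf["settings"]))
```

In the Jenkins job you would load /path/to/workflow.json, run it through the helper, and write it back before invoking n8n execute.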
