The data column of the execution_data table is too big

Describe the problem/error/question

Is there a way to reduce the size of the data column of the execution_data table in PostgreSQL? It would also be preferable for us to end up with an empty data column rather than the full binary data stored in it.

n8n saves executions to the execution_data table, including the data column. Sometimes the request contains a binary file, which ends up in that column. Because of this, the execution_data table in our PostgreSQL database grows very large.

Our workflow settings look like this:

What is the error message (if any)?

Please share your workflow

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

Share the output returned by the last node

Information on your n8n setup

- **n8n version:** 1.31.0
- **Database (default: SQLite):** PostgreSQL
- **n8n EXECUTIONS_PROCESS setting (default: own, main):** own
- **Running n8n via (Docker, npm, n8n cloud, desktop app):** Kubernetes
- **Operating system:** OpenShift 4.12

Hello @Aydemir,

You can reduce the DB size by disabling the “Save successful production executions” option in the workflow settings.

Looking at the workflow, it's not just sometimes: the binary data is saved every time after the file is downloaded.

I'm also wondering why you need the Code node here, especially the very old one from the 0.x version.

You could also configure some environment variables to tune the execution-saving behaviour.
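For example, a minimal sketch of the env section for the n8n container in a Kubernetes Deployment, based on the execution-data variables documented by n8n (double-check the exact names and defaults against the docs for your version, 1.31.0):

```yaml
# Sketch: env section of the n8n container in a Kubernetes Deployment.
# Variable names follow the n8n docs; verify them for v1.31.0 before applying.
env:
  - name: EXECUTIONS_DATA_SAVE_ON_SUCCESS   # don't keep execution data for successful runs
    value: "none"
  - name: EXECUTIONS_DATA_SAVE_ON_ERROR     # keep data for failed runs so you can still debug
    value: "all"
  - name: EXECUTIONS_DATA_SAVE_ON_PROGRESS  # don't write intermediate execution data
    value: "false"
  - name: EXECUTIONS_DATA_PRUNE             # periodically delete old executions
    value: "true"
  - name: EXECUTIONS_DATA_MAX_AGE           # prune executions older than 7 days (value in hours)
    value: "168"
```

As far as I know, the workflow-level save settings (like the option mentioned above) take precedence over these instance-wide defaults.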


Thanks a lot!
