Database.sqlite is very big, how can I reduce it?

Describe the issue/error/question

I use n8n with Docker and the default settings, but I found that database.sqlite is very big. How can I reduce its size?

What is the error message (if any)?

Please share the workflow

I just made some workflows that run every 10 seconds, and after about 24 hours I found the database file had grown very large.

Share the output returned by the last node

Information on your n8n setup

  • **n8n version:** 0.184.0
  • **Database you’re using (default: SQLite):** SQLite
  • **Running n8n with the execution process [own (default), main]:** own
  • **Running n8n via [Docker, npm, n8n.cloud, desktop app]:** Docker

Hi @teshcgv, the maintenance steps to clean up your database are described here in the docs. Since you are using SQLite, make sure you also check the part in the Keep in mind box.
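
In case it helps, here is a rough sketch of the SQLite part: the point of the Keep in mind box is that deleting execution rows alone does not shrink the file, so a VACUUM is needed afterwards. The container name and volume path below are assumptions based on a default Docker setup, so adjust them to yours:

```bash
# Stop n8n first so nothing writes to the database while it is rewritten.
docker stop n8n

# VACUUM rebuilds database.sqlite and returns freed pages to the filesystem.
# The path assumes the default ~/.n8n volume mount; adjust if yours differs.
sqlite3 ~/.n8n/database.sqlite "VACUUM;"

docker start n8n
```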

This cleanup can take a rather long time depending on how much data you have. Going forward, you might want to consider only saving executions with errors to avoid having your database grow too fast. You can achieve this on a global level by setting the EXECUTIONS_DATA_SAVE_ON_SUCCESS=none environment variable.
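
A minimal sketch of what that could look like in a Docker setup, assuming the standard n8nio/n8n image; the extra pruning variables are taken from the docs and should be double-checked for your n8n version:

```bash
# Only keep failed executions and prune old execution data automatically.
# Variable names are assumptions based on the n8n docs; verify for your version.
docker run -d --name n8n \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  -e EXECUTIONS_DATA_SAVE_ON_SUCCESS=none \
  -e EXECUTIONS_DATA_SAVE_ON_ERROR=all \
  -e EXECUTIONS_DATA_PRUNE=true \
  -e EXECUTIONS_DATA_MAX_AGE=168 \
  n8nio/n8n
```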

Hi @MutedJam , got it, thanks very much!


You are most welcome, let us know if you run into any trouble with this!