Reducing Disk Usage (Docker Compose, SQLite)

Describe the issue/error/question

n8n is currently taking up a lot of space on the server (~6.5 GB). I have:

  1. Followed the instructions here (Database.sqlite Error: database or disk is full) from Jon and added the following to my docker-compose.yml file:
 - EXECUTIONS_DATA_SAVE_ON_ERROR=all
 - EXECUTIONS_DATA_SAVE_ON_SUCCESS=none
 - EXECUTIONS_DATA_SAVE_ON_PROGRESS=true
 - EXECUTIONS_DATA_SAVE_MANUAL_EXECUTIONS=false
 - EXECUTIONS_DATA_PRUNE=true
 - EXECUTIONS_DATA_MAX_AGE=90
  2. Also added this to my docker-compose.yml file:
 - DB_SQLITE_VACUUM_ON_STARTUP=true
  3. Then saved and restarted Docker (docker-compose down, docker-compose up -d), but a very large amount of disk space is still being used:

[screenshot: disk usage output showing /dev/vda1]

See the /dev/vda1 partition above.
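For context, these variables sit under the n8n service's environment: key in docker-compose.yml; a minimal sketch (the service name, image tag and volume mapping here are illustrative, not my exact file):

version: "3"

services:
  n8n:
    image: n8nio/n8n                  # illustrative; use whatever tag you actually run
    restart: always
    environment:
      - EXECUTIONS_DATA_PRUNE=true
      - EXECUTIONS_DATA_MAX_AGE=90
      - DB_SQLITE_VACUUM_ON_STARTUP=true
    volumes:
      - ~/.n8n:/home/node/.n8n        # database.sqlite lives in the mounted .n8n folder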

@samaritan, I can’t seem to get the logs sorted; when entering that command I get unknown flag: --log-opt, so I must be doing something wrong there.

Information on your n8n setup

  • n8n version: 0.175.1
  • Database you’re using (default: SQLite): I assume SQLite, but not 100% sure (set-up on DigitalOcean using Docker Compose)
  • Running n8n with the execution process [own(default), main]: main
  • Running n8n via [Docker, npm, n8n.cloud, desktop app]: Docker

Hi,

I don’t know the details of your infrastructure and settings, but if you’ve installed n8n on Docker, I would suggest handling logging with the Docker logging options.

I’ve used the parameters below to limit log growth in Docker.

--log-opt max-size=10m --log-opt max-file=5
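Note that --log-opt is a flag for docker run / the Docker daemon, so docker-compose will not accept it on the command line. In a Compose file the equivalent goes under the service's logging: key, roughly like this (service name is just an example):

  n8n:
    logging:
      driver: "json-file"
      options:
        max-size: "10m"   # rotate each log file at 10 MB
        max-file: "5"     # keep at most 5 rotated files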

You can also configure n8n’s internal logging via environment variables.
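For example (variable names taken from the n8n docs; pick the level and output that suit you):

 - N8N_LOG_LEVEL=warn
 - N8N_LOG_OUTPUT=console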

Hope it helps…


@pb84 please make sure to always fill in the question template instead of simply deleting it. We ask those questions for a reason. They do not just make our life easier, they also ensure that we are able to help you properly. Without that information, we now have to do a lot of asking and guessing: what DB are you using, are you running via Docker (if so, @samaritan’s answer could help), …

For now I will simply assume you are using SQLite; in that case, pruning could be the answer. If you search for “prune” in this forum, you should find multiple posts about it. One example is here:

If that does not help, please make sure to provide all the requested information. Thanks!


Apologies, will ensure close adherence in future.

@jan, I have edited it to make it easier - thanks

Perfect, thanks a lot.

If you are not sure which database you are running, I am quite certain that it is SQLite, meaning the pruning will very likely solve your problem.

Thanks @jan - I’m trying to figure out what is making /dev/vda1 so enormous:

[screenshot: disk usage output showing /dev/vda1]

Any thoughts?

What variables did you set? How did you set them exactly? Did you restart n8n?

No worries:

Please see below for my docker-compose.yml file:

I then restarted Docker with docker-compose down, docker-compose up -d

Ah, I see now that you added that information in your initial message. Did you restart n8n once or twice? Just to be sure it first deleted the data older than 90h and then also reclaimed that space on the next vacuum.
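In practice that means restarting twice, so the prune can run first and the VACUUM on the following startup then reclaims the freed space:

docker-compose down && docker-compose up -d
# give n8n a little time to prune executions older than EXECUTIONS_DATA_MAX_AGE, then restart once more
docker-compose down && docker-compose up -d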

Did you check how large the .n8n folder actually is? Assuming it is in your home folder, that would be:

ls -lh ~/.n8n/

You should then see a line like this:

-rw-rw-r-- 1 jan jan  233M May 12 20:31 database.sqlite

That is the actual database file with its size.
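If the folder is instead mapped into the container, you can also check from inside Docker; a rough check, assuming the container is named n8n and the image keeps its data under the default /home/node/.n8n:

docker exec -it n8n ls -lh /home/node/.n8n/database.sqlite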

Only once, let me try again

So I get this:

(just note that .n8n didn’t work, so I changed it to n8n)

You would have to check the folder that you set DATA_FOLDER to. It is possible that you did not change the value and have it set to /root/n8n/.
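A quick way to confirm (the /root/n8n/ path is just the default from the DigitalOcean guide, adjust to whatever your .env actually says):

grep DATA_FOLDER .env docker-compose.yml   # shows what DATA_FOLDER resolves to
ls -lh /root/n8n/.n8n/                     # the database.sqlite in that folder is the one that matters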

I have the same setup with Docker Compose and SQLite, and I have these environment variables set:

EXECUTIONS_DATA_MAX_AGE=1
EXECUTIONS_DATA_PRUNE=true
DB_SQLITE_VACUUM_ON_STARTUP=true

No executions are deleted when I restart

@pb84 did you get prune to work?

yep all working well

@pb84 how did you get it to work please?

The original fields from my first post worked - I just hadn’t appreciated the size of some files clogging up n8n and erroneously raised this ticket. It would be worthwhile creating a new ticket if it doesn’t resolve for you.

In addition to the settings in the docker-compose.yml file that manage the size of the n8n SQLite database, it’s also important to remember to clean up old and unused Docker images. This is crucial to prevent disk space from being occupied by old versions of the n8n image.

To perform this cleanup, you can use the following command:

docker image prune -a

This command removes every image not used by an existing container, including old n8n versions. Run it periodically, perhaps as part of an update script, to prevent unnecessary images from accumulating and to keep your server’s disk usage down.
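To see how much of the disk Docker itself is taking before and after the cleanup, you can combine it with docker system df:

docker system df        # space used by images, containers, local volumes and build cache
docker image prune -a   # remove all images not referenced by an existing container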
