Cannot vacuum database.sqlite

Good day!
I tried to update n8n on my server today, but hit errors because the disk was full.
I searched the forum and figured out that the problem is database.sqlite.

But the problem is that I cannot vacuum it, because the disk is full.


What should I do?

Hey @artemik83,

Welcome to the community :cake:

It looks like you might not have enough space to perform a vacuum. When VACUUM runs it makes a copy of the data, rebuilds it, then copies the data back, so you need enough free space for that temporary copy.

You can find more information on the SQLite vacuum process here: VACUUM
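If the server's own disk is too small for that temporary copy, newer SQLite versions (3.27+) also offer VACUUM INTO, which writes the compacted copy to a new file that can live on a different, larger disk while the original file stays untouched. A throwaway sketch (all paths here are made up for the demo):

```shell
# Demo of VACUUM INTO on a scratch database (needs the sqlite3 CLI, 3.27+).
db=/tmp/vacuum-demo.db
rm -f "$db" /tmp/vacuum-demo-compact.db

# Create some data, then delete most of it so there is slack to reclaim:
sqlite3 "$db" "CREATE TABLE t(x); INSERT INTO t VALUES (1),(2),(3); DELETE FROM t WHERE x > 1;"

# VACUUM INTO writes the compacted copy to a NEW file -- point it at a
# path on whichever disk still has room:
sqlite3 "$db" "VACUUM INTO '/tmp/vacuum-demo-compact.db';"

# The copy is a complete, usable database:
sqlite3 /tmp/vacuum-demo-compact.db "SELECT count(*) FROM t;"
```

For n8n you would stop the instance first and run the same idea against its database, e.g. `sqlite3 ~/.n8n/database.sqlite "VACUUM INTO '/mnt/bigdisk/database.sqlite';"` — the `~/.n8n` and `/mnt/bigdisk` paths are assumptions, adjust to your setup.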

What I would do is export the workflows and credentials from the database, delete the database file, then start n8n and import the data again. After that, make sure the vacuum option is set and everything should be good to go. Alternatively, use this as an opportunity to move to something like Postgres instead.
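As a sketch, assuming a local install where the n8n CLI is on your PATH (under Docker you would prefix each `n8n` command with `docker-compose exec -u node <container>`), and with made-up backup paths:

```shell
# 1. Export everything while the old database is still readable:
n8n export:workflow --all --output=/backup/workflows.json
n8n export:credentials --all --output=/backup/credentials.json

# 2. Stop n8n and move the old database out of the way
#    (~/.n8n is the default data directory; yours may differ):
mv ~/.n8n/database.sqlite ~/.n8n/database.sqlite.bak

# 3. Start n8n so it creates a fresh database, then re-import:
n8n import:workflow --input=/backup/workflows.json
n8n import:credentials --input=/backup/credentials.json
```

The "vacuum option" mentioned above should be the `DB_SQLITE_VACUUM_ON_STARTUP` environment variable, if I recall the docs correctly.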


I have no space to vacuum :frowning:
How can I export all workflows and credentials?

Hey @artemik83,

You can use the CLI options here: CLI commands - n8n Documentation

If I use docker compose, do I have to write this?

docker-compose exec -u node -it root_n8n_1 export:workflow -- all

This blocks everything.
I don't know what to do :frowning:

What does the line under it say?

Hey @artemik83,

If you already have n8n running, make sure you are not trying to start it a second time; just run the export command inside the existing container with exec.
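For reference, a corrected version of the earlier command might look like this (the container name `root_n8n_1` is taken from the post above and may differ on your machine, check `docker ps`; the output path inside the container is an assumption):

```shell
# Run the export inside the already-running container instead of
# starting a second n8n process -- note the "n8n" binary name and
# "--all" (not "-- all"):
docker-compose exec -u node root_n8n_1 n8n export:workflow --all --output=/home/node/workflows.json

# Copy the result out of the container onto the host:
docker cp root_n8n_1:/home/node/workflows.json ./workflows.json
```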


Hey @artemik83,

Have you confirmed that there is only one instance of n8n currently running and nothing else attached to the database like the sqlite command tool?

I tried to update n8n today, but an error came up during the migration (saying the database takes up a lot of space).
I am a complete beginner with programming and command lines, so I don't even know how to answer you exactly.

Hey @artemik83,

Your post showed that you had manually run the sqlite command. Do you still have that session connected to the database? If in doubt I would restart the server and see if that helps.

My logs are here

I just restarted the server, same result.

The easiest option here, if you can't unlock the database, will be to copy the file to another machine and do the export there.

Have you checked how much space you have available on the machine and cleared any old docker images to see if that frees up enough space?
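A few quick, read-only commands to see where the space went (the `~/.n8n` path is the n8n default and may differ on your setup):

```shell
# Free space on the current filesystem:
df -h .

# Size of the n8n data directory, if present:
du -sh ~/.n8n 2>/dev/null || true

# How much space docker images/containers/volumes use, if docker is installed:
command -v docker >/dev/null && docker system df || true
```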

I tried to clear out old docker images with

docker system prune --all

but it didn't help.

More logs, if they help.

Are you still seeing the database locked message when using the CLI tool? You could try downgrading to an earlier version of n8n to see if that starts up and lets you export the workflows.

As a possible future option, we do have n8n cloud, where all of this is managed for you so you don't need any experience managing a server or dealing with issues like this.

How do I downgrade via docker-compose?

Hey @artemik83,

Same way you would with other docker images: in your compose file, set the version you want at the end of the image name. So if you were using the image n8nio/n8n:latest you would change it to something like n8nio/n8n:0.210.0, or whatever your previous version was.
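For example, a minimal fragment of the compose file (the service name and version number are just examples, use your own):

```yaml
services:
  n8n:
    # Pin a specific tag instead of "latest" to downgrade:
    image: n8nio/n8n:0.210.0
```

Then run `docker-compose pull` followed by `docker-compose up -d` to recreate the container on that version.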
