Delete logs in Postgres DB

Hi,

Regarding "Hard disk full - where to delete logs?": I was trying to delete old log data from the database, but unfortunately I haven't succeeded.

I'm running n8n inside a Docker container, which runs inside CapRover's master container on a Digital Ocean droplet. Inside CapRover I created a "one-click install app", and with that n8n comes with a Postgres database by default.

My problem is that I can't find a way to set up a workflow that automatically removes the log rows from the database. What I haven't tried yet is setting any of the environment variables that automatically prune the data; I'm too scared of losing data that shouldn't be lost.

Is there anyone who can help me with this?

Much thanks,

Jasper van Doorn

Hey @prikr!

Welcome to the community :tada:

I haven't used CapRover extensively with n8n. I love that project, but sometimes I feel like it lacks the flexibility I need.

Coming to your question, what logs do you want to delete? Do you want to delete the executions that get saved?

@harshil1712 Yeah, I know what you mean. But for me, as a newbie in the Docker environment, it's a good ready-to-go solution. :slight_smile:

Well, over the course of about 3 months, I noticed around 77,000 executions listed in the "execution list" inside the workflow app. I deleted them, but the database is still 96GB. So I think the logs are still there, not deleted? I can't imagine 1 workflow (with now literally 1 execution listed) takes up 96GB.

If the data got deleted, the space should be available. I am not too sure what is going on here. Did you try restarting n8n after deleting the logs? Also, you can set up the environment variables mentioned in our documentation to allow n8n to automatically delete the data based on the time you specify.
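For example, something along these lines (just a sketch; double-check the exact variable names and defaults in the docs for your n8n version):

```
# Sketch: execution-data pruning settings (verify against the docs for your version)
EXECUTIONS_DATA_PRUNE=true     # enable automatic pruning of saved execution data
EXECUTIONS_DATA_MAX_AGE=336    # maximum age of execution data in hours (here: 14 days)
```

With CapRover you would typically add these under the app's environment variables and then restart the n8n container.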

Hey @prikr

Maybe what is happening is that Postgres is keeping the files it created without freeing up the space. This is known behavior in Postgres.

Maybe running a vacuum operation can solve your problem (PostgreSQL: Documentation: 9.2: Routine Vacuuming)
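As a rough sketch of what that could look like (one thing to keep in mind: a plain VACUUM only marks dead rows as reusable, while VACUUM FULL actually returns disk space to the operating system but takes an exclusive lock on the table while it runs):

```sql
-- Sketch: reclaim space in the executions table (table name taken from a default n8n schema)
VACUUM VERBOSE execution_entity;   -- marks dead rows as reusable space, doesn't shrink the files
VACUUM FULL execution_entity;      -- rewrites the table and gives space back to the OS (exclusive lock)
```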

I am not sure this is related to n8n itself, as we don't keep that many logs; logging was introduced to n8n only recently and is turned off by default.

Let me know if this works for you!

Hey Krynble,

I guess you're pointing in the right direction. I have deleted the executions, and it looks like Postgres is keeping the deleted rows around in the database. Could this be an improvement for n8n too?

But after doing some research with adminer.php and other tools, I found out that the actual data is mounted somewhere on my Digital Ocean droplet. So before I'm able to do any vacuum operation, I will need to find the actual data first, lol.
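Assuming I can run SQL against the right instance, this should at least tell me where Postgres keeps its files:

```sql
-- Sketch: show where this Postgres instance stores its data files
SHOW data_directory;
```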

I'm thinking of deleting the complete app and just creating a new one, because I only have one workflow, which I can copy easily.

Hey @prikr

If you have access to Adminer, then you can probably run the vacuum operation easily.

In a local test I ran an SQL command with the vacuum; statement and it seemed to work. Worth a try IMO.

Yeah, I tried that indeed, @krynble.

But if you look at the image: there are no tables in the database? Or am I looking in the wrong place? Sorry for my inexperience here, but I'm new to Postgres!

Hmm, that is interesting. You should be seeing the n8n tables in there, yes.

In any case, the vacuum operation is “global” so if you click the SQL Command link in the left panel you should be able to run it just fine and see if it worked.

That didn't work either; I have already tried that too. VACUUM FULL didn't work either. @krynble

OK, this is strange. If you check the other databases, are they all empty as well? Because that might indicate that you are connected to a different Postgres instance than the one n8n is using (a couple of queries to double-check this are sketched below the table list).

You should see the following tables:

credentials_entity
execution_entity
migrations
webhook_entity
workflow_entity
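To verify you are on the right instance and find where the space is going, something like this from the Adminer SQL command window should help (a sketch using standard Postgres catalogs):

```sql
-- Sketch: list all databases with their on-disk size, to see where the 96GB actually lives
SELECT datname, pg_size_pretty(pg_database_size(datname)) AS size
FROM pg_database
ORDER BY pg_database_size(datname) DESC;

-- List the tables in the database you are currently connected to
SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_schema = 'public';
```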