Hello,
I am hosting n8n on Cloudron (it uses Docker), with Postgres as the database.
My database is more than 30 GB in size. That doesn't seem normal, as the workflows I run don't store data on the machine itself but in an external database. How can I fix this?
Regards.
Solved: I cleaned up the executions history. It had 900k+ rows.
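For anyone landing here later, the manual cleanup amounts to deleting old rows from the executions table in n8n's Postgres database. This is a hedged sketch: the table name `execution_entity` and the quoted `"startedAt"` column reflect n8n's default schema, so verify them against your own instance and take a backup first.

```sql
-- Sketch: remove execution records older than 7 days.
-- Table/column names assume n8n's default Postgres schema; verify before running.
DELETE FROM execution_entity
WHERE "startedAt" < NOW() - INTERVAL '7 days';

-- Reclaim the disk space afterwards.
-- Note: VACUUM FULL takes an exclusive lock on the table while it runs.
VACUUM FULL execution_entity;
```

A plain `DELETE` alone frees space only for reuse inside Postgres; the `VACUUM FULL` step is what actually shrinks the files on disk.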
You might want to check the EXECUTIONS_DATA_PRUNE environment variable so you don't have to do this manually every time.
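To make that concrete, n8n's automatic pruning is controlled by a small set of environment variables. A minimal sketch (the values shown are examples, not defaults; check the n8n docs for your version):

```shell
# Enable automatic pruning of finished execution data.
EXECUTIONS_DATA_PRUNE=true
# Maximum age of execution data in hours before it is pruned (168 h = 7 days).
EXECUTIONS_DATA_MAX_AGE=168
```

On Cloudron these would go into the app's environment configuration rather than a local `.env` file.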
Did that right after, thanks!