Given that n8n has deprecated support for MySQL/MariaDB and will soon remove it entirely, users are now forced to migrate to PostgreSQL. However, you provide no instructions or guide on how to do so. Please provide a guide, instructions, or a tool to migrate our data from MariaDB to PostgreSQL so that we can continue to use n8n.
Thanks a lot. That is planned. We deprecated it now, before publishing a guide, to make sure that no more people start using those databases and then have to migrate later.
Thanks for the clarification. Do you have a planned ETA for the guide?
(I’m just anxious and hate letting deprecated things hang around)
@netris We were in the same boat.
We ended up spinning up a new n8n Docker instance from scratch with a PostgreSQL database.
We exported all the workflows and credentials via the n8n CLI from the old instance and imported them into the new one.
Sure, we lost all saved execution logs etc., but it was the quickest thing to do.
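For reference, the export/import flow described above might look roughly like this with the n8n CLI. This is a sketch, not an official migration procedure; the backup path is a placeholder, and the flags shown (`--backup`, `--separate`, `--input`, `--output`) are the documented n8n CLI options.

```shell
# On the OLD instance: export everything to a backup folder.
# --backup implies --all --pretty --separate (one JSON file per item).
mkdir -p /tmp/n8n-backup
n8n export:workflow    --backup --output=/tmp/n8n-backup/workflows/
n8n export:credentials --backup --output=/tmp/n8n-backup/credentials/

# Copy /tmp/n8n-backup to the new instance (scp, volume mount, etc.), then
# on the NEW instance: import from the same folder layout.
n8n import:workflow    --separate --input=/tmp/n8n-backup/workflows/
n8n import:credentials --separate --input=/tmp/n8n-backup/credentials/
```

Note that credentials are exported encrypted by default, so the new instance needs the same N8N_ENCRYPTION_KEY as the old one for the imported credentials to be usable.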
Did the credentials export also export the users, or just the stored credentials?
I have some friends with workflows they created, and I need to make sure they don’t have to sign up again and then figure out how to get their workflows back.
We didn’t have additional users, I’m afraid, so I can’t answer that.
@jan Any thoughts on when this guide should be available?
@Mulen Can you elaborate on how to run two Docker instances at the same time?
I understand how to export workflows and credentials from the old instance. Then I would create a new n8n instance with PostgreSQL, but when I start it, I’m not sure how to make the UI go to the new instance instead of the old one, or how to tell them apart in Docker so I can delete the old one once I’m done with it. And the file locations in the Docker Compose file, like:
Is it OK to use the same ones as the original n8n instance?
Apologies, as Docker is my kryptonite.
I ended up asking ChatGPT for help through the process, and just transitioned my existing SQLite n8n to PostgreSQL in place. Here’s what I did:
- Created a new Docker container running PostgreSQL
- Exported all my workflows and credentials from my existing n8n via the command line, to a folder on my server
- Imported all the workflows and credentials into the PostgreSQL database
- Copied my existing docker-compose.yml to docker-compose.old
- Edited docker-compose.yml with the PostgreSQL database information, removing the SQLite settings
- Ran docker compose down
- Ran docker compose up -d
And it worked! I lost the execution history this way, but I didn’t need it, and it was simpler than creating a second n8n container and then dealing with pointing the domain at the correct instance among the two, etc.
I’m leaving my old SQLite database file on the server for a few days, and I’ll delete it once I’m certain everything is performing as it should with PostgreSQL.
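For anyone following the same route, the "edit docker-compose.yml with the PostgreSQL database information" step boils down to setting n8n’s database environment variables. This is a minimal sketch; the service name, image tag, hostname, and password are placeholders for your own setup, but the DB_* variable names are the ones n8n documents.

```yaml
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    environment:
      - DB_TYPE=postgresdb           # switch from the default SQLite
      - DB_POSTGRESDB_HOST=postgres  # hostname of your Postgres container
      - DB_POSTGRESDB_PORT=5432
      - DB_POSTGRESDB_DATABASE=n8n
      - DB_POSTGRESDB_USER=n8n
      - DB_POSTGRESDB_PASSWORD=change-me
```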
Thanks for sharing your experience transitioning n8n from SQLite to PostgreSQL. I’m relatively new to DevOps, and your steps are quite insightful. Could you provide a bit more detail on the following steps:
- Creating a new Docker container running PostgreSQL: were there any specific configurations you needed to set up?
- Exporting workflows and credentials from the existing n8n to a folder on your server: could you elaborate on the specific command or process you used?
- Importing workflows and credentials into the PostgreSQL database: what commands or tools did you use for this step?
I appreciate your help, and a bit more detail would be incredibly valuable.
Welcome to the community
To quickly go through your questions…
1. Set up Postgres however you normally would; we don’t really mind how you do it as long as n8n can connect to it.
2 & 3. You can find instructions on how to use the CLI and the commands available here: CLI commands | n8n Docs
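If you’d rather explore from the terminal than the docs page, the n8n CLI also describes its own commands and flags via the standard `--help` option (run these wherever the `n8n` binary is available, e.g. inside the container):

```shell
# Show the available flags for each export/import command.
n8n export:workflow    --help
n8n export:credentials --help
n8n import:workflow    --help
n8n import:credentials --help
```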
Thank you for the warm welcome and the guidance! I’ll explore the CLI commands in the n8n Docs to better understand how to manage the migration process.
I’m specifically looking to export and import workflows to a specific user via the CLI. Is there a command or approach you recommend for achieving this? My goal is to streamline the process and ensure the workflows are associated with the correct user during the import.
Exporting workflows from the CLI is done for all users, not select users. Oddly enough, though, I’m not sure what happens if you export all the workflows and then import them: we don’t have an option to export/import users, so it might add them all to the owner account.
I tried exporting all workflows and credentials from VPS A, then copied them to VPS B. Afterward, I moved the backup folder to the Docker environment. However, when I ran the import CLI, the result showed 0 workflows and 0 credentials imported.
I’m wondering if there’s anything specific I need to consider or modify to ensure a successful import.
That should work; it’s how I normally do it. Which command did you run for the import, and were the files in the same directory?
I’ve utilized the following commands to import workflows and credentials:
n8n import:workflow --separate --input=backups/latest/
n8n import:credentials --separate --input=backups/latest/
In your container, which folder are you in when you run that command, and did you access the container using the root user or the node user?
When I executed the command, I accessed the container as the root user and ran it from the container’s default root directory.
Is that the container’s root user or the OS root user? In the container we use the node user, so if you are changing the user, it might be worth making sure you use the node user.
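One way to make sure the import runs as the container’s node user is to pass `-u node` to `docker exec`. This is a sketch: the container name `n8n` and the backup path are placeholders for whatever your setup actually uses.

```shell
# Make sure the backup files are readable by the node user first
# (only needed if you copied them in as root).
docker exec -u root n8n chown -R node:node /home/node/backups

# Run the imports as the node user, using absolute paths so the
# current working directory inside the container doesn't matter.
docker exec -u node -it n8n \
  n8n import:workflow --separate --input=/home/node/backups/latest/
docker exec -u node -it n8n \
  n8n import:credentials --separate --input=/home/node/backups/latest/
```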