HOW TO: Migrate from SQLite to Postgres

Thought I might share this for whoever needs it, as there is a major lack of info around on how to do this without having to set everything back up from scratch (other than workflows and credentials)

This is especially relevant if you have more than 1 user, so you can keep the workflows and credentials that the users had

This is just going to be a quick guide.

  1. Follow the CLI commands to export your workflows and credentials (example commands after this list)
    CLI commands | n8n Docs

  2. Get yourself an SQLite browser like DB Browser for SQLite.
    Open your SQLite file with this.

  3. Export the following tables to CSV:
    - credentials_entity
    - folder
    - project
    - project_relation
    - shared_credentials
    - shared_workflow
    - user
    - webhook_entity
    - workflow_entity

You can also export your workflow histories etc. if you like.
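
For reference, the export commands from step 1 look roughly like this. This is just a sketch: the container name n8n and the /home/node/backup path are placeholders for whatever your setup uses, so check the linked docs for the exact flags on your version.

    # export all workflows and credentials from the running n8n container
    docker exec -it n8n n8n export:workflow --all --output=/home/node/backup/workflows.json
    docker exec -it n8n n8n export:credentials --all --output=/home/node/backup/credentials.json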

  4. Reconfigure n8n to use Postgres (Supabase is a great option here; for config details, click Connect at the top of the database dashboard, select Connection String, and choose “PSQL” as the type to get the specifics to use with n8n.)
    DB_TYPE=postgresdb
    DB_POSTGRESDB_DATABASE=postgres
    DB_POSTGRESDB_HOST=aws-0-eu-centr.supabase.com
    DB_POSTGRESDB_PORT=6543
    DB_POSTGRESDB_USER=postgres.ydw
    DB_POSTGRESDB_PASSWORD=Z7a#W

  5. Start up n8n and let it create all the database tables

  6. Import the workflows and credentials via the CLI commands (example commands after this list)

  7. Import the CSV files into the relevant tables in Postgres (a psql sketch follows the list). You need to import the project table before the folder table

  8. NOTE: You will need to delete the imported shared_workflow and shared_credentials rows (SQL for steps 8-10 is sketched after the list)

  9. Delete the default user that gets created in the user table

  10. In the n8n settings table, set userManagement.isInstanceOwnerSetUp = true

  11. Restart the Docker container and you should be able to log in with your original username/password and have all your workflows

  12. Enter your license key
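
For step 6, the import commands are the mirror of the export; again, the container name and paths are just examples:

    # import the previously exported workflows and credentials into the Postgres-backed instance
    docker exec -it n8n n8n import:workflow --input=/home/node/backup/workflows.json
    docker exec -it n8n n8n import:credentials --input=/home/node/backup/credentials.json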
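
For step 7, if you prefer psql over a GUI, \copy can load the CSVs. A rough sketch, assuming the CSVs were exported with header rows and your connection string is in $DATABASE_URL; load parent tables (e.g. project) before the ones that reference them (e.g. folder), and repeat for the rest in an order that satisfies the foreign keys:

    # "user" has to be quoted because it is a reserved word in Postgres
    psql "$DATABASE_URL" -c "\copy \"user\" FROM 'user.csv' CSV HEADER"
    psql "$DATABASE_URL" -c "\copy project FROM 'project.csv' CSV HEADER"
    psql "$DATABASE_URL" -c "\copy folder FROM 'folder.csv' CSV HEADER"
    psql "$DATABASE_URL" -c "\copy project_relation FROM 'project_relation.csv' CSV HEADER"
    # ...same pattern for credentials_entity, workflow_entity, webhook_entity,
    # shared_credentials and shared_workflow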
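
Steps 8-10 in SQL, roughly. Adjust to your setup: the NULL-email check for the default user is just what a fresh install looks like, so double-check with a SELECT before deleting anything:

    # step 8: clear the shared_workflow / shared_credentials rows created by the CLI import
    # (do this before loading those two CSVs so the CSV rows are the ones that remain)
    psql "$DATABASE_URL" -c "DELETE FROM shared_workflow;"
    psql "$DATABASE_URL" -c "DELETE FROM shared_credentials;"
    # step 9: drop the auto-created default owner (assumed to be the row without an email)
    psql "$DATABASE_URL" -c "DELETE FROM \"user\" WHERE email IS NULL;"
    # step 10: tell n8n the instance owner is already set up
    psql "$DATABASE_URL" -c "UPDATE settings SET value = 'true' WHERE key = 'userManagement.isInstanceOwnerSetUp';"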

Hope this helps someone and saves them the few hours I lost today after my SQLite database crashed.


Hi, you forgot to remove your DB credentials

@Shadoxity

Reg,
J.

Thanks for the note, those are just sample placeholders though, as I figured the Supabase user format may be a bit misleading to someone that isn’t familiar with it


Thanks for the hints.

I wasn’t able to export to csv and import easily because I kept getting weird errors with the software I was using.

I found one called DBeaver.

I simply connected to the sqlite database and the postgres and exported from one into the other and it worked.

I had to figure out order though because of the relationships and stuff.

But it seems to work.

Thanks.

Ok. This is great.

Now, how do you persist community nodes?

What do you mean? It sounds like you are not using a volume for your Docker container, so when it restarts it doesn’t keep anything you have installed, like community nodes.

So fix that up and you should be good
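
Something like this is all it takes (the volume name n8n_data and the image are just examples, adjust to your compose/run setup):

    # persist /home/node/.n8n (community nodes, config, etc.) in a named volume
    docker volume create n8n_data
    docker run -d --name n8n -p 5678:5678 -v n8n_data:/home/node/.n8n docker.n8n.io/n8nio/n8n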

Thanks for this. I’m facing the same relationship challenges you speak of. Would you be able to post the order in which tables must be migrated to work around this? Thanks!

I don’t remember as I didn’t note it down.

But I believe if you export/import a table and it gives you an error, you just go to the next table and then come back to it.

But I don’t have notes on that since it’s been so long.


I’ll be rocking this process within the next week or so and can let you know what I come up with, if you’re still wondering.

That said, if you got the order figured out already and are willing to share, I’d be elated to hear it.

Much light your way,

Auggy


Just jumping in with my own experience migrating from SQLite to Postgres. Ultimately, the easiest method I found is to spin up a new Docker stack with n8n + Postgres, then connect to both the old SQLite database and the new Postgres database using DBeaver (you’ll have to figure that part out on your own depending on whether you’re using Supabase, a Postgres Docker container, etc.; for me it was as simple as temporarily opening up the Postgres ports on my host and assigning the Postgres container ports in my docker-compose.)
With both databases open, you can use DBeaver to export directly from SQLite to Postgres. Trying to use the CLI was nothing but a headache. You can migrate everything over right in DBeaver and once you figure out the right order it’s actually relatively painless. Here’s the order that worked for me:

  1. credentials_entity
  2. workflow_entity
  3. folder
  4. user
  5. project
  6. project_relation
  7. tag_entity
  8. workflows_tags
  9. webhook_entity
  10. shared_credentials
  11. shared_workflow

You can specify the export order in DBeaver so you can theoretically just migrate everything in one go. Once the core tables are migrated, delete the default user as mentioned in the original post here, then log in to n8n and make sure everything is as expected. Then you can go back to DBeaver and migrate over whatever else you decide you want (workflow history, statistics, etc).

If you’re using data tables, you’ll need to export the data_table and data_table_column tables as well as the data tables themselves (data_table_user_xxx). I found the easiest thing to do is to log in to the new n8n instance and create a few dummy data tables, then edit those in DBeaver to match your needed table names/structure so that you can export straight to those tables. If you use DBeaver to create those tables, you’ll need to be careful with permissions, otherwise the tables could end up being unusable.
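
If you do create them in DBeaver and hit that permissions problem, one likely fix is making the database role that n8n connects as the owner of the table. Just a sketch, where n8n_user and data_table_user_abc123 are placeholders for your actual role and generated table name:

    # hand ownership of the manually created data table to the role n8n connects as
    psql "$DATABASE_URL" -c "ALTER TABLE data_table_user_abc123 OWNER TO n8n_user;"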

Don’t forget the last step mentioned above - re-enter your license key! I spent 20 minutes banging my head against the wall wondering why my folders/projects weren’t showing up only to realize I hadn’t put in my license key so the feature wasn’t activated :person_facepalming:

Hope this helps someone!