I created this workflow where I update several tables from one database to another. I’d like to know if there’s a best practice for handling this task.
Another point is regarding database backup. Can I perform a backup of the entire database, or should I continue selecting tables one by one, as I did before?
There's no real best practice for syncing tables between databases; just make sure you test your workflow to see what happens if you delete something. Would you expect that to also be deleted on the other side?
When it comes to database backups, that is also up to you and depends on the tools you are using. I know some database backup tools will do table- and row-level backups so you can restore selectively, but personally, for my install, I don't back up my database at all; I just export the workflows to Git a few times a day.
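The export-to-Git approach above can be sketched with the n8n CLI. This is a minimal sketch, not a definitive setup: the output folder, branch state, and commit message are assumptions, and it presumes the `n8n` CLI is on the PATH of the host running it.

```shell
# Hypothetical sketch of a periodic workflow backup to Git.
# --backup writes each workflow to its own file with sensible defaults;
# the ./workflow-backups path is an assumption for this example.
n8n export:workflow --backup --output=./workflow-backups/

# Commit whatever changed since the last run.
cd ./workflow-backups
git add -A
git commit -m "Scheduled n8n workflow export"
```

You could run something like this from cron (or even from an n8n schedule trigger calling an Execute Command node) a few times a day.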
Thank you for your quick response Jon.
In my case, I’m just selecting data from a specific table and performing update/insert in the same table but in another database.
My need now is to perform this update/insert at the database level, without needing to list the tables one by one, as I did in the screenshot below.
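One creative option (not an official n8n feature) is to build the table list dynamically instead of hard-coding it: query `information_schema` for every table in the source schema, then loop over the results in the workflow. A hedged sketch, assuming MySQL and a schema name of `source_db`:

```sql
-- Hypothetical sketch: list every base table in the source schema so a
-- workflow can iterate over them rather than naming each table by hand.
-- 'source_db' is an assumed schema name for this example.
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'source_db'
  AND table_type = 'BASE TABLE';
```

Each returned `table_name` could then feed a per-table select and update/insert step, so adding a table to the database automatically includes it in the sync.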
I don't think we have an easy way to do that. Maybe you could do something creative with the Execute Query option, but really what you are after is the mysqldump command. Maybe you can SSH to your database host, run the mysqldump command, then use SFTP to get the backup into n8n and do any other processing.
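The SSH-then-SFTP idea above might look something like this. It is only a sketch: the host name, user, database name, and file paths are all placeholders, and it assumes credentials are handled via SSH keys and a MySQL option file rather than on the command line.

```shell
# Hypothetical sketch: dump the whole database on the DB host over SSH.
# dbadmin, db-host, mydb, and the /tmp path are placeholder values.
# --single-transaction gives a consistent dump of InnoDB tables without locking;
# --routines/--triggers include stored routines and triggers in the dump.
ssh dbadmin@db-host \
  "mysqldump --single-transaction --routines --triggers mydb > /tmp/mydb.sql"

# Fetch the dump so n8n (or anything else) can process or archive it.
sftp dbadmin@db-host:/tmp/mydb.sql ./backups/mydb.sql
```

In n8n itself, the SSH node could run the dump command and an SFTP credential on the FTP node could retrieve the file, so the whole thing stays inside one workflow.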