If you’re using a SQL database for your n8n instances, you can connect to both of them and use the Postgres node to build a workflow that reads a data table from one database and writes it to the other.
You can also use the Data Table node to create any missing columns beforehand.
Alternatively, if you’re comfortable with advanced SQL, a single query inside the Postgres node should work as well.
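To make the idea concrete, here’s a rough sketch of that copy step using node-postgres (`pg`). This is a hypothetical illustration, not n8n’s own internals: the table name, column list, and connection URLs are all placeholders you’d swap for your actual setup.

```javascript
// Sketch: copy rows from the Postgres database behind one n8n instance
// to the database behind another. All names here are assumptions.

// Build a parameterized multi-row INSERT for the given rows.
// Kept as a pure function so it can be tested without a live database.
function buildInsert(table, columns, rows) {
  const values = [];
  const tuples = rows.map((row) => {
    const placeholders = columns.map((col) => {
      values.push(row[col]);
      return `$${values.length}`;
    });
    return `(${placeholders.join(', ')})`;
  });
  const colList = columns.map((c) => `"${c}"`).join(', ');
  return {
    text: `INSERT INTO "${table}" (${colList}) VALUES ${tuples.join(', ')}`,
    values,
  };
}

// Copy loop between the two databases (never called automatically here).
async function copyTable(sourceUrl, targetUrl, table, columns) {
  const { Client } = require('pg'); // npm install pg
  const source = new Client({ connectionString: sourceUrl });
  const target = new Client({ connectionString: targetUrl });
  await source.connect();
  await target.connect();
  try {
    const { rows } = await source.query(`SELECT * FROM "${table}"`);
    if (rows.length > 0) {
      const { text, values } = buildInsert(table, columns, rows);
      await target.query(text, values);
    }
  } finally {
    await source.end();
    await target.end();
  }
}

module.exports = { buildInsert, copyTable };
```

In a workflow you’d get the same effect with two Postgres nodes (a Select followed by an Insert); the sketch just shows what happens under the hood.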
Yeah, sort of. I’d just call it full control, since there isn’t an easier solution AFAIK.
The issue with the current “Import CSV” option is that id, createdAt, and updatedAt are reserved columns, so when importing you’ll lose the original values and be forced to rename those columns. That leaves you with a messy result with three duplicated columns if your workflow relies on them.
So IMO, because of that, I’d go with a direct database insert to get an import that matches the original export exactly.
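For illustration, a direct insert that keeps the reserved columns might look like this. The table name `my_data_table` and the extra `name` column are placeholders, and the `ON CONFLICT` clause is just an assumption to make re-runs safe:

```javascript
// Hypothetical SQL for a direct insert that preserves the reserved
// columns (id, createdAt, updatedAt) instead of letting the target
// database regenerate them on import.
const sql = `
  INSERT INTO "my_data_table" ("id", "createdAt", "updatedAt", "name")
  VALUES ($1, $2, $3, $4)
  ON CONFLICT ("id") DO NOTHING
`;

// Parameter order must match the column list above.
const toParams = (row) => [row.id, row.createdAt, row.updatedAt, row.name];
```

Because the reserved columns appear explicitly in the column list, the target ends up with the original values rather than freshly generated ones.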
I agree with @mohamed3nan here. Please take note of the limitations and intended use of data tables. You’re better off using something like Supabase for data storage. Personally, I only use data tables for small amounts of temporary transactional data, never permanent records.