Inconsistency while importing workflows & credentials

Issue

While importing workflows & credentials into a new Docker container, only some of the workflows & credentials present in the export JSON get imported, not all of them. Also, re-importing into the new Docker container creates duplicates of the previously imported workflows & credentials, with different workflow IDs.

Information on the n8n setup

  • n8n version: 0.191.1
  • Database you’re using: Postgres 13
  • Running n8n via: Docker compose with workers in different containers

Hi @hansenquadros, welcome to the community!

I am very sorry to hear you’re having trouble. Any chance you can provide an example of how to reproduce this (share a workflow which is only imported partially and the export/import commands used here)?

The Fetch Test Workflow Python script contains:

Using the Python GitLab API:

f_flow = project.files.get(file_path='n8n_flows.json', ref='n8n_flow_backups')
f_auth = project.files.get(file_path='n8n_flows_auth.json', ref='n8n_flow_backups')

with open("/data/prod/n8n_flows.json", "wb") as outfile:
    outfile.write(f_flow.decode())

with open("/data/prod/n8n_flows_auth.json", "wb") as outfile:
    outfile.write(f_auth.decode())
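Not part of the original script, but a small sanity check along these lines (the function name and the assumed export shape are mine) can confirm the fetched backup is complete before writing it to disk, since `n8n export:workflow --all` produces a JSON array of workflow objects that each carry an `id`:

```python
import json

def validate_export(raw) -> int:
    """Parse an n8n workflow export and return the workflow count.

    Assumes the export is a JSON array of workflow objects that each
    have an "id" field, and fails loudly on duplicate ids.
    """
    workflows = json.loads(raw)
    ids = [wf["id"] for wf in workflows]
    if len(ids) != len(set(ids)):
        raise ValueError("duplicate workflow ids in export")
    return len(workflows)

# e.g. count = validate_export(f_flow.decode()) before writing the file
```

Comparing this count on the old and new instance would show immediately whether the import was partial.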

The Modify Test to Prod Workflow Python script contains:

with open('/data/prod/n8n_flows.json', 'r') as file:
    test_workflow = file.read()

replace_string_patterns = [
    {
        'old': '(Test Flow)',
        'new': '(Prod Flow)'
    }, …
]

for pattern in replace_string_patterns:
    test_workflow = test_workflow.replace(pattern['old'], pattern['new'])

prod_workflow_json = json.loads(test_workflow)

with open("/data/prod/n8n_flows_prod.json", "w") as outfile:
    outfile.write(test_workflow)
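Since a raw string replace over the whole file can in principle collide with JSON syntax or match text outside workflow names, a safer variant (illustrative only, not the original approach; the `name` field is an assumption about the export structure) is to rename inside the parsed structure:

```python
import json

def rename_workflows(raw: str, old: str, new: str) -> str:
    """Rename workflows field-by-field instead of doing a raw string
    replace over the whole file, so the JSON structure can never be
    corrupted by an unlucky pattern match. Illustrative sketch only."""
    workflows = json.loads(raw)
    for wf in workflows:
        wf["name"] = wf["name"].replace(old, new)
    return json.dumps(workflows)
```
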

Thanks @hansenquadros. Any chance you can share a workflow that isn’t imported as expected when using n8n import:workflow --input=/data/prod/n8n_flows.json? And can you confirm how exactly you exported it in the first place?

Export workflow

In Push to GitLab

Using the Python GitLab API:

data = {
    'branch': 'n8n_flow_backups',
    'commit_message': str(datetime.datetime.now().date()),
    'actions': [
        {
            # Binary files need to be base64 encoded
            'action': 'update',
            'file_path': 'n8n_flows.json',
            'content': base64.b64encode(open('/data/test/n8n_flows.json', 'rb').read()).decode(),
            'encoding': 'base64'
        },
        {
            # Binary files need to be base64 encoded
            'action': 'update',
            'file_path': 'n8n_flows_auth.json',
            'content': base64.b64encode(open('/data/test/n8n_flows_auth.json', 'rb').read()).decode(),
            'encoding': 'base64'
        }
    ]
}

commit = project.commits.create(data)
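One quick way to rule out transport corruption in this step (my addition; the helper names are illustrative) is to confirm the base64 content round-trips back to the exact bytes of the local export file:

```python
import base64

def b64_content(raw: bytes) -> str:
    """Encode file bytes the same way the commit payload does."""
    return base64.b64encode(raw).decode()

def roundtrip_ok(raw: bytes) -> bool:
    """True if decoding the encoded content reproduces the input bytes."""
    return base64.b64decode(b64_content(raw)) == raw

# e.g. roundtrip_ok(open('/data/test/n8n_flows.json', 'rb').read())
```
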

Hey @hansenquadros, I tried exporting the example workflow you have provided using n8n export:workflow --all --output=./test.json against a test instance running n8n 0.191.1. This has worked fine:

I have then deleted this workflow from my instance, before re-importing it again which has also worked:


I can access the imported workflow in n8n as expected and it doesn’t seem to be corrupted in any way:

So I am afraid I can’t reproduce the problem based on your description.

Hey @MutedJam, I got the issue solved by moving from a Postgres DB to a MySQL DB. I suspect the issue was caused by Postgres’s strict auto-increment handling on workflow import, which was changing workflow IDs.


Glad to hear you solved it, thanks so much for confirming! As mentioned, I couldn’t reproduce this myself, but I’ll add it to our internal ideas list for a closer look by the product and engineering teams going forward.