Inconsistency while importing workflows & credentials

Issue

While importing workflows & credentials into a new Docker container, only some of the workflows & credentials present in the exported JSON get imported, not all of them. Also, on re-importing into the new Docker container, duplicates of the previously imported workflows & credentials are created with different workflow IDs.

Information on the n8n setup

  • n8n version: 0.191.1
  • Database you’re using: PostgreSQL 13
  • Running n8n via: Docker Compose with workers in separate containers

Hi @hansenquadros, welcome to the community!

I am very sorry to hear you’re having trouble. Any chance you can provide an example of how to reproduce this (share a workflow which is only imported partially and the export/import commands used here)?

The Fetch Test Workflow Python script contains:

Using the Python GitLab API:

f_flow = project.files.get(file_path='n8n_flows.json', ref='n8n_flow_backups')
f_auth = project.files.get(file_path='n8n_flows_auth.json', ref='n8n_flow_backups')

with open("/data/prod/n8n_flows.json", "wb") as outfile:
    outfile.write(f_flow.decode())

with open("/data/prod/n8n_flows_auth.json", "wb") as outfile:
    outfile.write(f_auth.decode())
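For context on why the files are opened in binary mode: python-gitlab's `ProjectFile.decode()` returns the base64-decoded file content as bytes, not a string. The sketch below demonstrates that round trip with a minimal stand-in object (`FakeProjectFile` and `save_backup` are illustrative names, not part of the thread's scripts).

```python
import base64
import os
import tempfile

class FakeProjectFile:
    """Stand-in for python-gitlab's ProjectFile: the GitLab API returns file
    content base64-encoded, and decode() yields the raw bytes."""
    def __init__(self, raw: bytes):
        self.content = base64.b64encode(raw).decode()

    def decode(self) -> bytes:
        return base64.b64decode(self.content)

def save_backup(project_file, dest_path: str) -> None:
    # Write in binary mode: decode() returns bytes, not str.
    with open(dest_path, "wb") as outfile:
        outfile.write(project_file.decode())

# Usage: round-trip a small JSON payload through the stand-in.
f_flow = FakeProjectFile(b'[{"name": "My workflow (Test Flow)"}]')
dest = os.path.join(tempfile.gettempdir(), "n8n_flows.json")
save_backup(f_flow, dest)
```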

The Modify Test to Prod Workflow Python script contains:

with open('/data/prod/n8n_flows.json', 'r') as file:
    test_workflow = file.read()

replace_string_patterns = [
    {
        'old': '(Test Flow)',
        'new': '(Prod Flow)'
    }, …
]

for pattern in replace_string_patterns:
    test_workflow = test_workflow.replace(pattern['old'], pattern['new'])

prod_workflow_json = json.loads(test_workflow)

with open("/data/prod/n8n_flows_prod.json", "w") as outfile:
    outfile.write(test_workflow)
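One risk with the raw string replace above is that it rewrites every occurrence of the pattern anywhere in the export, including inside node parameters. A hedged alternative (not the thread author's script): parse the JSON first and rename only each workflow's `name` field, leaving IDs and nodes untouched. This assumes the export is a JSON array of workflow objects with `name` and `id` keys, as `n8n export:workflow --all` produces; `rename_workflows` is an illustrative helper name.

```python
import json

def rename_workflows(export_json: str, old: str, new: str) -> str:
    """Parse the n8n export (assumed: a JSON array of workflow objects) and
    apply the replacement only to each workflow's name, leaving ids and node
    parameters untouched."""
    workflows = json.loads(export_json)
    for wf in workflows:
        wf["name"] = wf["name"].replace(old, new)
    return json.dumps(workflows, indent=2)

# Usage with a minimal made-up export:
test_export = json.dumps([
    {"id": "42", "name": "Sync orders (Test Flow)", "nodes": []},
])
prod_export = rename_workflows(test_export, "(Test Flow)", "(Prod Flow)")
```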

Thanks @hansenquadros. Any chance you can share a workflow that isn’t imported as expected when using n8n import:workflow --input=/data/prod/n8n_flows.json? And can you confirm how exactly you exported it in the first place?

Export workflow

In Push to GitLab

Using the Python GitLab API:
data = {
    'branch': 'n8n_flow_backups',
    'commit_message': str(datetime.datetime.now().date()),
    'actions': [
        {
            # Binary files need to be base64 encoded
            'action': 'update',
            'file_path': 'n8n_flows.json',
            'content': base64.b64encode(open('/data/test/n8n_flows.json', 'rb').read()).decode(),
            'encoding': 'base64'
        },
        {
            # Binary files need to be base64 encoded
            'action': 'update',
            'file_path': 'n8n_flows_auth.json',
            'content': base64.b64encode(open('/data/test/n8n_flows_auth.json', 'rb').read()).decode(),
            'encoding': 'base64'
        }
    ]
}

commit = project.commits.create(data)
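The per-file entries in the `actions` payload above are identical apart from the paths, so they can be built with a small helper. This is a sketch, not the thread author's code; `make_update_action` is an illustrative name, and the GitLab commits API accepts the resulting `action`/`file_path`/`content`/`encoding` shape.

```python
import base64
import datetime
import os
import tempfile

def make_update_action(repo_path: str, local_path: str) -> dict:
    """Build one GitLab commit action that updates repo_path with the
    base64-encoded contents of local_path."""
    with open(local_path, "rb") as fh:
        content = base64.b64encode(fh.read()).decode()
    return {
        "action": "update",
        "file_path": repo_path,
        "content": content,
        "encoding": "base64",
    }

# Usage: assemble the commit payload from a throwaway local file.
local = os.path.join(tempfile.gettempdir(), "n8n_flows.json")
with open(local, "w") as fh:
    fh.write('[{"name": "My workflow"}]')

data = {
    "branch": "n8n_flow_backups",
    "commit_message": str(datetime.datetime.now().date()),
    "actions": [make_update_action("n8n_flows.json", local)],
}
```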

Hey @hansenquadros, I tried exporting the example workflow you provided using n8n export:workflow --all --output=./test.json against a test instance running [email protected]. This worked fine:

I then deleted this workflow from my instance before re-importing it, which also worked:


I can access the imported workflow in n8n as expected and it doesn’t seem to be corrupted in any way:

So I am afraid I can’t reproduce the problem based on your description.

Hey @MutedJam, got the issue solved by moving from the Postgres DB to a MySQL DB. I believe the issue was caused by Postgres’s strict auto-increment handling during workflow import, which was changing workflow IDs.
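For readers who want to stay on Postgres: importing rows with explicit IDs does not advance the table's ID sequence, so subsequent inserts can collide with or re-number imported workflows. A common manual workaround is resetting the sequence to the current MAX(id). The sketch below only builds the SQL statement; the `workflow_entity` table name is an assumption based on n8n's default schema of that era, so verify it against your own database before running anything.

```python
def reset_sequence_sql(table: str, id_column: str = "id") -> str:
    """Build the Postgres statement that bumps a table's id sequence to the
    current MAX(id), so future inserts don't collide with imported rows.
    pg_get_serial_sequence resolves the sequence name for us."""
    return (
        f"SELECT setval(pg_get_serial_sequence('{table}', '{id_column}'), "
        f"COALESCE((SELECT MAX({id_column}) FROM {table}), 1));"
    )

# Assumed table name for n8n's workflows; check your schema first.
sql = reset_sequence_sql("workflow_entity")
```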


Glad to hear you solved it, thanks so much for confirming! As mentioned, I couldn’t reproduce this myself, but I’ll add it to our internal ideas list for a closer look by the product and engineering teams going forward.

We are facing a similar issue; however, we cannot easily move to MySQL.
Can this issue be resolved by the core team for Postgres and the other SQL stores?

Hi @prashant, which version of n8n are you using? This PostgreSQL problem was resolved with version [email protected]: n8n/CHANGELOG.md at master · n8n-io/n8n · GitHub

If you are still facing any issues here, could you open a new thread with detailed steps on how to reproduce these?

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.