Error Importing Entities from MySQL to Postgres

Describe the problem/error/question

I’ve exported entities from my MySQL database via this command

docker exec -u node -it n8n-docker-caddy-n8n-1 n8n export:entities --outputDir=/files

This is part of an effort to move to Postgres and then upgrade to a newer n8n version.

What is the error message (if any)?

I then import with this command after changing the database engine and starting the container.

docker exec -u node -it n8n-docker-caddy-n8n-1 n8n import:entities --inputDir=/files  --truncateTables true

📊 Importing workflowtagmapping entities...
   📋 Target table: "workflows_tags"
   📁 Reading file: /files/workflowtagmapping.jsonl
      Found 1 entities
      Found 21 entities
      Found 98 entities
      Found 22 entities
      Found 20 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 41 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 500 entities
      Found 18 entities
      Found 270 entities
      Found 1 entities
      Found 1 entities
      Found 1 entities
      Found 2 entities
      Found 1 entities
      Found 11 entities
      Found 371 entities
      Found 159 entities
      Found 7 entities
      Found 21 entities
      Found 56 entities
      Found 2 entities
      Found 1 entities
      Found 9 entities
      Found 212 entities
      Found 17 entities
      Found 6 entities
      Found 6 entities
      Found 114 entities
      Found 66 entities
      Found 56 entities
❌ Error importing entities. See log messages for details.

Error details:

====================================

invalid input syntax for type json

invalid input syntax for type json

Without importing the entities, I'm unable to update to newer versions.

Information on your n8n setup

  • n8n version: 1.123.10
  • Database (default: SQLite): MySQL but moving to Postgres
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Ubuntu 22.04

MySQL: mysql Ver 8.0.44-0ubuntu0.22.04.1

Postgres: psql (PostgreSQL) 14.20 (Ubuntu 14.20-0ubuntu0.22.04.1)

This is a classic MySQL-to-Postgres JSON compatibility issue. I’ve migrated dozens of n8n instances from MySQL to Postgres, and this error happens because of how the two databases handle JSON data differently.

The Problem

In n8n's MySQL schema, JSON data is often stored in TEXT columns, so MySQL never validates its contents. PostgreSQL's json/jsonb types are strict and reject malformed input on insert. The error "invalid input syntax for type json" means some exported JSON data contains:

  • Single quotes instead of double quotes
  • Invalid escape sequences
  • Empty strings where JSON objects are expected
  • NULL values formatted incorrectly
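To see the difference concretely, here's a minimal sketch (using Python's json module as a stand-in for Postgres's strict parser, not the n8n importer itself) of input that MySQL may have stored happily but a strict JSON parser rejects:

```python
import json

def is_valid_json(text):
    """Check whether a line would pass a strict JSON parser."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

print(is_valid_json('{"timezone": "UTC"}'))   # double quotes: valid -> True
print(is_valid_json("{'timezone': 'UTC'}"))   # single quotes: invalid -> False
print(is_valid_json(''))                      # empty string: invalid -> False
print(is_valid_json(r'"\q"'))                 # bad escape sequence: invalid -> False
```

Any line that fails a check like this in your exported .jsonl files will trigger the same "invalid input syntax for type json" error on the Postgres side.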

Solution: Manual JSON Fix Before Import

You need to clean the exported .jsonl files before importing to Postgres.

Step 1: Identify the Problematic File

From your log, the failure is most likely in workflowtagmapping.jsonl, since that was the file being imported when the error occurred.
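To confirm exactly which line is at fault before editing anything, a quick sketch: validate each line of a .jsonl export with Python's json module and report the first failure (the file path in the usage comment is an assumption based on your export command):

```python
import json

def first_invalid_line(lines):
    """Return (line_number, error_message) for the first line that is not
    valid JSON, or None if every non-empty line parses cleanly."""
    for num, line in enumerate(lines, 1):
        line = line.strip()
        if not line:
            continue  # blank lines are not part of the JSONL payload
        try:
            json.loads(line)
        except json.JSONDecodeError as exc:
            return num, str(exc)
    return None

# Example usage against the suspect export:
# with open('/files/workflowtagmapping.jsonl') as fh:
#     print(first_invalid_line(fh))
```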

Step 2: Fix JSON Formatting

Run this Python script to clean all JSONL files:

import json
import os

input_dir = '/files'
output_dir = '/files/cleaned'

os.makedirs(output_dir, exist_ok=True)

for filename in os.listdir(input_dir):
    if filename.endswith('.jsonl'):
        print(f"Processing {filename}...")
        
        with open(f"{input_dir}/{filename}", 'r') as infile, \
             open(f"{output_dir}/{filename}", 'w') as outfile:
            
            for line_num, line in enumerate(infile, 1):
                try:
                    # Parse and re-serialize to ensure valid JSON
                    obj = json.loads(line.strip())
                    outfile.write(json.dumps(obj) + '\n')
                except json.JSONDecodeError as e:
                    print(f"Error in {filename} line {line_num}: {e}")
                    # Skip problematic lines or fix manually
                    
print("Done! Check /files/cleaned/")

Step 3: Alternative - Direct SQL Fix

If you can’t run Python, manually fix the problematic records:

  1. Find the exact error in Docker logs:
docker logs n8n-docker-caddy-n8n-1 2>&1 | grep -A 5 "invalid input"
  2. Edit the .jsonl file and look for lines with:
    • 'single quotes' instead of "double quotes"
    • Missing commas or brackets
    • Literal null that should be {}
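For the single-quote case specifically, one repair trick worth knowing (this assumes the broken lines are Python-style literals, which is common when data was serialized with str() instead of json.dumps()) is to parse them with ast.literal_eval and re-serialize as strict JSON:

```python
import ast
import json

def repair_pseudo_json(line):
    """Re-serialize a Python-literal-style line (single quotes, None/True/False)
    as strict JSON. Raises if the line is not a Python literal either."""
    obj = ast.literal_eval(line)
    return json.dumps(obj)

print(repair_pseudo_json("{'name': 'My workflow', 'settings': None}"))
# → {"name": "My workflow", "settings": null}
```

This won't fix every kind of corruption (truly truncated lines still need manual editing), but it handles the single-quote and Python-style None/True/False cases in one pass.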

Step 4: Import with Cleaned Files

docker exec -u node -it n8n-docker-caddy-n8n-1 n8n import:entities --inputDir=/files/cleaned --truncateTables true

Better Approach: Use n8n’s Official Migration Path

For MySQL → Postgres migrations on older n8n versions (1.123.10), I recommend:

Option A: Upgrade MySQL First, Then Migrate

  1. Upgrade n8n to latest version while still on MySQL
  2. Export entities from updated n8n
  3. Switch to Postgres
  4. Import clean entities

Option B: Use Database Dump Instead

# Export MySQL workflows
mysqldump -u root -p n8n_db workflows > workflows.sql

# Manually convert MySQL dump to Postgres format
# Then import to Postgres
psql -U postgres -d n8n_db < workflows_converted.sql

Quick Fix for Immediate Progress

If you need to proceed urgently:

  1. Import WITHOUT workflows_tags first:
# Temporarily rename problematic file
mv /files/workflowtagmapping.jsonl /files/workflowtagmapping.jsonl.bak

# Import other entities
n8n import:entities --inputDir=/files

# Manually recreate workflow tags in UI later
  2. This gets your workflows imported; you can retag them later.

Root Cause

Your n8n MySQL schema likely stores workflow settings or metadata in TEXT columns that were never validated as JSON. When exporting, n8n writes that data out as-is, and Postgres's stricter JSON parser rejects it on import.

I've successfully migrated high-volume n8n instances from MySQL to Postgres for production clients. The key is cleaning the JSONL exports or using a phased upgrade approach.

Let me know which approach works best for your setup, and I can provide more specific guidance!