Compose file for running task-runner

Describe the problem/error/question

I am trying to run two task runners, one for JS and one for Python. I managed to write this compose file, but there are some issues:

  • I am not able to run any JS Code node inside any workflow, no matter what the JS code is. It just keeps loading and loading.
  • Although I define N8N_RUNNERS_AUTH_TOKEN_FILE via my secrets, it is not being picked up, and the log shows key not found, generating a new key….
  • When I had a single external task runner, I only hit the first two errors, but once I separated JS and Python I cannot even run it; the container crashes.

I am using Ubuntu Server 24 with Portainer CE to manage the stack, and Nginx as a reverse proxy.

When I use task runner mode internal, I have absolutely no issues, which tells me my server, Docker, and Nginx are fine and the problem is in the compose file.

Please share your workflow

version: "3.9"  # Compose schema version (Swarm-compatible)

###############################################################################
# Overlay network (attachable so future services/containers can join)
###############################################################################
networks:
  ap_net:                 # Name of the overlay network for this stack
    driver: overlay         # Use Swarm overlay networking
    attachable: true        # Allow other services/containers to attach later

###############################################################################
# Persistent named volumes (Swarm will create these on the node)
###############################################################################
volumes:
  n8n_data:                 # n8n user data, binaries, credentials, etc.
  pg_data:                  # PostgreSQL database cluster (data directory)
  shared_files:             # Shared “drive” mounted into n8n at /data/share

###############################################################################
# External Docker Secrets (create these in Portainer before deploying)
# Keep real values OUT of Git. Marked external so the stack references them.
###############################################################################
secrets:
  n8n_encryption_key:       { external: true }  # Symmetric key for encrypting creds in n8n
  n8n_runners_auth_token:   { external: true }  # Shared token for broker<->runner auth
  db_database:              { external: true }  # Database name (e.g., n8n)
  db_user:                  { external: true }  # Database user (e.g., n8n)
  db_password:              { external: true }  # Database password (strong)

###############################################################################
# Services
###############################################################################
services:

  ###########################################################################
  # PostgreSQL (database)
  ###########################################################################
  postgres:
    image: postgres:18-alpine                # Lightweight Postgres
    networks: [ap_net]                     # Internal overlay net
    volumes:
      - pg_data:/var/lib/postgresql/data     # Persist DB files
    environment:                             # Use *_FILE to read secrets at runtime
      POSTGRES_DB_FILE:       /run/secrets/db_database
      POSTGRES_USER_FILE:     /run/secrets/db_user
      POSTGRES_PASSWORD_FILE: /run/secrets/db_password
    secrets:                                  # Mount only needed secrets
      - db_database
      - db_user
      - db_password
    ports:                                    # Optional: publish DB for pgAdmin
      - target: 5432                          # Container port (Postgres default)
        published: 15432                      # Host port (adjust as desired)
        protocol: tcp
        mode: host                            # Bind on this node only (no ingress mesh)
    deploy:
      restart_policy: { condition: on-failure }
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U $$(cat /run/secrets/db_user)"]
      interval: 30s
      timeout: 5s
      retries: 5
      start_period: 30s

  ###########################################################################
  # n8n (editor + API + broker for task runners)
  ###########################################################################
  n8n:
    image: n8nio/n8n:1.117.3                  # Fixed n8n version
    networks: [ap_net]                      # Internal overlay net
    volumes:
      - n8n_data:/home/node/.n8n              # Persist n8n home (workflows, creds)
      - shared_files:/data/share              # Shared “drive” for users
    ports:                                     # Publish editor/API to host for Nginx
      - target: 5678                           # Container port (n8n HTTP)
        published: 5678                        # Host port for Nginx proxy_pass
        protocol: tcp
        mode: host                             # Bind on this node (no ingress mesh)
    environment:
      # -------------------- URLs (TLS terminated by Nginx) -------------------
      N8N_HOST: n8n.example.com           # Public host used in generated links
      N8N_PORT: 5678                           # Container’s HTTP port
      N8N_PROTOCOL: https                      # Generate https:// links
      N8N_LISTEN_ADDRESS: 0.0.0.0              # Listen on all interfaces
      WEBHOOK_URL:          https://n8n.example.com/
      N8N_EDITOR_BASE_URL:  https://n8n.example.com/
      VUE_APP_URL_BASE_API: https://n8n.example.com/
      # --------------------------- Security ----------------------------------
      N8N_TRUST_PROXY: "true"
      N8N_PROXY_HOPS: "1"
      NODE_ENV: production
      TZ: UTC
      N8N_BLOCK_ENV_ACCESS_IN_NODE: "true"
      N8N_SECURE_COOKIE: "false"
      N8N_SESSION_COOKIE_SAMESITE: lax
      N8N_ENFORCE_SETTINGS_FILE_PERMISSIONS: "true"  # Enforce secure perms for config file
      N8N_DIAGNOSTICS_ENABLED: "false"
      N8N_GIT_NODE_DISABLE_BARE_REPOS: "true"
      N8N_ENCRYPTION_KEY_FILE: /run/secrets/n8n_encryption_key
      # --------------------------- Task Runners (broker) ---------------------
      N8N_RUNNERS_ENABLED: "true"              # Enable task runners [2]
      N8N_RUNNERS_MODE: external               # External runner processes [2]
      N8N_RUNNERS_BROKER_LISTEN_ADDRESS: 0.0.0.0  # Broker bind address [1][2]
      N8N_RUNNERS_BROKER_PORT: 5679            # Broker WebSocket port [1][2]
      N8N_RUNNERS_AUTH_TOKEN_FILE: /run/secrets/n8n_runners_auth_token  # Broker<->runner token [2]
      N8N_PUSH_BACKEND: websocket              # Enable realtime collaboration via WS [1]
      N8N_NATIVE_PYTHON_RUNNER: "true"
      # ----------------------------- Database --------------------------------
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: postgres
      DB_POSTGRESDB_PORT: "5432"
      DB_POSTGRESDB_DATABASE_FILE:  /run/secrets/db_database
      DB_POSTGRESDB_USER_FILE:      /run/secrets/db_user
      DB_POSTGRESDB_PASSWORD_FILE:  /run/secrets/db_password
    secrets:
      - n8n_encryption_key
      - n8n_runners_auth_token
      - db_database
      - db_user
      - db_password
    depends_on: [postgres]
    deploy:
      restart_policy: { condition: on-failure }
    healthcheck:
      test: ["CMD-SHELL", "node -e \"require('http').get('http://localhost:5678/healthz',r=>process.exit(r.statusCode===200?0:1)).on('error',()=>process.exit(1))\""]
      interval: 30s
      timeout: 10s
      retries: 5
      start_period: 30s

  ###########################################################################
  # n8n runner (JavaScript) – executes JS tasks, connects to the broker
  # This image tag has no 'task-runner' CLI; start the process script directly.
  ###########################################################################
  n8n_runner_js:
    image: n8nio/n8n:1.117.3                  # Same version for compatibility
    networks: [ap_net]
    entrypoint: ["/usr/local/bin/node", "/usr/local/lib/node_modules/n8n/dist/task-runners/task-runner-process-js.js"]  # JS runner process
    environment:
      N8N_RUNNERS_ENABLED: "true"             # Enable runner mode [2]
      N8N_RUNNERS_MODE: external              # External runner connects to broker [2]
      N8N_RUNNERS_TASK_BROKER_URI: ws://n8n:5679  # Runner connects here (WS URI) [2]
      N8N_RUNNERS_AUTH_TOKEN_FILE: /run/secrets/n8n_runners_auth_token
      N8N_LOG_LEVEL: debug
      # ----------------------------- Database (same as main) -----------------
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: postgres
      DB_POSTGRESDB_PORT: "5432"
      DB_POSTGRESDB_DATABASE_FILE: /run/secrets/db_database
      DB_POSTGRESDB_USER_FILE:     /run/secrets/db_user
      DB_POSTGRESDB_PASSWORD_FILE: /run/secrets/db_password
    secrets:
      - n8n_runners_auth_token
      - db_database
      - db_user
      - db_password
    depends_on: [n8n]
    deploy:
      restart_policy: { condition: on-failure }

  ###########################################################################
  # Optional: n8n runner (Python) – executes Python tasks
  # Enable only if you want a dedicated Python runner alongside JS.
  ###########################################################################
  n8n_runner_py:
    image: n8nio/n8n:1.117.3
    networks: [ap_net]
    entrypoint: ["/usr/local/bin/node", "/usr/local/lib/node_modules/n8n/dist/task-runners/task-runner-process-py.js"]  # Python runner process
    environment:
      N8N_RUNNERS_ENABLED: "true"
      N8N_RUNNERS_MODE: external
      N8N_RUNNERS_TASK_BROKER_URI: ws://n8n:5679
      N8N_RUNNERS_AUTH_TOKEN_FILE: /run/secrets/n8n_runners_auth_token
      N8N_LOG_LEVEL: debug
  #     # Database
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: postgres
      DB_POSTGRESDB_PORT: "5432"
      DB_POSTGRESDB_DATABASE_FILE: /run/secrets/db_database
      DB_POSTGRESDB_USER_FILE:     /run/secrets/db_user
      DB_POSTGRESDB_PASSWORD_FILE: /run/secrets/db_password
    secrets:
      - n8n_runners_auth_token
      - db_database
      - db_user
      - db_password
    depends_on: [n8n]
    deploy:
      restart_policy: { condition: on-failure }
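For reference, the five external secrets this stack expects can be created up front with the Docker CLI on a Swarm manager node (a sketch; all values below are placeholders):

```shell
# Create the external Docker secrets referenced by the stack.
# printf '%s' avoids a trailing newline sneaking into the secret value.
printf '%s' 'your-encryption-key'  | docker secret create n8n_encryption_key -
printf '%s' 'your-runner-token'    | docker secret create n8n_runners_auth_token -
printf '%s' 'n8n'                  | docker secret create db_database -
printf '%s' 'n8n'                  | docker secret create db_user -
printf '%s' 'your-strong-password' | docker secret create db_password -

docker secret ls   # verify all five exist
```

A trailing newline in a secret value can cause token-mismatch-style failures, since the broker and the runner would end up comparing different strings.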

Information on your n8n setup

  • n8n version: 1.117.3
  • Database (default: SQLite): Postgres
  • n8n EXECUTIONS_PROCESS setting (default: own, main): default
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Portainer, docker swarm
  • Operating system: Ubuntu server 24

I am not sure what else I can try. I have approached this several ways.

Where did you get this N8N_RUNNERS_AUTH_TOKEN_FILE from?!
Try using N8N_RUNNERS_AUTH_TOKEN=anything instead.
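For reference, hard-coding the token would look like this in both the n8n service and the runner service (plain variable instead of the _FILE variant; the value is a placeholder and must be identical on both sides):

```yaml
    environment:
      # Same literal token on the broker (n8n) and on every runner:
      N8N_RUNNERS_AUTH_TOKEN: "anything"
```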

I have no idea where you got that from.
The example in the docs shows only one

It is pretty much standard: appending _FILE to keep sensitive data out of the environment. It is also mentioned here that n8n supports it.
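The convention itself is simple: if `VAR_FILE` is set, the application reads the value from that file path instead of from `VAR` directly. A minimal sketch of the pattern in shell (hypothetical variable names, not n8n's actual implementation):

```shell
# Resolve a secret using the *_FILE convention common to Docker images:
# if MY_TOKEN_FILE is set and points to a file, read the value from it,
# otherwise fall back to the plain MY_TOKEN variable.
resolve_secret() {
  var_name="$1"
  file_var="${var_name}_FILE"
  # Indirect expansion: fetch the value of the variable named in file_var.
  file_path="$(eval "printf '%s' \"\${${file_var}:-}\"")"
  if [ -n "$file_path" ] && [ -f "$file_path" ]; then
    cat "$file_path"
  else
    eval "printf '%s' \"\${${var_name}:-}\""
  fi
}

# Example:
printf '%s' 's3cret' > /tmp/token_file
export MY_TOKEN_FILE=/tmp/token_file
resolve_secret MY_TOKEN   # prints: s3cret
```

One gotcha worth checking with this pattern is a trailing newline inside the secret file: the loaded value then differs from the token configured on the other side.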

I used it for other env vars and it works fine; only this one is a problem. Also, if I set the token manually like in the example, I still get an error running the JS Code node. And if I separate the runners as I described, I cannot even run the container.

Ref. Task runner environment variables | n8n Docs


Ah, nice! Good to know. I haven’t used it tbh.

Can you try removing the two services and replacing them with just one like this:

  task-runners:
    image: n8nio/runners:1.117.3
    container_name: n8n-runners
    networks: [ap_net]
    environment:
      - N8N_RUNNERS_TASK_BROKER_URI=http://n8n:5679
      - N8N_RUNNERS_AUTH_TOKEN_FILE=/run/secrets/n8n_runners_auth_token
      - N8N_RUNNERS_LAUNCHER_LOG_LEVEL=debug
    secrets:
      - n8n_runners_auth_token
    depends_on:
      - n8n
    deploy:
      restart_policy: { condition: on-failure }

Also, what do the logs say?

I did that as well; I tried to make it simple with one task runner instead of two, but the issue is the same. I can get into the n8n UI and everything is fine, except when it comes to the JS Code node: it just keeps loading until it times out. The logs from the runner are below.

2025-10-31T15:44:44.498Z | info | No encryption key found - Auto-generating and saving to: /home/node/.n8n/config {"file":"instance-settings.js","function":"loadOrCreate"}
2025-10-31T15:44:44.504Z | warn | Permissions 0644 for n8n settings file /home/node/.n8n/config are too wide. This is ignored for now, but in the future n8n will attempt to change the permissions automatically. To automatically enforce correct permissions now set N8N_ENFORCE_SETTINGS_FILE_PERMISSIONS=true (recommended), or turn this check off set N8N_ENFORCE_SETTINGS_FILE_PERMISSIONS=false. {"file":"instance-settings.js","function":"ensureSettingsFilePermissions"}
2025-10-31T15:44:44.546Z | debug | Received CLI command {"execPath":"/usr/local/bin/node","scriptPath":"/usr/local/bin/n8n","args":[],"flags":{},"file":"cli-parser.js","function":"parse"}
2025-10-31T15:44:44.552Z | info | Initializing n8n process {"file":"start.js","function":"init"}
2025-10-31T15:44:44.689Z | debug | Lazy-loading nodes and credentials from n8n-nodes-base {"nodes":484,"credentials":386,"file":"lazy-package-directory-loader.js","function":"loadAll"}
2025-10-31T15:44:44.701Z | debug | Lazy-loading nodes and credentials from @n8n/n8n-nodes-langchain {"nodes":108,"credentials":24,"file":"lazy-package-directory-loader.js","function":"loadAll"}
2025-10-31T15:44:44.914Z | info | n8n ready on ::, port 5678 {"file":"abstract-server.js","function":"init"}
2025-10-31T15:44:44.930Z | debug | Initializing AuthRolesService... {"file":"auth.roles.service.js","function":"init"}
2025-10-31T15:44:44.938Z | debug | No scopes to update. {"file":"auth.roles.service.js","function":"syncScopes"}
2025-10-31T15:44:44.939Z | debug | No obsolete scopes to delete. {"file":"auth.roles.service.js","function":"syncScopes"}
2025-10-31T15:44:44.961Z | debug | No global roles to update. {"file":"auth.roles.service.js","function":"syncRoles"}
2025-10-31T15:44:44.961Z | debug | No project roles to update. {"file":"auth.roles.service.js","function":"syncRoles"}
2025-10-31T15:44:44.961Z | debug | No credential roles to update. {"file":"auth.roles.service.js","function":"syncRoles"}
2025-10-31T15:44:44.961Z | debug | No workflow roles to update. {"file":"auth.roles.service.js","function":"syncRoles"}
2025-10-31T15:44:44.961Z | debug | AuthRolesService initialized successfully. {"file":"auth.roles.service.js","function":"init"}
2025-10-31T15:44:44.991Z | info | n8n Task Broker ready on 127.0.0.1, port 5679 {"file":"task-broker-server.js","function":"setupHttpServer"}
2025-10-31T15:44:45.034Z | warn |

{"file":"deprecation.service.js","function":"warn"}
2025-10-31T15:44:45.039Z | debug | [license SDK] initializing for deviceFingerprint 0de4334bf6f76ae0f54dd {"scopes":["license"],"file":"LicenseManager.js","function":"log"}
2025-10-31T15:44:45.046Z | info | [license SDK] Skipping renewal on init: license cert is not initialized {"scopes":["license"],"file":"LicenseManager.js","function":"log"}
2025-10-31T15:44:45.046Z | debug | License initialized {"scopes":["license"],"file":"license.js","function":"init"}
2025-10-31T15:44:45.046Z | debug | Querying database for waiting executions {"scopes":["waiting-executions"],"file":"wait-tracker.js","function":"getWaitingExecutions"}
2025-10-31T15:44:45.048Z | debug | Started tracking waiting executions {"scopes":["waiting-executions"],"file":"wait-tracker.js","function":"startTracking"}
2025-10-31T15:44:45.048Z | debug | Wait tracker init complete {"file":"start.js","function":"init"}
2025-10-31T15:44:45.048Z | debug | Loading overwrite credentials from static envvar {"file":"credentials-overwrites.js","function":"init"}
2025-10-31T15:44:45.048Z | debug | Credentials overwrites init complete {"file":"start.js","function":"init"}
2025-10-31T15:44:45.051Z | debug | Binary data service init complete {"file":"start.js","function":"init"}
2025-10-31T15:44:45.052Z | debug | Data deduplication service init complete {"file":"start.js","function":"init"}
2025-10-31T15:44:45.052Z | debug | External hooks init complete {"file":"start.js","function":"init"}
2025-10-31T15:44:45.052Z | debug | Workflow history init complete {"file":"start.js","function":"init"}
2025-10-31T15:44:45.071Z | debug | Test runner cleanup complete {"file":"start.js","function":"init"}
2025-10-31T15:44:46.012Z | debug | Started flushing timer {"scopes":["insights"],"file":"insights-collection.service.js","function":"startFlushingTimer"}
2025-10-31T15:44:46.013Z | debug | Started compaction timer {"scopes":["insights"],"file":"insights-compaction.service.js","function":"startCompactionTimer"}
2025-10-31T15:44:46.013Z | debug | Started pruning timer {"scopes":["insights"],"file":"insights-pruning.service.js","function":"startPruningTimer"}
2025-10-31T15:44:46.014Z | debug | Initialized module "insights" {"file":"module-registry.js","function":"initModules"}
2025-10-31T15:44:46.014Z | debug | Skipped init for unlicensed module "external-secrets" {"file":"module-registry.js","function":"initModules"}
2025-10-31T15:44:46.017Z | debug | Initialized module "community-packages" {"file":"module-registry.js","function":"initModules"}
2025-10-31T15:44:46.030Z | debug | Initialized module "data-table" {"file":"module-registry.js","function":"initModules"}
2025-10-31T15:44:46.030Z | debug | Skipped init for unlicensed module "provisioning" {"file":"module-registry.js","function":"initModules"}
2025-10-31T15:44:46.728Z | debug | OIDC login is disabled. {"file":"oidc.service.ee.js","function":"init"}
2025-10-31T15:44:46.754Z | debug | Registered a "reload-license" event handler on License#reload {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.754Z | debug | Registered a "relay-execution-lifecycle-event" event handler on Push#handleRelayExecutionLifecycleEvent {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.755Z | debug | Registered a "reload-overwrite-credentials" event handler on CredentialsOverwrites#reloadOverwriteCredentials {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.755Z | debug | Registered a "clear-test-webhooks" event handler on TestWebhooks#handleClearTestWebhooks {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.755Z | debug | Registered a "display-workflow-activation" event handler on ActiveWorkflowManager#handleDisplayWorkflowActivation {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.755Z | debug | Registered a "display-workflow-deactivation" event handler on ActiveWorkflowManager#handleDisplayWorkflowDeactivation {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.755Z | debug | Registered a "display-workflow-activation-error" event handler on ActiveWorkflowManager#handleDisplayWorkflowActivationError {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.755Z | debug | Registered a "add-webhooks-triggers-and-pollers" event handler on ActiveWorkflowManager#handleAddWebhooksTriggersAndPollers {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.755Z | debug | Registered a "remove-triggers-and-pollers" event handler on ActiveWorkflowManager#handleRemoveTriggersAndPollers {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.755Z | debug | Registered a "restart-event-bus" event handler on MessageEventBus#restart {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.755Z | debug | Registered a "response-to-get-worker-status" event handler on WorkerStatusService#handleWorkerStatusResponse {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.756Z | debug | Registered a "community-package-update" event handler on CommunityPackagesService#handleInstallEvent {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.756Z | debug | Registered a "community-package-install" event handler on CommunityPackagesService#handleInstallEvent {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.756Z | debug | Registered a "community-package-uninstall" event handler on CommunityPackagesService#handleUninstallEvent {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.756Z | debug | Registered a "reload-saml-config" event handler on SamlService#reload {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.756Z | debug | Registered a "reload-oidc-config" event handler on OidcService#reload {"scopes":["pubsub"],"file":"pubsub.registry.js","function":"init"}
2025-10-31T15:44:46.788Z | debug | Initializing event bus... {"file":"message-event-bus.js","function":"initialize"}
2025-10-31T15:44:46.791Z | debug | Initializing event writer {"file":"message-event-bus.js","function":"initialize"}
2025-10-31T15:44:46.794Z | debug | Checking for unsent event messages {"file":"message-event-bus.js","function":"initialize"}
2025-10-31T15:44:46.794Z | debug | Start logging into /home/node/.n8n/n8nEventLog.log  {"file":"message-event-bus.js","function":"initialize"}
2025-10-31T15:44:46.797Z | debug | MessageEventBus initialized {"file":"message-event-bus.js","function":"initialize"}
2025-10-31T15:44:46.800Z | info | Version: 1.117.3 {"file":"abstract-server.js","function":"start"}
2025-10-31T15:44:46.800Z | debug | Server ID: main-be95ed75e30f {"file":"server.js","function":"start"}
2025-10-31T15:44:46.804Z | debug | Soft-deletion every 60 minutes {"scopes":["pruning"],"file":"executions-pruning.service.js","function":"scheduleRollingSoftDeletions"}
2025-10-31T15:44:46.804Z | debug | Hard-deletion in next 15 minutes {"scopes":["pruning"],"file":"executions-pruning.service.js","function":"scheduleNextHardDeletion"}
2025-10-31T15:44:46.804Z | debug | Started pruning timers {"scopes":["pruning"],"file":"executions-pruning.service.js","function":"startPruning"}
2025-10-31T15:44:46.813Z [Rudder] debug: in flush
2025-10-31T15:44:46.829Z [Rudder] debug: no existing flush timer, creating new one
2025-10-31T15:44:46.887Z | info |
Editor is now accessible via:
http://localhost:5678  {"file":"base-command.js","function":"log"}
2025-10-31T15:44:56.829Z [Rudder] debug: in flush
2025-10-31T15:44:56.829Z [Rudder] debug: cancelling existing flushTimer...

Not much I could get from the log.
My personal AI assistant says :slight_smile: that the logs show the "runner" container is actually starting a full n8n server, not a runner process.
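One way to verify which process a Swarm task actually started is to inspect the container's entrypoint and process list (the container ID is a placeholder):

```shell
# Show the entrypoint and arguments the container was started with:
docker inspect --format '{{.Path}} {{.Args}}' <runner-container-id>

# Show the processes running inside it:
docker exec <runner-container-id> ps aux
```

The pasted logs also contain the line `Received CLI command {"execPath":"/usr/local/bin/node","scriptPath":"/usr/local/bin/n8n", ...}`, which is consistent with the main n8n CLI having been launched rather than a runner process.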

Okay, it seems the runner service is running.

now what do the logs say in the runner service?

That was the runner.

Are you sure?!
Your logs don’t seem to be the runner logs; it’s the main n8n service.

Here are mine, for example:

I can try to run it again tomorrow or Monday, as this is on a business device. Maybe you could run the compose I sent?
Two runners, one for JS and one for Python?