In n8n I can't connect the Ollama Chat Model. What should I do?

When I select the Ollama Chat Model and test the connection, I get the following error with the “http://localhost:11434” address. What should I do? The service itself is running fine and its API works outside of n8n.
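To double-check that Ollama itself is reachable before blaming n8n, you can hit its API directly (assuming the default port 11434):

```shell
# The root endpoint replies with a short "Ollama is running" message.
curl http://localhost:11434

# /api/tags lists the locally available models as JSON.
curl http://localhost:11434/api/tags
```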

```
Couldn’t connect with this setting
```

Where did you host your n8n?

If you are using the cloud version, you cannot use a localhost address.

You might need something like ngrok to give your localhost a domain that can be reached from outside.
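As a rough sketch, exposing the local Ollama port with ngrok looks like this (requires an ngrok account and authtoken; the forwarding URL shown is just an illustrative placeholder):

```shell
# Open a public tunnel to the local Ollama port.
ngrok http 11434
# ngrok prints a public forwarding URL (e.g. https://<random-id>.ngrok-free.app);
# use that URL as the base URL in the n8n Ollama credentials instead of localhost.
```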

Thanks for the answer @darrell_tw :pray:

But I am running it locally.

Ah ok. So the error message is only Couldn’t connect with this setting?


When your n8n runs inside Docker, you can try http://host.docker.internal:11434.
When Ollama also runs inside Docker, you can use http://ollama:11434, depending on the container name.
If that doesn’t work, please share your setup (how you start n8n and Ollama).
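A quick way to see whether the n8n container can reach Ollama at all is to make the request from inside the container (assuming the container is named n8n; the n8n image ships BusyBox wget):

```shell
# Fetch the Ollama root endpoint from inside the n8n container.
# If networking is set up correctly, this prints "Ollama is running".
docker exec n8n wget -qO- http://host.docker.internal:11434
```

If this fails while curl from the host succeeds, the problem is container networking, not n8n.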


@Franz, I did that but it doesn’t work. Can this photo help?

If http://host.docker.internal:11434 does not work, you can try adding extra_hosts to your Docker Compose file.
This is usually only necessary when Docker is running on a Linux machine.

services:
  n8n:
    image: n8nio/n8n
    environment:
      - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
    container_name: n8n
    ports:
      - 5678:5678
    volumes:
      - n8n_data:/home/node/.n8n
    restart: unless-stopped
    extra_hosts:
      - "host.docker.internal:host-gateway"

volumes:
  n8n_data:

If that doesn’t work, you can also set network_mode: host in your n8n Docker Compose file and change the URL to http://localhost:11434 or http://127.0.0.1:11434.
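For reference, a minimal sketch of that host-network variant (note that with network_mode: host the ports: mapping is ignored, and this mode only works on Linux):

```
services:
  n8n:
    image: n8nio/n8n
    container_name: n8n
    network_mode: host   # n8n shares the host network, so localhost:11434 reaches Ollama
    volumes:
      - n8n_data:/home/node/.n8n
    restart: unless-stopped

volumes:
  n8n_data:
```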

My recommendation is to run both services in a single Docker Compose setup. This way, you can use http://ollama:11434.

version: "3.8"

services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
    ports:
      - "11434:11434"

  n8n:
    image: n8nio/n8n
    container_name: n8n
    environment:
      - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
    ports:
      - "5678:5678"
    volumes:
      - n8n_data:/home/node/.n8n
    restart: unless-stopped

volumes:
  ollama:
  n8n_data:

You can start individual services with docker compose up service_name, e.g. docker compose up n8n or docker compose up ollama.
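One more thing to keep in mind: a freshly started Ollama container has no models yet, so the n8n node will have nothing to list until you pull one (llama3 here is only an example model name):

```shell
# Pull a model into the running ollama service (model name is an example).
docker compose exec ollama ollama pull llama3

# Confirm the model is available.
docker compose exec ollama ollama list
```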