Ollama model (llama2) cannot be connected

Describe the problem/error/question

I’m using the Ollama Model node; I set the credentials successfully. Ollama is up and running, but the llama2 model is not accessible. I tried orca-mini as well, but that didn’t work.
Secondly, I can’t get a list of the models either.
Thank you!

What is the error message (if any)?

Error in sub-node ‘Ollama Model’
fetch failed

Issues:

  • There was a problem loading the parameter options from server: “The service refused the connection - perhaps it is offline”

ERROR:

fetch failed

Details

Time

14.11.2023 17:18:49

Cause
Data below may contain sensitive information. Proceed with caution when sharing.

{
  "cause": {
    "errno": -111,
    "code": "ECONNREFUSED",
    "syscall": "connect",
    "address": "127.0.0.1",
    "port": 11434
  }
}


Please share your workflow

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

{
  "meta": {
    "instanceId": "2bfe0ce772eaa96b1f6489371198968e38fd1e5687c6a8f3546ce8b64a77afce"
  },
  "nodes": [
    {
      "parameters": {
        "options": {}
      },
      "id": "d0522b27-2f02-4796-a4ac-0a7e123c6571",
      "name": "Ollama Model",
      "type": "@n8n/n8n-nodes-langchain.lmOllama",
      "typeVersion": 1,
      "position": [
        1340,
        540
      ],
      "credentials": {
        "ollamaApi": {
          "id": "x6TQuQB3NqJBcihn",
          "name": "Ollama account"
        }
      }
    }
  ],
  "connections": {}
}

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hey @kmkarakaya,

Welcome to the community :raised_hands:

How are you running n8n? I have seen similar issues when n8n is run in Docker, as the 127.0.0.1 address is local to the container, not the host machine. In cases like this, setting the correct IP address tends to get it working.
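
A quick way to see the difference is to test from inside the container (a sketch, assuming the container is named n8n, busybox wget is available in the image, and a host.docker.internal alias has been added with --add-host):

# inside the n8n container, 127.0.0.1 is the container itself, so this fails:
docker exec -it n8n wget -qO- http://127.0.0.1:11434/api/tags

# while the host-gateway alias (if configured) reaches Ollama on the host:
docker exec -it n8n wget -qO- http://host.docker.internal:11434/api/tags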


Hello! I’m running a local copy in Docker, as you guessed.

I tried running n8n like below but couldn’t access Ollama :frowning:

docker run -it --rm --name n8n -p 5678:5678 -v ~/.n8n:/home/node/.n8n --add-host 172.20.57.141:host-gateway docker.n8n.io/n8nio/n8n:ai-beta

and

docker run -it --rm --name n8n -p 5678:5678 -v ~/.n8n:/home/node/.n8n --add-host host.docker.internal:host-gateway docker.n8n.io/n8nio/n8n:ai-beta

How can I resolve the address issue?
Thank you in advance

Hey @kmkarakaya,

What address do you have Ollama configured to listen on?


http://localhost:11434/

Hey @kmkarakaya,

You will need to configure Ollama to listen on an address that can be connected to from inside a container. Failing that, you might be able to get away with something like 172.17.0.1, but that would depend on your configuration.
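
To confirm what the bridge gateway address actually is on your machine (a sketch; 172.17.0.1 is only the usual default):

# print the gateway of Docker's default bridge network
docker network inspect bridge -f '{{range .IPAM.Config}}{{.Gateway}}{{end}}'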

I tried these options but Ollama is still not accessible

docker run --add-host=host.docker.internal:host-gateway -it --rm --name n8n -p 5678:5678 -v ~/.n8n:/home/node/.n8n docker.n8n.io/n8nio/n8n:ai-beta

docker run --add-host=host.docker.internal:172.17.0.1 -it --rm --name n8n -p 5678:5678 -v ~/.n8n:/home/node/.n8n docker.n8n.io/n8nio/n8n:ai-beta

ERROR: Couldn’t connect with these settings Request failed with status code 403

Can anybody help me?

  • Using n8n:ai-beta on WSL2 Docker Desktop, and Ollama

I can access Ollama:
kmkarakaya@DESKTOP-AMT61DR:~$ curl -X POST http://localhost:11434/api/generate -d '{
"model": "llama2",
"prompt": "[INST] why is the sky blue? [/INST]",
"raw": true,
"stream": false
}'
{"model":"llama2","created_at":"2023-11-15T20:04:53.31877153Z","response":" The sky appears blue because of a phenomenon called Rayleigh scattering, which occurs when sunlight enters Earth’s atmosphere. Note: This answer is a simplified explanation and does not take into account all the complexities of the atmosphere and light scattering.\n\nWhen sunlight enters the atmosphere, it encounters tiny molecules…}

Hey @kmkarakaya,

If you are still using localhost or 127.0.0.1 it won’t work, as that address is local to the machine it is used on: on your desktop it refers to the desktop itself, but inside a container it refers to the container.

You will need to set the OLLAMA_HOST environment variable for Ollama to the address you want it to bind. I would start with OLLAMA_HOST=0.0.0.0, which allows it to bind to all addresses; it should then work using the IP of your desktop, or of the Ollama container if that is also in Docker.
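
For example, a minimal sketch assuming Ollama is started manually from a shell (installs managed as a service need the variable set in the service environment instead):

# bind Ollama to all interfaces instead of just loopback
export OLLAMA_HOST=0.0.0.0
ollama serve

# then check it answers on a non-loopback address, e.g. your desktop's IP
# (<your-desktop-ip> is a placeholder):
curl http://<your-desktop-ip>:11434/api/tags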

You can find more discussion about this here: Allow listening on all local interfaces · Issue #703 · jmorganca/ollama · GitHub


When both services run in Docker containers, you need to make sure they are connected to the same network so they can communicate with each other. When running both locally this typically happens automatically via the default network, called “bridge”. You can also make sure both services run on the same network by passing the corresponding option when each container is created, or by creating your own custom network.

For example,

docker network create --driver bridge n8n_network

docker run --network n8n_network ... --name n8n docker.n8n.io/n8nio/n8n:latest

docker run --network n8n_network ... -p 11434:11434 --name ollama ollama/ollama

Once they are both on the same network, you can use the container name as the host. In the above example, the container was named “ollama” and it listens on port 11434, so to set up the credentials you would use the Base URL http://ollama:11434.
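
Before configuring the credentials you can sanity-check the connection from inside the n8n container (a sketch, assuming the container names used above and that busybox wget is available in the n8n image):

# list the models Ollama exposes, reached via its container name
docker exec n8n wget -qO- http://ollama:11434/api/tags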

Alternatively, if no name was given to the container, you can look up the container’s IP address (using the network name we created in the example above, “n8n_network”):

# check all the networks
docker network ls

# check the containers connected to your network
docker network inspect n8n_network
>>>
        "Containers": {
            "0ad64eeaa1845d254a0e26c672912b1d36fe1db91b280ca352d371e5e62cf38d": {
                "Name": "ollama",
                "EndpointID": "7c7932518a0376077435be0e0708999e56ae8fc7ad3f3af990b82da27f7e9390",
                "MacAddress": "02:42:ac:12:00:03",
                "IPv4Address": "172.18.0.3/16",
                "IPv6Address": ""
            }

Note the IP address of the “ollama” container is 172.18.0.3. Hence the Base URL http://172.18.0.3:11434 should also work.