Ollama model (llama2) cannot be connected

Describe the problem/error/question

In the Ollama Model node, I set the credentials successfully. Ollama is up and running, but the llama2 model is not accessible. I tried orca-mini as well, but that didn’t work either.
Secondly, I can’t get a list of the models either.
Thank you!

What is the error message (if any)?

Error in sub-node ‘Ollama Model’
fetch failed

Issues:

  • There was a problem loading the parameter options from server: “The service refused the connection - perhaps it is offline”

ERROR:

fetch failed

Details

Time

14.11.2023 17:18:49

Cause
Data below may contain sensitive information. Proceed with caution when sharing.

{
  "cause": {
    "errno": -111,
    "code": "ECONNREFUSED",
    "syscall": "connect",
    "address": "127.0.0.1",
    "port": 11434
  }
}


Please share your workflow

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

{
  "meta": {
    "instanceId": "2bfe0ce772eaa96b1f6489371198968e38fd1e5687c6a8f3546ce8b64a77afce"
  },
  "nodes": [
    {
      "parameters": {
        "options": {}
      },
      "id": "d0522b27-2f02-4796-a4ac-0a7e123c6571",
      "name": "Ollama Model",
      "type": "@n8n/n8n-nodes-langchain.lmOllama",
      "typeVersion": 1,
      "position": [
        1340,
        540
      ],
      "credentials": {
        "ollamaApi": {
          "id": "x6TQuQB3NqJBcihn",
          "name": "Ollama account"
        }
      }
    }
  ],
  "connections": {}
}

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hey @kmkarakaya,

Welcome to the community :raised_hands:

How are you running n8n? I have seen similar issues when n8n is run in Docker, because the 127.0.0.1 address is local to the container rather than the host machine. In cases like this, setting the correct IP address tends to get it working.
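
For reference, a minimal sketch of the usual fix, assuming Docker Desktop (which provides the host.docker.internal alias); the exact base URL depends on your setup:

# In the n8n Ollama credential, replace the default base URL http://127.0.0.1:11434
# with an address that resolves to the host from inside the container, e.g.:
#   http://host.docker.internal:11434
# and, if needed, map the host gateway when starting the container:
docker run -it --rm --name n8n -p 5678:5678 -v ~/.n8n:/home/node/.n8n \
  --add-host host.docker.internal:host-gateway docker.n8n.io/n8nio/n8n:ai-beta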


Hello! I’m running a local copy in Docker, as you guessed.

I tried running n8n as shown below, but I couldn’t access Ollama :frowning:

docker run -it --rm --name n8n -p 5678:5678 -v ~/.n8n:/home/node/.n8n --add-host 172.20.57.141:host-gateway docker.n8n.io/n8nio/n8n:ai-beta

and

docker run -it --rm --name n8n -p 5678:5678 -v ~/.n8n:/home/node/.n8n --add-host host.docker.internal:host-gateway docker.n8n.io/n8nio/n8n:ai-beta

How can I resolve the address issue?
Thank you in advance

Hey @kmkarakaya,

What address do you have Ollama configured to listen on?


http://localhost:11434/

Hey @kmkarakaya,

You will need to configure Ollama to listen on an address that can be reached from inside the container. Failing that, you might be able to get away with something like 172.17.0.1, but that depends on your configuration.
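
As a quick check, you can see what address Ollama is currently bound to on the host; a sketch assuming a Linux host with ss available:

ss -tlnp | grep 11434
# 127.0.0.1:11434 means only the host loopback is listening, so the container cannot reach it;
# 0.0.0.0:11434 (or the bridge address, e.g. 172.17.0.1) means it should be reachable from the container.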

I tried these options, but Ollama is still not accessible:

docker run --add-host=host.docker.internal:host-gateway -it --rm --name n8n -p 5678:5678 -v ~/.n8n:/home/node/.n8n docker.n8n.io/n8nio/n8n:ai-beta

docker run --add-host=host.docker.internal:172.17.0.1 -it --rm --name n8n -p 5678:5678 -v ~/.n8n:/home/node/.n8n docker.n8n.io/n8nio/n8n:ai-beta

ERROR: Couldn’t connect with these settings: Request failed with status code 403

Can anybody help me?

  • Using n8n:ai-beta on WSL2 with Docker Desktop and Ollama

I can access Ollama:
kmkarakaya@DESKTOP-AMT61DR:~$ curl -X POST http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "[INST] why is the sky blue? [/INST]",
  "raw": true,
  "stream": false
}'
{"model":"llama2","created_at":"2023-11-15T20:04:53.31877153Z","response":" The sky appears blue because of a phenomenon called Rayleigh scattering, which occurs when sunlight enters Earth's atmosphere. Note: This answer is a simplified explanation and does not take into account all the complexities of the atmosphere and light scattering.\n\nWhen sunlight enters the atmosphere, it encounters tiny molecules…}

Hey @kmkarakaya,

If you are still using localhost or 127.0.0.1 it won’t work, because that address is local to the machine making the request: on your desktop it points to the desktop itself, but inside a container it points to the container.
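
A quick way to see the difference from inside the n8n container, assuming the busybox wget that ships in the n8n image and Docker Desktop’s host.docker.internal alias:

docker exec -it n8n wget -qO- http://127.0.0.1:11434/api/tags
# refused: 127.0.0.1 here is the container's own loopback, where nothing is listening
docker exec -it n8n wget -qO- http://host.docker.internal:11434/api/tags
# should return your model list once Ollama is reachable from the container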

You will need to set the OLLAMA_HOST environment variable for Ollama to the address you want it to listen on. I would start with OLLAMA_HOST=0.0.0.0, which lets it bind to all interfaces; it should then be reachable via the IP of your desktop, or via the container’s IP if Ollama is also running in Docker.
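
For example, a minimal sketch of what that could look like on the host (the exact steps depend on how Ollama was installed; with the systemd service you would set the variable in the service unit instead):

OLLAMA_HOST=0.0.0.0 ollama serve
# then point the n8n credential at an address the container can reach, e.g. http://host.docker.internal:11434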

You can find more discussion about this here: Allow listening on all local interfaces · Issue #703 · jmorganca/ollama · GitHub