Cannot connect to Ollama Chat Model

Hello everyone! I need your help.

I want to connect to the Ollama Chat Model, but I’m receiving the following error message: “Couldn’t connect with these settings
The service refused the connection - perhaps it is offline”, even though Ollama is running on localhost.

I would greatly appreciate if you could help me resolve this issue.

Hey @Vahe_Vahanyan hope all is good. Welcome to the community.

Please explain your setup. Specifically mention:

  • whether you are running cloud or self-hosted
  • if your setup is docker or npm
  • how Ollama is installed and running
  • how you are trying to connect to Ollama from n8n.

I am running cloud.
I am using the trial version.
I downloaded and installed Ollama from the website and added it to the Windows environment variables. Then I ran the llama3 model in cmd and tried to connect to it with the credentials using the Ollama Chat Model node. As a result I get this issue:

If you are running n8n in the cloud, you cannot use 127.0.0.1 to reach an Ollama instance that runs locally on your computer. 127.0.0.1 (or localhost) means “on this machine”, so your n8n cloud instance believes that Ollama is running on n8n cloud as well (in the same instance as n8n), which is probably not the case.
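To see why this fails, here is a small sketch of what the cloud instance effectively does when given a 127.0.0.1 address. It attempts a TCP connection to Ollama’s default port (11434); run on any machine that is not itself hosting Ollama, the connection is refused, which is exactly the error the node reports. (The helper name `reachable` is just for illustration.)

```python
import socket

def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused / timed out: nothing is listening
        # on that port *on this machine*.
        return False

# 127.0.0.1 always means "this machine". When n8n cloud dials
# 127.0.0.1:11434, it asks its own server for Ollama, not your PC.
print(reachable("127.0.0.1", 11434))
```

This prints `True` only on a machine where Ollama itself is running, which is why the same credentials work for a local n8n install but not for n8n cloud.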

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.