I want to connect to the Ollama Chat Model, but I am receiving the following error message: “Couldn’t connect with these settings
The service refused the connection - perhaps it is offline”, even though Ollama is running on localhost.
I would greatly appreciate it if you could help me resolve this issue.
I am running n8n Cloud.
I am using the trial version.
I downloaded and installed Ollama from the website and added it to the Windows environment variables. Then I ran the llama3 model in cmd and tried to connect to it from the Ollama Chat Model node using my credentials. As a result, I get this issue:
If you are running n8n in the cloud, you cannot use 127.0.0.1 to reach Ollama running locally on your computer. 127.0.0.1 (or localhost) means “on this machine”, so your n8n Cloud instance interprets it as Ollama running on n8n Cloud itself (in the same instance as n8n), which is probably not the case.
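One common workaround (a sketch, not an official n8n recommendation) is to expose the local Ollama instance through a tunneling tool such as ngrok, so the cloud instance can reach it over a public URL. Ollama listens on port 11434 by default; the `--host-header` flag is needed because Ollama rejects requests whose Host header it does not recognize.

```shell
# Assumes ngrok is installed and authenticated; Ollama is on its default port 11434.
# Rewrite the Host header so Ollama accepts the forwarded requests.
ngrok http 11434 --host-header="localhost:11434"

# ngrok prints a forwarding URL such as https://<random-id>.ngrok-free.app
# Use that URL (not 127.0.0.1) as the Base URL in the Ollama credentials in n8n Cloud.
```

Note that the free ngrok URL changes on every restart, so the credentials would need updating each time; a self-hosted n8n on the same machine as Ollama avoids the tunnel entirely.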