I am trying to integrate n8n with the Ollama model installed locally on my PC. However, I am unable to establish a connection. I have tried configuring the Base URL with the following values:
http://localhost:11434/
http://127.0.0.1:11434/
Despite multiple attempts, the connection fails, displaying an ECONNREFUSED error.
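A quick sanity check from the same machine can rule Ollama itself out before blaming n8n. This is a sketch assuming Ollama's default port 11434 and its `/api/tags` REST endpoint (which lists installed models); adjust if you've changed `OLLAMA_HOST`:

```shell
# Sanity check from the machine running Ollama:
# /api/tags lists installed models; a JSON reply means the server
# is actually listening on the default port 11434.
if curl -sf http://127.0.0.1:11434/api/tags > /dev/null; then
  echo "Ollama is reachable"
else
  # curl's "failed to connect" here is the shell-side
  # equivalent of n8n's ECONNREFUSED
  echo "connection refused -- Ollama is not listening on 11434"
fi
```

If this succeeds locally but n8n still reports ECONNREFUSED, the problem is the network path between n8n and Ollama, not Ollama itself.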
I have searched the documentation, YouTube tutorials, and online resources, but none of them mention encountering this issue. Most guides simply use one of the above URLs, and Ollama connects successfully.
I would appreciate any guidance on resolving this issue.
What is the error message (if any)?
Couldn't connect with these settings: ECONNREFUSED
I have the same issue. @Sahil_Kayastha did you manage to find a solution?
In my case I'm running Ollama locally (not in a container) and my n8n workflow is hosted in n8n cloud. Both http://localhost:11434/ and http://127.0.0.1:11434/ return a 200 status with the "Ollama is running" text. According to the documentation this should be enough, but I am also getting ECONNREFUSED.
I still get the same, and I have tried [localhost]/api/tags in an n8n HTTP Request node, which results in the same error, whereas a curl or a call from the browser is OK. This means the issue is "simply" that localhost on my machine isn't reachable from n8n cloud: the HTTP Request node runs on n8n's servers, where localhost refers to their machine, not my PC.
Setting up n8n locally resolves the issue, although it means importing or recreating your workflow locally. I'm a novice on this platform, so maybe there's an easy way to transfer workflows between the cloud and a local instance.
To set up n8n locally:
npm install -g n8n
n8n start
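On transferring workflows: in the cloud editor you can download a workflow as JSON and import that file into a local instance. A sketch of the CLI route, assuming the n8n CLI's `export:workflow`/`import:workflow` commands (verify against your installed version with `n8n export:workflow --help`):

```shell
# Hypothetical transfer between two self-hosted n8n instances.
# The guard keeps this runnable even where n8n isn't installed.
if command -v n8n > /dev/null; then
  n8n export:workflow --all --output=workflows.json   # dump all workflows to JSON
  n8n import:workflow --input=workflows.json          # load them on the target instance
else
  echo "n8n CLI not installed -- run: npm install -g n8n"
fi
```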
Allowing connections from n8n cloud to Ollama on your local machine could also work, but it means exposing Ollama to the internet and managing the resulting security issues, so I'm NOT recommending it.