Unable to connect n8n to local (no Docker!) Ollama instance (Service unavailable), although Ollama is reachable via curl

Describe the problem/error/question

I have installed n8n and Ollama directly (so without Docker). Ollama is running and I can reach it with curl / REST calls; for example, I get a response from http://localhost:11434/api/tags (I also tried 127.0.0.1).
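For reference, the /api/tags response can be inspected to see exactly which models are installed. A minimal sketch, using a sample payload in the shape Ollama's documented /api/tags endpoint returns (the real response has more fields per model):

```python
import json

# Sample payload shaped like Ollama's /api/tags response
# (assumption: illustrative model names, not the poster's actual list).
sample = json.loads('{"models": [{"name": "gemma:2b"}, {"name": "mistral:latest"}]}')

# The model name n8n asks for must appear in this list exactly,
# including the tag (e.g. "llama3.2:latest").
installed = [m["name"] for m in sample["models"]]
print(installed)
```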

I am behind a corporate proxy, but since I am running everything locally, that shouldn't be an issue..

What is the error message (if any)?

Couldn’t connect with these settings

Service unavailable - try again later or consider setting this node to retry automatically (in the node settings)

In the chain logs:
Ollama call failed with status code 404: model 'llama3.2:latest' not found - which is accurate, because that model is not installed. However, I cannot manually change the model to one that is installed. 🙁

Information on your n8n setup

  • n8n version: 1.110.1
  • Database (default: SQLite): default
  • n8n EXECUTIONS_PROCESS setting (default: own, main): npm install
  • Running n8n via (Docker, npm, n8n cloud, desktop app): npm
  • Operating system: win11

Hey there, I can recommend two approaches:

  • Try pulling the missing model with ollama pull llama3.2

If not possible:

  • Use the HTTP node to make requests to Ollama manually:
POST http://localhost:11434/api/generate
Content-Type: application/json

{
  "model": "gemma:2b",
  "prompt": "Hello, world"
}
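The request above can also be sketched with Python's stdlib, built with an explicitly empty proxy map so corporate proxy settings cannot interfere (a sketch only; the actual open() call is commented out because it needs a running Ollama):

```python
import json
import urllib.request

# An empty ProxyHandler dict forces a direct connection, ignoring
# any HTTP_PROXY / HTTPS_PROXY environment settings.
opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))

payload = json.dumps({
    "model": "gemma:2b",
    "prompt": "Hello, world",
    "stream": False,  # assumption: single JSON response instead of a stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
# resp = opener.open(req)  # uncomment when Ollama is running locally
```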

Actually.. I was able to solve the issue. The problem was npm itself, specifically its proxy handling..

I had to get rid of the proxy settings in .npmrc completely (set them to null).. 🙁
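For anyone hitting the same thing, the fix described above amounts to an .npmrc along these lines (the proxy/https-proxy keys are what the poster nulled out; the noproxy line is an assumption on my part, using npm's documented exclusion-list key, as an alternative to removing the proxy entirely):

```
proxy=null
https-proxy=null
noproxy=localhost,127.0.0.1
```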

It is really strange, though, that even requests to 127.0.0.1 were routed through the proxy..
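That behavior actually matches how proxy exclusion lists generally work: localhost and 127.0.0.1 are only bypassed if the exclusion list explicitly says so. A sketch using Python's stdlib no-proxy matching (illustrative only; this is not n8n's or npm's actual proxy code, and proxy.example.com is a hypothetical proxy):

```python
import urllib.request

# Simulated proxy configs; the "no" key holds the exclusion list.
with_exclusion = {
    "http": "http://proxy.example.com:8080",  # hypothetical corporate proxy
    "no": "localhost,127.0.0.1",
}
without_exclusion = {"http": "http://proxy.example.com:8080"}

# The host is bypassed only when it appears on the exclusion list;
# "local" addresses get no special treatment by default.
bypassed = urllib.request.proxy_bypass_environment("127.0.0.1", with_exclusion)
not_bypassed = urllib.request.proxy_bypass_environment("127.0.0.1", without_exclusion)
print(bypassed, not_bypassed)
```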

