Describe the problem/error/question
I have installed n8n and Ollama directly (without Docker). Ollama is running and I can reach it with curl / REST calls; for example, I get a response from http://localhost:11434/api/tags (I also tried 127.0.0.1).
I am behind a proxy, but since I am running this locally, it shouldn't be an issue.
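For reference, this is roughly how I verified connectivity. The NO_PROXY step is an assumption on my part: some HTTP clients route even localhost traffic through a configured proxy, so excluding localhost before starting n8n may be worth trying.

```shell
# Confirm Ollama's REST API is reachable and list installed models
curl http://localhost:11434/api/tags

# Assumption: if HTTP_PROXY/HTTPS_PROXY are set, exclude localhost
# before launching n8n (Windows cmd syntax; PowerShell: $env:NO_PROXY="...")
set NO_PROXY=localhost,127.0.0.1
```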
What is the error message (if any)?
Couldn’t connect with these settings
Service unavailable - try again later or consider setting this node to retry automatically (in the node settings)
In the chain logs:
Ollama call failed with status code 404: model 'llama3.2:latest' not found - which is true, because it is not installed. However, I cannot manually change the model in the node to one of the installed ones.
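As a workaround for the 404 itself, pulling the model the node expects should make the error go away (this doesn't explain why the model dropdown can't be changed, though). A sketch, assuming the standard Ollama CLI and REST API:

```shell
# Pull the model the node is requesting
ollama pull llama3.2

# Alternatively via the REST API (field name per current Ollama API docs)
curl http://localhost:11434/api/pull -d "{\"model\": \"llama3.2\"}"

# Verify it now appears in the installed-model list
ollama list
```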
Information on your n8n setup
- n8n version: 1.110.1
- Database (default: SQLite): default
- n8n EXECUTIONS_PROCESS setting (default: own, main): npm install
- Running n8n via (Docker, npm, n8n cloud, desktop app): npm
- Operating system: win11