Hi, I'm a novice user trying to build a workflow with an AI Agent node that calls a self-hosted instance of Ollama. n8n runs inside a Docker container, while Ollama runs directly on my Windows PC. I keep getting an “ECONNREFUSED” error when using “http://localhost:11434” as the base URL, even though I can ping that address from a shell inside the n8n container. Any suggestions on how to troubleshoot this? Thanks!
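For reference, the check I ran from the container shell looked roughly like this (the container name is just an example; use whatever `docker ps` shows for n8n):

```sh
# Open a shell in the n8n container (name is a placeholder).
docker exec -it n8n sh

# Inside the container:
ping -c 3 localhost                          # responds, but only proves the container can reach itself
wget -qO- http://localhost:11434/api/tags    # this HTTP request is what fails with ECONNREFUSED
```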
It looks like your topic is missing some important information. Could you provide the following if applicable?
- n8n version:
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app):
- Operating system:
Hey Eric, welcome to the n8n community
When you run n8n inside Docker, it gets its own network, and localhost inside the container means the container itself, not the host (where you are running Ollama).
You could try running Ollama in Docker as well, and then use http://ollama:11434 instead.
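If it helps, a rough sketch of that setup with plain docker commands (the network and container names here are just examples):

```sh
# Create a user-defined network so containers can reach each other by name.
docker network create ai-net

# Run Ollama on that network; the container name "ollama" becomes its hostname.
docker run -d --name ollama --network ai-net -p 11434:11434 -v ollama_models:/root/.ollama ollama/ollama

# Attach your existing n8n container to the same network (or recreate it with --network ai-net).
docker network connect ai-net <your-n8n-container>
```

With both containers on the same user-defined network, http://ollama:11434 should resolve from inside n8n.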
Thanks netroy! You got me one step further. The http://ollama:11434 URL wasn’t recognized, but I replaced it with the Docker IP address and it worked like a charm. For those interested: open a command prompt, run docker ps to get the container ID, run docker inspect [Container_ID], and read the “IPAddress” field.
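In command form (the container ID is a placeholder; depending on your network setup the IP may sit under NetworkSettings.Networks instead):

```sh
# List running containers and note the ID of the one you need.
docker ps

# Full inspect output; look for the "IPAddress" field.
docker inspect <Container_ID>

# Or pull just that field directly (default bridge network).
docker inspect -f "{{.NetworkSettings.IPAddress}}" <Container_ID>
```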