You get this error because n8n, running inside its own container, cannot reach the model served by Ollama: from inside the n8n container, localhost points at the container itself, not at the host or at the Ollama container. Putting both containers on the same Docker network fixes this.
Solution 1:
1. Create a network:
docker network create n8n-network
2. Run Ollama on this network:
docker run -d --name ollama --network n8n-network -p 11434:11434 ollama/ollama
3. Run n8n on the same network:
docker run -it --rm --name n8n --network n8n-network -p 5678:5678 -v n8n_data:/home/node/.n8n docker.n8n.io/n8nio/n8n
Then, in the n8n Ollama credentials, use http://ollama:11434 as the base URL: on a shared Docker network, the container name ollama resolves as a hostname.
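The essential change on the n8n side is only the hostname in the base URL: localhost must become the Ollama container name. A small illustrative sketch of that rewrite (the helper function is mine, not part of n8n or Ollama):

```python
from urllib.parse import urlsplit, urlunsplit

def container_base_url(url: str, service_host: str = "ollama") -> str:
    """Rewrite a localhost base URL to point at a Docker service name.

    From inside the n8n container, localhost is n8n itself, so the
    Ollama endpoint must be addressed by its container/service name.
    """
    parts = urlsplit(url)
    if parts.hostname in ("localhost", "127.0.0.1"):
        netloc = service_host if parts.port is None else f"{service_host}:{parts.port}"
        return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))
    return url

print(container_base_url("http://localhost:11434"))  # → http://ollama:11434
```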
Solution 2:
You can also use a Docker Compose file (docker-compose.yml) to do the same thing:
version: '3.8'

services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama

  n8n:
    image: docker.n8n.io/n8nio/n8n
    container_name: n8n
    ports:
      - "5678:5678"
    volumes:
      - n8n_data:/home/node/.n8n
    depends_on:
      - ollama
    environment:
      - N8N_HOST=0.0.0.0

volumes:
  ollama_data:
  n8n_data:
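After the containers are up, Ollama still serves no models until you pull one, which is likely why n8n "cannot see the model". A sketch of the startup and check steps (llama3 is just an example model name, substitute your own):

```shell
# Start both containers in the background
docker compose up -d

# Download a model inside the Ollama container (llama3 is an example name)
docker exec -it ollama ollama pull llama3

# Quick check from the host that the Ollama API lists the model
curl http://localhost:11434/api/tags
```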
And that's it!