Hi everyone,
I’m having trouble setting up Ollama as part of the n8n Self-hosted AI Starter Kit. Despite following the provided instructions, Ollama does not seem to work as expected. Here’s a detailed description of the problem:
Problem Description:
1. After running the Docker containers for the starter kit, the Ollama container is up and running, but I cannot pull any models (e.g., Llama) using ollama pull llama (the exact commands and output are reproduced right after this list).
2. The error message returned is:
pulling manifest
Error: pull model manifest: file does not exist
3. Running curl -X GET http://localhost:11434/models returns a 404 page not found error.
4. ollama list inside the container shows no models available.
5. I’ve already confirmed that Docker is working properly and other containers in the starter kit (n8n, Qdrant, PostgreSQL) are functioning without issues.
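For reference, here is roughly what I ran and what came back. I'm assuming the container name ollama here (I believe that's what the starter kit's compose file uses; adjust if yours differs):

$ docker exec -it ollama ollama pull llama
pulling manifest
Error: pull model manifest: file does not exist

$ curl -X GET http://localhost:11434/models
404 page not found

$ docker exec -it ollama ollama list
(no models listed, only an empty table header)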
Steps Taken to Troubleshoot:
• Verified network connectivity from inside the Ollama container using curl (successfully reached https://ollama.com).
• Re-pulled the ollama/ollama:latest Docker image and recreated the container.
• Ran docker compose --profile cpu up to make sure the kit runs in CPU mode, since I'm on a Mac with Apple Silicon.
• Inspected the logs of the Ollama container, which show the service running but never loading any models (the commands I used are listed right after these bullets).
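In case it helps, these are roughly the commands behind the troubleshooting steps above (again assuming the container is named ollama):

$ docker exec -it ollama curl -I https://ollama.com    # network check from inside the container - this succeeds
$ docker pull ollama/ollama:latest                     # re-pull the image
$ docker compose --profile cpu down
$ docker compose --profile cpu up -d                   # recreate the stack in CPU mode
$ docker logs ollama                                   # service starts and listens, but no models are loaded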
Expected Behavior:
I expected Ollama to allow me to pull models (e.g., Llama) for local inference without requiring additional configuration.
Current Behavior:
The Ollama service is running, but no models are available and attempts to pull new ones fail.
Additional Information:
• n8n version: 1.70.4
• Database (default: SQLite): PostgreSQL (part of the starter kit)
• n8n EXECUTIONS_PROCESS setting: Default (own)
• Running n8n via: Docker (using the Self-hosted AI Starter Kit)
• Operating system: macOS Ventura, M1 processor
Questions:
• Is there additional configuration needed for Ollama to pull models in this setup?
• Are there known compatibility issues with Apple Silicon or macOS when running in CPU mode?
• Could this be a network or configuration issue specific to the Docker image?
Thanks in advance for your help! Let me know if I need to provide any additional information.