First, this is closely related to the "Error with ollama embedding model" topic, but a bit different.
Describe the problem/error/question
When I set up the Ollama embedding node, it will not "accept" the model I have loaded for embedding, but when I send the same query using the HTTP Request node, it works and returns a good reply with the vectors.
Update: for some reason the node suddenly accepted the model, so I tested it, but it is still not working.
What is the error message (if any)?
Error in sub-node 'Embeddings Ollama1'
model " nomic-embed-text:latest" not found, try pulling it first
Please share your workflow
Not much of a workflow yet. I'm building out the moving parts, like queries, before moving on to more complex stuff. It's just the isolated Ollama embedding node failing while the HTTP Request node works.
Here is the working HTTP request:
http://host.docker.internal:11434/api/embedding
Headers:
{
  "Content-Type": "application/json",
  "Accept": "*/*",
  "User-Agent": "python-requests/2.32.3"
}
Body (JSON):
{
  "model": "nomic-embed-text:latest",
  "input": "This is the text I want to embed"
}
Share the output returned by the last node
The node won't accept the model, so I cannot even run a test.
Information on your n8n setup
- n8n version: 1.70.3
- Database (default: SQLite): Postgres
- n8n EXECUTIONS_PROCESS setting (default: own, main): No idea
- Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
- Operating system: Windows 11 host, with the containers running in Ubuntu under WSL