Ollama Cloud Connection Tested Successfully but Model List Shows No Results

Hi everyone! I’m new to n8n and trying to set up a connection to Ollama Cloud. I copied my API key after signing in and created a new credential in n8n, changing the base URL to https://ollama.com. The connection test shows a “Connection tested successfully” status, but when I try to use it, the model list doesn’t load and I just get a “No results” message.

Has anyone encountered this issue before? The connection seems to authenticate properly, but I can’t access any models. Any help would be greatly appreciated!

Thanks in advance,


Hi @claw Welcome to the community!
The n8n Ollama credential expects a live Ollama API endpoint (local, or a proxy that supports the API), not https://ollama.com. Even if the HTTP check succeeds, the node fails to list models. Point the Base URL to an actual Ollama API, such as a local http://localhost:11434 or a proxy URL.

hey, welcome to n8n! The issue is that your base URL needs to be https://ollama.com/api, not just https://ollama.com. The connection test passes because the server responds, but the actual model-listing endpoint needs that /api path. Also, heads up: Ollama Cloud gives you access to their cloud-hosted models (the ones with a -cloud suffix, like qwen3-coder:480b-cloud), not any local models you might have installed. Once you fix the URL, you should see those cloud models populate in the list.
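In case it helps to see it outside n8n, here’s a minimal Python sketch of the request that populates the model list. It assumes the key is sent as a Bearer token; /api/tags is Ollama’s standard list-models endpoint, which is why the bare domain passes the credential test but leaves the dropdown empty:

```python
import urllib.request

def list_models_request(api_key: str,
                        base_url: str = "https://ollama.com") -> urllib.request.Request:
    """Build the model-listing request against Ollama Cloud.

    /api/tags is Ollama's standard list-models endpoint; the bare domain
    serves the website, so a plain connectivity check succeeds while the
    model listing returns nothing.
    """
    url = f"{base_url.rstrip('/')}/api/tags"
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {api_key}")
    return req

# Sending it (requires a real key):
# with urllib.request.urlopen(list_models_request("YOUR_API_KEY")) as resp:
#     print(resp.read().decode())  # JSON containing a "models" array
```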

yes, i use Ollama Cloud but nothing works. can you show me, please?

Are you sure?!!

I just did exactly what you explained, and it is actually working:


Note that I’m using the Message a model node from Ollama:


or with the AI Agent as the Ollama Chat Model:

However, if you’re talking about using the OpenAI node against an OpenAI-compatible API, I tested that as well: it indeed does not work with the base URL https://ollama.com, but it does work with https://ollama.com/v1:
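For reference, here’s a minimal Python sketch of that OpenAI-compatible call (the model name and Bearer auth are assumptions for illustration). The only difference from the failing setup is the /v1 suffix on the base URL, since the OpenAI-style routes live under /v1 rather than at the domain root:

```python
import json
import urllib.request

def chat_request(api_key: str, model: str, prompt: str,
                 base_url: str = "https://ollama.com/v1") -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completions request against Ollama Cloud.

    The /v1 prefix is what makes OpenAI-style clients work; the bare
    https://ollama.com base fails because /chat/completions is routed
    under /v1.
    """
    payload = {"model": model,
               "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Sending it (requires a real key):
# with urllib.request.urlopen(chat_request("YOUR_API_KEY",
#                                          "qwen3-coder:480b-cloud",
#                                          "Hello!")) as resp:
#     print(resp.read().decode())
```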

So go with whatever suits you best. I hope this solves the issue!
