Error using Ollama model node

As I’m using a remote Ollama instance that is secured behind basic auth, I’d need the Ollama model node credentials to support authentication.

I’ve tried adding the basic auth credentials directly in the URL, but this results in an error:
[screenshot of the error, 2024-02-29]

My URL is formatted like the following:
https://user:[email protected]

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

It does load the list of models, but the error in the screenshot occurs when the workflow runs.

Hello @Loan_J,

Basic auth means the client must send an Authorization header containing the base64-encoded credentials. Try using the HTTP Request node with basic auth credentials, but that way you can’t use the Ollama node :frowning:
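For reference, the header in question is just the base64 encoding of `user:password`. A minimal Python sketch (the `user`/`pass` values here are placeholders, not real credentials):

```python
import base64

def basic_auth_header(user: str, password: str) -> dict:
    # Basic auth: base64-encode "user:password" and send the result
    # in an Authorization header on every request.
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

basic_auth_header("user", "pass")
# {'Authorization': 'Basic dXNlcjpwYXNz'}
```

This is exactly the header that `https://user:pass@host` URLs are shorthand for, which is why stripping credentials from the URL breaks authentication.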

I am surprised that user:pass@host is failing, I would have expected that to work :thinking:

Indeed, that’s what I expected too.

What’s weird is that it does partially work, since it’s listing the available models. The error happens when the workflow runs.

It turns out this is a security change that was made a while ago. At the moment, the best option would be to use an HTTP Request node and make the calls manually that way.

How could I get the same behavior as this workflow using an HTTP Request node?

The issue is that my Ollama server is remote to my n8n server, and the node neither accepts basic auth in the URL nor do its credentials support authentication, which leaves me stuck with nothing.
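For plain text generation (as opposed to wiring the node in as a model sub-node), the call can be reproduced against Ollama’s `/api/generate` endpoint with the auth header added by hand. A sketch using only the Python standard library; the server URL, credentials, and model name below are placeholders you’d replace with your own:

```python
import base64
import json
import urllib.request

# Placeholder values: substitute your own server, credentials, and model.
OLLAMA_URL = "https://ollama.example.com"
USER, PASSWORD = "user", "pass"

def build_request(prompt: str, model: str = "llama2") -> urllib.request.Request:
    # Ollama's /api/generate endpoint; "stream": False returns a single
    # JSON object instead of a stream of chunks.
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    token = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload.encode(),
        headers={
            "Content-Type": "application/json",
            # The basic auth header the Ollama node won't send:
            "Authorization": f"Basic {token}",
        },
    )

def generate(prompt: str, model: str = "llama2") -> str:
    with urllib.request.urlopen(build_request(prompt, model)) as resp:
        return json.loads(resp.read())["response"]
```

In n8n terms, the equivalent is an HTTP Request node doing a POST to `/api/generate` with a JSON body and basic auth credentials.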

Hey @Loan_J,

I realised that the node is a model/embedding sub-node, so the HTTP Request node won’t be an option for that part. The only current solution would be to run Ollama locally, as it was intended, and connect n8n to it directly without needing authentication.

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.