Add Bearer to Ollama to support Ollama Turbo mode?

Ollama recently added Turbo mode, which uses their hardware to run models for a small monthly fee. I know a lot of people are having a hard time wrapping their heads around it. No need to remind me how it’s not local. My use-case is unique.
Right now I’m using a hacked-together shim to layer in the Bearer token. I’m not qualified to do such things. Would it be possible to get a switch in the Ollama auth that turns on Turbo mode and allows adding the new API key?
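For anyone in the same boat, a shim like the one described above can be sketched as a tiny local proxy that forwards requests to ollama.com and injects the `Authorization: Bearer` header. This is a minimal illustration, not the poster’s actual code; the port (11435), the `OLLAMA_API_KEY` environment variable, and the `with_bearer` helper are all assumptions for the example.

```python
# Minimal sketch of a Bearer-injecting shim for Ollama Turbo.
# Point your Ollama client at http://127.0.0.1:11435 instead of ollama.com.
# Port, env-var name, and helper names are illustrative assumptions.
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "https://ollama.com"  # assumed Turbo endpoint

def with_bearer(headers: dict, api_key: str) -> dict:
    """Return a copy of headers with the Bearer token added."""
    out = dict(headers)
    out["Authorization"] = f"Bearer {api_key}"
    return out

class ShimHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the client's request body and forward it upstream
        # with the Authorization header injected.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        req = urllib.request.Request(
            UPSTREAM + self.path,
            data=body,
            headers=with_bearer(
                {"Content-Type": "application/json"},
                os.environ.get("OLLAMA_API_KEY", ""),
            ),
        )
        with urllib.request.urlopen(req) as resp:
            self.send_response(resp.status)
            self.end_headers()
            self.wfile.write(resp.read())

# To run the shim locally:
#   HTTPServer(("127.0.0.1", 11435), ShimHandler).serve_forever()
```

A built-in switch in the credential would of course make this unnecessary.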

Thanks for reading my request.
Ash


Hey there! I work on Ollama. Sorry you hit this issue. Soon you’ll be able to skip the bearer token and use a local Ollama install to connect to Turbo models directly. Stay tuned for that.

In the meantime, adding a Bearer token to the n8n auth package for Ollama would still be helpful – in case folks aren’t able to run Ollama locally. n8n team, let me know if I can be helpful at all with that :blush:


I can confirm we need this!


Hi Jeff,
As luck would have it, my email server collided with my fat fingers, resulting in a giant loss of data. I just want to touch base and let you know that I’ve got n8n sending data to Ollama Turbo. The results are amazing. The quality and speed of output are absolutely phenomenal using the gpt-oss:120b model.

Thanks to the hard work of the Ollama and n8n team, I can finally expand the pipeline I’ve been working on.

I cannot fully express the breadth of my gratitude.

So what’s the secret to getting this to work? I’ve set up an Ollama credential in n8n that points to https://ollama.com with my Turbo API key (as per https://www.youtube.com/watch?v=0lDJOz6TA24), and it sees the models. However, I keep getting “Invalid tool usage: mismatch between tool calls and tool results” errors. Any suggestions?
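One way to rule out a credential problem before debugging the tool-call error is to hit the endpoint directly with the same key. This is a hedged sanity-check fragment, not an official recipe; it assumes the Turbo key is exported as `OLLAMA_API_KEY` and that the model name matches the one above.

```shell
# Sanity check: does the Turbo key work outside n8n at all?
# Assumes OLLAMA_API_KEY holds your Turbo API key.
curl https://ollama.com/api/chat \
  -H "Authorization: Bearer $OLLAMA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-oss:120b", "messages": [{"role": "user", "content": "hi"}], "stream": false}'
```

If this returns a normal completion, the credential itself is fine and the mismatch error is happening in the tool-calling round trip.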

Got the same issue. Any progress on this? It seems to only fail with the Turbo API, not when using a local model with tool support.

I’m running Ollama Turbo through OpenWebUI as a proxy – it seems to work that way.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.