Openrouter Model Provider Selection e.g. Cerebras

Thanks for the help in advance! I'm wondering if anyone has successfully chosen a specific model provider within the OpenRouter model node, like Qwen: Qwen3 32B from Cerebras. I know it's possible through a standard API call, but am I correct in thinking this option isn't directly available within the OpenRouter model node? I assume the n8n OpenRouter node defaults to OpenRouter routing the LLM call to the best available provider, but having the ability to select a specific provider would be great. Thanks again!
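For reference, here is a sketch of the request body a standard API call would use to pin the provider. The `provider` routing block (with `order` and `allow_fallbacks`) follows OpenRouter's provider-routing documentation; the model slug and provider name are illustrative assumptions:

```python
import json

def build_payload(model: str, provider: str, prompt: str) -> dict:
    """Build an OpenRouter chat-completions body pinned to one provider."""
    return {
        "model": model,  # e.g. "qwen/qwen3-32b" (assumed slug)
        "messages": [{"role": "user", "content": prompt}],
        "provider": {
            "order": [provider],       # try this provider first
            "allow_fallbacks": False,  # fail instead of re-routing elsewhere
        },
    }

payload = build_payload("qwen/qwen3-32b", "Cerebras", "Hello!")
print(json.dumps(payload, indent=2))
```

The same JSON body could be pasted into an HTTP Request node pointed at OpenRouter's chat-completions endpoint, which is what the node UI currently doesn't expose.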

It already lets you select the model, though.
Which n8n version are you using?

Edit: Ah I see, the provider

So yeah, you can't change the provider with the OpenRouter node right now.
You can use an "advanced" workaround instead, something like:
Chat Trigger → Load Memory → HTTP Request → Save Memory → repeat

I didn't know this method — is it a different node, or how does it work?

Edit: I understood what you meant, but it stops being useful when it comes to building AI Agents with tool calling. I guess we'll have to settle for plain AI chat with OpenRouter's specific providers for now.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.