Feature Request: Add a thinking level option (i.e. low/high) for the Gemini 3 Pro model in the OpenRouter Chat Model node

LLM

The idea is:

Add a thinking level option (i.e. low/high) to the OpenRouter Chat Model node for reasoning-capable models like Gemini 3 Pro

My use case:

Running various reasoning tasks with the Gemini 3 Pro model via OpenRouter

I think it would be beneficial to add this because:

It would make it easier to set the thinking level you want. Right now the only way to do it is to call the OpenRouter API directly via the HTTP Request node.
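For reference, the HTTP Request workaround can be sketched roughly like this. The `reasoning.effort` field follows OpenRouter's unified reasoning parameter, and the model slug here is an assumption; both should be confirmed against the current OpenRouter API reference before relying on them:

```python
import json

def build_payload(prompt: str, effort: str = "low") -> dict:
    """Build an OpenRouter chat-completions payload with a thinking level.

    The "reasoning" object and its "effort" values are based on OpenRouter's
    documented unified reasoning parameter; verify the exact field names and
    accepted values for Gemini 3 Pro in the OpenRouter docs.
    """
    if effort not in ("low", "medium", "high"):
        raise ValueError(f"unsupported effort level: {effort}")
    return {
        "model": "google/gemini-3-pro-preview",  # assumed slug; check OpenRouter's model list
        "messages": [{"role": "user", "content": prompt}],
        "reasoning": {"effort": effort},
    }

# This JSON body would go in the HTTP Request node, POSTed to
# OpenRouter's /api/v1/chat/completions endpoint with an Authorization header.
print(json.dumps(build_payload("Outline a three-step proof.", effort="high"), indent=2))
```

A native node option could simply map a low/high dropdown onto this `reasoning.effort` field in the request body.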

Any resources to support this?

Are you willing to work on this?

Yes, this makes sense to have in the node instead of forcing people to use an HTTP Request, assuming OpenRouter actually exposes this parameter for Gemini 3 Pro.
Also, it would be great to align the OpenRouter node's options with the native OpenAI node when using GPT models via OpenRouter (search, code interpreter, file search, plus extras like prompt overrides, metadata, and third-party tools); otherwise you lose a lot of capabilities.
I’m happy to contribute, but we should first confirm the OpenRouter spec (exact field names + allowed values) and get a working example payload.

Summary

Request: add a “thinking level” option to the OpenRouter chat node, and ideally expose OpenAI-like options when using GPT via OpenRouter. I can help implement it once the OpenRouter parameter names/values and a working payload are confirmed.
