Who maintains the "OpenRouter Chat Model" node?

Hey,

I’m using the OpenRouter Chat Model node to access LLMs in n8n (see OpenRouter Chat Model node documentation | n8n Docs).

I have a question regarding the node’s parameters; who should I contact?

The issue is that there is no option to set the thinking level (i.e. low/high) for the Gemini 3 Pro model.

Please advise. Thanks!

I believe OpenRouter maintains it. However, for the Gemini models you can’t set the low/high thinking level, but you can adjust parameters such as the sampling temperature and token limit to configure the model’s creativity, logic, and responses.

You can set it if you go through the HTTP Request node and make the API call directly, but yeah, I wish I could just do it via the native OpenRouter node.
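For reference, here’s a rough sketch of the JSON body you could send from the HTTP Request node to OpenRouter’s `POST https://openrouter.ai/api/v1/chat/completions` endpoint, assuming OpenRouter’s unified `reasoning` parameter is supported for the model; the model slug and prompt are placeholders, so check the slug in OpenRouter’s model list before using it:

```json
{
  "model": "google/gemini-3-pro-preview",
  "messages": [
    { "role": "user", "content": "Your prompt here" }
  ],
  "reasoning": { "effort": "low" }
}
```

You’d also need an `Authorization: Bearer <your OpenRouter API key>` header on the request, which the HTTP Request node can supply via a credential.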

Yeah, that would be easier. I’d contact them, or suggest the feature if they have a spot for requests on their site.

Hi @Samuel100,

It’s part of the official open-source code base of n8n, and since it’s open source, anyone can contribute to it via a pull request, subject to approval from the n8n team.

Your options are:

  1. fork the n8n code base and add the missing options in the file below, or
  2. follow the documentation on contributing to the n8n code base and submit the change yourself, so everyone benefits.

If you’re not comfortable coding it yourself, then I guess you could open an issue on GitHub for this.