Use OpenRouter Models Parameter

Describe the problem/error/question

I want to use the models parameter in the OpenRouter node, but I can only find a model parameter. (Model Routing | Dynamic AI Model Selection and Fallback | OpenRouter | Documentation)

How can I use this? It is a really important parameter.

It looks like your topic is missing some important information. Could you provide the following if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

No reply? Bump!

Hey @Dimitrij_Ovtcharov, the OpenRouter node does not support the models parameter yet (as you already noticed; hopefully it will one day). As a workaround, you can call the OpenRouter API directly using n8n’s HTTP Request node:

  • Method: POST
  • URL: https://openrouter.ai/api/v1/chat/completions
  • Headers:
{
  "Authorization": "Bearer YOUR_API_KEY",
  "Content-Type": "application/json"
}
  • Body (JSON):
{
  "models": ["mistral", "gpt-4-turbo", "claude-3-opus"],
  "messages": [
    {"role": "system", "content": "[YOUR SYSTEM PROMPT"},
    {"role": "user", "content": "[YOUR USER PROMPT]"}
  ]
}
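
For reference, OpenRouter returns an OpenAI-compatible chat completions response, and (going by the linked Model Routing docs) it should try the models in the order listed, falling back to the next one if a request fails. Assuming the HTTP Request node outputs the parsed JSON, the reply will look roughly like the sketch below (field names taken from the standard OpenAI-style schema, not from your specific workflow):

{
  "model": "openai/gpt-4-turbo",
  "choices": [
    {
      "message": {"role": "assistant", "content": "..."}
    }
  ]
}

So in a following node you should be able to reference the answer with an expression like {{ $json.choices[0].message.content }}.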

Can this option be used in any way to replace the model node in an AI agent?

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.