OpenRouter + AI Agent fails when a tool is attached (even if tool is not called) – started 10.01.2026 ~12:00 MSK

Hello n8n team and community,

I’m reporting a sudden breaking issue related to OpenRouter and AI Agent, which started on 10.01.2026 around 12:00 MSK.

This issue is not related to calling a tool — the failure happens even before any tool is invoked.


:cross_mark: What is happening

  • When any tool (specifically Call n8n Workflow Tool) is attached to an AI Agent node:
    • the execution fails immediately
    • even if the tool is never called
    • the agent logic does not reach tool execution
  • When the same AI Agent is executed without the tool attached:
    • the execution completes successfully
    • with the same inputs and prompts

Error output:
{
  "errorMessage": "Bad request - please check your parameters",
  "errorDescription": "Provider returned error",
  "errorDetails": {},
  "n8nDetails": {
    "time": "10.01.2026, 15:40:43",
    "n8nVersion": "2.2.4 (Self Hosted)",
    "binaryDataMode": "filesystem"
  }
}

:microscope: Critical observations

  • The tool does not need to be called — the error occurs simply because it is attached
  • This suggests the failure happens when the tools schema is sent to the language model
  • The same behavior occurs when using the OpenAI Chat Model node with OpenRouter credentials in place of OpenAI credentials
  • With real OpenAI credentials, everything works correctly
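
One way to isolate whether the failure comes from the n8n node or from OpenRouter itself is to send the same request to OpenRouter's OpenAI-compatible endpoint directly, once without and once with a `tools` array. The sketch below is a minimal, hypothetical reproduction (the model slug, tool name, and schema are placeholders, not the actual workflow's tool): if the variant with `tools` is rejected while the plain variant succeeds, that points at OpenRouter's handling of tool metadata rather than at n8n.

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_payload(include_tools: bool) -> dict:
    """Build an OpenAI-compatible chat payload, optionally attaching a dummy tool."""
    payload = {
        "model": "openai/gpt-4o-mini",  # placeholder model slug
        "messages": [{"role": "user", "content": "Say hello."}],
    }
    if include_tools:
        # Dummy tool in the standard function-calling format. It is never
        # meant to be called -- only attached, mirroring the failure mode
        # described above.
        payload["tools"] = [{
            "type": "function",
            "function": {
                "name": "noop_tool",
                "description": "Does nothing; attached only to reproduce the error.",
                "parameters": {"type": "object", "properties": {}},
            },
        }]
    return payload

def send(payload: dict, api_key: str) -> dict:
    """POST the payload to OpenRouter and return the parsed JSON response."""
    req = urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Inspect both request bodies; per the report, only the tools variant fails.
    for with_tools in (False, True):
        print(json.dumps(build_payload(with_tools), indent=2))
        # send(build_payload(with_tools), "sk-or-...")  # uncomment with a real key
```

If the direct request with `tools` returns the same "Provider returned error" response, the issue can be reported to OpenRouter with this reproduction attached; if it succeeds, the problem is more likely in how the n8n node serializes the tool schema.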

:repeat_button: Reproducibility

  • Reproduced on multiple self-hosted servers
  • Reproduced on different n8n versions
  • Workflow and tool schema were not changed
  • The issue started suddenly on the same day across all of these setups

:red_question_mark: Conclusion / Question

This strongly suggests a recent change in OpenRouter behavior when handling requests that include tools metadata, even if tools are not invoked.

Has anyone else experienced similar issues with OpenRouter and AI Agent starting today?

I’ve experienced similar issues many times, especially when using the Structured Output Parser.

My fix is usually to use the provider node directly. So, if I want to call an OpenAI model, I simply connect the OpenAI node instead of the OpenRouter node.

I’m not sure if this is an issue with the n8n node or if the OpenRouter API doesn’t handle tools and the Structured Output Parser very well.