Unable to connect Opus 4.5 model on Azure AI Foundry to n8n AI Agent node

Hi everyone,

I’m trying to connect an Opus 4.5 model deployed on Microsoft Azure AI Foundry to the AI Agent node in n8n, but I haven’t been able to get it working.

I tried reusing the Azure OpenAI node, assuming it would be compatible, because they both require the same fields:

  • Endpoint

  • Model version

  • Deployment name

  • API key

The model works fine from other tools/SDKs with this configuration, so the Azure AI Foundry deployment itself seems correct. However, in n8n I only manage to get things working when using plain OpenAI; whenever I try to use anything that’s not “direct OpenAI” (in this case Anthropic Opus 4.5), the AI Agent node fails or doesn’t respond as expected.
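For context on why reusing the Azure OpenAI node may fail even with identical-looking fields: the two services typically expect differently shaped requests. The sketch below (not n8n code) illustrates this under the common Azure URL conventions; the resource names, deployment/model names, and API versions are placeholder assumptions, not values from my setup:

```python
# Azure OpenAI-style request: the deployment name is part of the URL path.
def azure_openai_url(resource: str, deployment: str, api_version: str) -> str:
    return (
        f"https://{resource}.openai.azure.com"
        f"/openai/deployments/{deployment}/chat/completions"
        f"?api-version={api_version}"
    )

# Azure AI Foundry / Model Inference-style request: a single /models route,
# with the model selected in the JSON body rather than the URL path.
def foundry_inference_request(resource: str, model: str, prompt: str,
                              api_version: str):
    url = (
        f"https://{resource}.services.ai.azure.com"
        f"/models/chat/completions?api-version={api_version}"
    )
    body = {
        "model": model,  # placeholder, e.g. the Foundry deployment for Opus 4.5
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, body

# Example with placeholder values:
url, body = foundry_inference_request(
    "my-foundry-resource", "claude-opus-4-5", "Hello", "2024-05-01-preview"
)
print(url)
print(body["model"])
```

If this is right, a node built around the first URL shape would send requests that the Foundry inference endpoint never sees, which would match the "fails or doesn't respond" behaviour.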

My questions are:

  • Does the AI Agent node currently only support direct OpenAI? Why?

  • Is there any “official” or recommended workaround to use Azure AI Foundry models (not OpenAI) with the AI Agent node?

  • Is there any update planned to support other Azure AI Foundry models?

If you need logs, screenshots, or my exact node configuration, I’m happy to share them.

Thanks in advance!


I’m having the exact same problem. My company has a strict AI policy: only models served through Azure are allowed. I managed to connect the Azure OpenAI models without issues, but when I tried Claude 4.5 through Azure AI Foundry, it does not work.

Any help or an official answer would be much appreciated.

There is a request that you can vote on: Is it possible to use newly added sonnet models with Azure AI Foundry using Azure OpenAI Chat Model?