How to connect an Azure AI Foundry model to the AI Agent as a Chat Model?

Hi,

I really don’t understand how to use an LLM from Azure AI Foundry as the chat model for my AI Agent in n8n…

I am trying to use gpt-oss-120b, as it is one of the cheapest and smartest models available.

Unfortunately, n8n doesn’t offer an Azure AI Foundry credential type (I really don’t get why), and gpt-oss-120b isn’t available as an Azure OpenAI model…

I tried to use a proxy to make n8n believe it was talking to an OpenAI API, but it raised “Resource temporarily blocked for unusual behavior”. I think that’s because of the proxy.
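In case it helps to see what I mean: this is roughly the kind of OpenAI-style chat completions request my proxy forwards to the Foundry endpoint. The endpoint URL, `api-version`, and `api-key` header are my guesses from the Azure docs, not verified, and the placeholders need your own resource name and key:

```python
# Sketch of an OpenAI-compatible chat completions request against an
# Azure AI Foundry deployment. Endpoint shape, api-version, and the
# "api-key" auth header are assumptions - check your deployment details.
import json
import urllib.request

ENDPOINT = (
    "https://<your-resource>.services.ai.azure.com"
    "/models/chat/completions?api-version=2024-05-01-preview"
)
API_KEY = "<your-api-key>"

def build_request(prompt: str) -> urllib.request.Request:
    # Same JSON body the OpenAI Chat Completions API expects,
    # so n8n's OpenAI node (or a proxy) could emit it unchanged.
    payload = {
        "model": "gpt-oss-120b",  # deployment/model name (assumption)
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "api-key": API_KEY,  # Foundry key auth header (assumption)
        },
        method="POST",
    )

req = build_request("Hello")
print(json.loads(req.data)["model"])
```

Sending `req` with `urllib.request.urlopen(req)` against a real deployment is where I hit the “unusual behavior” block, so I suspect the problem is the proxy hop rather than the payload itself.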

Does anyone know how to do this?

Thank you in advance for your help,

Have a nice day,

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.