Description
I am experiencing an issue when using the Azure OpenAI nodes with AI Agent in n8n behind a custom gateway / proxy endpoint.
Environment
- n8n: Docker deployment
- Using AI Agent
- Backend: Azure OpenAI
- Access pattern: Azure OpenAI is accessed via a custom gateway domain, not directly via `*.openai.azure.com`
What works
- Using the HTTP Request node, the following Azure-style endpoint works correctly:
  `POST /openai/deployments/{deployment}/chat/completions`
- The request succeeds with the same headers, model, and payload.
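For reference, the working request shape looks roughly like this (a sketch only: the gateway domain, deployment name, and `api-version` below are placeholders, not the real values):

```python
import json
import urllib.request

# Placeholders for this report, not real endpoints or keys:
GATEWAY = "https://gateway.example.com"  # custom gateway domain (hypothetical)
DEPLOYMENT = "gpt-4o"                    # Azure deployment name (hypothetical)
API_VERSION = "2024-02-15-preview"       # api-version query parameter (hypothetical)

url = (
    f"{GATEWAY}/openai/deployments/{DEPLOYMENT}"
    f"/chat/completions?api-version={API_VERSION}"
)
payload = {"messages": [{"role": "user", "content": "ping"}]}

# Build the request only; actually sending it requires a live gateway.
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"api-key": "<key>", "Content-Type": "application/json"},
    method="POST",
)
print(req.full_url)
```

This is the same Azure-style path and `api-key` header that succeed through the gateway via the HTTP Request node.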
What fails
- Using the Azure OpenAI Chat Model / Azure OpenAI node (required by AI Agent) fails with:
  `The resource you are requesting could not be found`
- This happens even though:
  - DNS resolution works
  - The same deployment and API key work via HTTP Request
  - The gateway correctly forwards traffic to Azure OpenAI
Observations
- The Azure OpenAI node requires an "Instance / Resource Name" field.
- This field appears to be treated as an Azure resource name and used internally to construct:
  `https://{instance}.openai.azure.com`
- When a custom gateway domain is provided instead, the node appears to:
  - construct an invalid endpoint, or
  - combine instance name and endpoint incorrectly
- As a result, the Azure OpenAI node does not support custom gateway domains, even though HTTP Request does.
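The observations above suggest something like the following URL construction (an assumption based on observed behavior, not the actual n8n source):

```python
# Sketch of the suspected behavior: the "Instance / Resource Name" field is
# always expanded into an *.openai.azure.com host. This is a hypothesis, not
# the real n8n implementation.

def azure_base_url(instance_name: str) -> str:
    return f"https://{instance_name}.openai.azure.com"

# With a real Azure resource name this is fine:
print(azure_base_url("my-resource"))
# -> https://my-resource.openai.azure.com

# But a custom gateway domain gets mangled into a non-existent host:
print(azure_base_url("gateway.example.com"))
# -> https://gateway.example.com.openai.azure.com
```

If something like this is happening, it would explain both the "resource not found" error and why the same gateway works through HTTP Request, where the full URL is given explicitly.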
Limitation Impact
- AI Agent does not support HTTP Request as a model provider.
- Therefore, users behind gateways/proxies cannot use Azure OpenAI with AI Agent unless:
  - the gateway fully mimics the official Azure OpenAI endpoint, or
  - they switch to OpenAI-compatible endpoints and credentials.
Expected Behavior
One of the following would resolve the issue:
- Allow the Azure OpenAI node to use a fully custom base endpoint without requiring an instance/resource name
- Support Azure OpenAI behind gateways/proxies
- Provide clearer validation or documentation stating that the Azure OpenAI node does not support custom domains
Workaround
- Expose an OpenAI-compatible endpoint (`/v1/chat/completions`) at the gateway
- Use OpenAI Credential + OpenAI Chat Model instead of the Azure OpenAI node
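The workaround relies on the OpenAI credential accepting a custom base URL, so the request shape becomes OpenAI-style rather than Azure-style (a sketch; `gateway.example.com` is again a placeholder, assuming the gateway exposes the `/v1/chat/completions` route):

```python
# OpenAI-compatible request shape used by the workaround (sketch only).
base_url = "https://gateway.example.com/v1"  # set as Base URL in the OpenAI credential
url = f"{base_url}/chat/completions"

# OpenAI-style auth uses a Bearer token instead of Azure's api-key header.
headers = {"Authorization": "Bearer <key>", "Content-Type": "application/json"}
print(url)
# -> https://gateway.example.com/v1/chat/completions
```

Note the deployment name disappears from the path; with OpenAI-compatible routing the model is selected via the request body instead.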
Summary
- HTTP Request → works
- Azure OpenAI node → fails
- Root cause appears to be a hard dependency on Azure resource name–based URL construction
- This blocks AI Agent usage in gateway/proxy architectures
Information on your n8n setup
- **n8n version:** 1.117.3
- **Database (default: SQLite):** PostgreSQL
- **n8n EXECUTIONS_PROCESS setting (default: own, main):**
- **Running n8n via (Docker, npm, n8n cloud, desktop app):** Docker
- **Operating system:**