Bad request - please check your parameters
Unsupported parameter: 'response_format'. In the Responses API, this parameter has moved to 'text.format'. Try again with the new parameter. See the API documentation for more information: https://platform.openai.com/docs/api-reference/responses/create.
When I connect my AI Agent to the Azure GPT-5.2 model, I get this error, and I think it's an n8n issue. Is anyone else experiencing this?
Hi @meellaadoo
Hard to tell with this little info, but the n8n node is configured for the old API: it sends a `response_format` parameter, while the new Responses API requires that setting to be nested under `text.format`.
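To make the difference concrete, here is a minimal sketch of the two payload shapes. The field names `response_format` and `text.format` come straight from the error message; the other keys (`model`, `input`, the deployment name, and the `json_object` format) are illustrative placeholders, not the exact body n8n sends.

```python
# Old Chat Completions-style body that the n8n node still sends
# (placeholder values; only the key names matter here):
old_body = {
    "model": "gpt-5.2",  # hypothetical deployment name
    "input": "Hello",
    "response_format": {"type": "json_object"},
}

# What the Responses API expects: the same setting nested under text.format.
new_body = {
    "model": "gpt-5.2",
    "input": "Hello",
    "text": {"format": {"type": "json_object"}},
}
```

The content of the format object is unchanged; only its location in the request body moves.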
Hi @meellaadoo,
Yep, this is happening to a lot of people right now, and it's mainly an n8n connector issue, not your Azure model.
Your screenshot tells us n8n's LangChain AI Agent node (`@n8n/n8n-nodes-langchain.agent`) is sending a request payload that contains:
"response_format": …
But the endpoint it's hitting expects the new Responses API format, where output formatting lives under:
"text": { "format": … }
So when you select Azure GPT-5.2, n8n likely switches to the newer OpenAI/Azure Responses API-style call but still sends the older parameter, which the endpoint rejects with a 400 error.
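Until the node itself is fixed, one workaround is to build the request yourself (for example from an n8n Code node feeding an HTTP Request node) and move the legacy parameter into the new location before sending. This is a sketch under that assumption, not an official n8n fix; the helper name `migrate_body` is mine:

```python
def migrate_body(body: dict) -> dict:
    """Return a copy of `body` with the legacy `response_format` key
    relocated to the Responses API location, `text.format`.

    The input dict is not mutated, so the original payload stays usable.
    """
    # Copy everything except the legacy key.
    new = {k: v for k, v in body.items() if k != "response_format"}
    if "response_format" in body:
        # Merge into any existing "text" object rather than overwriting it.
        new["text"] = {**new.get("text", {}), "format": body["response_format"]}
    return new
```

Usage: `migrate_body({"model": "gpt-5.2", "input": "Hi", "response_format": {"type": "json_object"}})` yields a body with `text.format` set and no `response_format` key, which matches what the error message asks for.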
So what’s the solution for this?