Support json_schema response format in OpenAI (Azure AI) Chat Model: three suggestions

OpenAI and Azure AI support providing a json_schema as the response format configuration.

https://platform.openai.com/docs/guides/structured-outputs

I see three options to make this happen:

  1. Update existing Chat Model nodes
  • Provide the ability to select "json_schema" as the response format in both the Azure OpenAI Chat Model and the OpenAI Chat Model.
  • When the response format is set to "json_schema", enable a text field to define the schema (see the sketch after this list).
  2. Allow full request configuration in Chat Models
  • In addition to the predefined dropdown options for configuration values, provide the ability to specify a fully custom JSON configuration, including all necessary parameters.
  3. Provide documentation on creating a custom Chat Model node
  • Offer clear guidance on how to implement a custom Chat Model node to support specific requirements.
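
For reference, this is roughly the `response_format` fragment that option 1's schema text field would need to produce. The field names follow the OpenAI Structured Outputs docs linked above; the schema contents (`support_ticket`, its properties) are a hypothetical example:

```typescript
// Sketch of the response_format fragment per the Structured Outputs docs.
// The schema itself ("support_ticket" etc.) is a made-up example.
const responseFormat = {
  type: "json_schema",
  json_schema: {
    name: "support_ticket", // hypothetical schema name
    strict: true,           // enforce exact schema conformance
    schema: {
      type: "object",
      properties: {
        category: { type: "string" },
        priority: { type: "string", enum: ["low", "medium", "high"] },
      },
      required: ["category", "priority"],
      additionalProperties: false, // required when strict is true
    },
  },
};
```

As far as I can tell, Azure OpenAI accepts the same `response_format` shape on recent API versions.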

n8n version: 1.81.4
Database (default: SQLite): SQLite
n8n EXECUTIONS_PROCESS setting (default: own, main): Default
Running n8n via (Docker, npm, n8n cloud, desktop app): Docker and n8n
Operating system: Mac

This may help as a temporary workaround:
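
For example, a minimal sketch of calling the Chat Completions endpoint directly from an HTTP Request or Code node instead of the Chat Model node. It assumes an `OPENAI_API_KEY` environment variable and uses `gpt-4o-mini` as a placeholder model; the schema is a made-up example:

```typescript
// Interim approach: bypass the Chat Model node and call the API directly.
// Assumes OPENAI_API_KEY is set; "gpt-4o-mini" is a placeholder model
// that supports Structured Outputs.
const res = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Summarize: the order arrived late." }],
    response_format: {
      type: "json_schema",
      json_schema: {
        name: "summary", // hypothetical schema
        strict: true,
        schema: {
          type: "object",
          properties: { summary: { type: "string" } },
          required: ["summary"],
          additionalProperties: false,
        },
      },
    },
  }),
});
const data = await res.json();
// The schema-conforming output arrives as a JSON string in message.content.
const parsed = JSON.parse(data.choices[0].message.content);
```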

“Extractor node” is not a solution.

Let’s examine a specific use case: Intent Classification

JSON Schema allows me to provide the LLM with a set of possible user intents and request the model to determine the user’s intent based on their query, including extracting relevant properties from the request.
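
A hedged sketch of what that could look like as a json_schema response format; the intent names and the `order_id` property are hypothetical examples:

```typescript
// Intent classification via Structured Outputs. The enum constrains the
// model to exactly one of the allowed intents; order_id is an extracted
// property that is null when the query does not contain one.
const intentFormat = {
  type: "json_schema",
  json_schema: {
    name: "intent_classification",
    strict: true,
    schema: {
      type: "object",
      properties: {
        intent: {
          type: "string",
          enum: ["check_order_status", "cancel_order", "talk_to_human"],
        },
        order_id: { type: ["string", "null"] },
      },
      required: ["intent", "order_id"],
      additionalProperties: false,
    },
  },
};
```

With strict mode, the model is constrained to return exactly one of the enumerated intents, which is what makes this preferable to post-hoc extraction.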

A structured response is not the solution; please do not suggest it as one.

There is dedicated functionality in OpenAI's services for this, which n8n does not currently support.

In the Agent node, you can click the plus to add a tool, and then use the Structured Output Parser natively via the AI Agent node.

It’s not a solution!

I want to use native functionality from the LLM.
It's highly useful and important for intent classification.

Yes, this is great functionality from OpenAI, and I really need it too.