## Describe the problem/error/question
When I have an agent node that needs a structured output, sometimes the LLM returns it inside a code block, or the formatting is not quite right. I thought the "Auto-fixing Output Parser" was meant for exactly this, but I never see its attached model activate, and in cases like this it definitely should.
As you can see in the screenshot, the output goes from a code block containing the JSON to an empty answer…
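For illustration (the field values below are invented), this is the kind of reply the agent produces: the JSON itself matches my schema, but it comes wrapped in a markdown code block, which is exactly the situation I expected the Auto-fixing Output Parser to handle:

````text
Here is the requested update:

```json
{
  "comunication_update": "Customer confirmed the new delivery date by email",
  "id": 42,
  "Status": "Open",
  "Stage": "Follow-up"
}
```
````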
The JSON schema definition is:

```json
{
  "type": "object",
  "properties": {
    "comunication_update": {
      "type": "string",
      "description": "Details of the communication update"
    },
    "id": {
      "type": "integer",
      "description": "Unique identifier for the record"
    },
    "Status": {
      "type": "string",
      "description": "Current status of the record"
    },
    "Stage": {
      "type": "string",
      "description": "Current stage of the process"
    }
  },
  "required": ["comunication_update", "id", "Status", "Stage"]
}
```
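For reference, a reply that the Structured Output Parser would accept directly is just the bare object, without the code-block wrapper (again, illustrative values only):

```json
{
  "comunication_update": "Customer confirmed the new delivery date by email",
  "id": 42,
  "Status": "Open",
  "Stage": "Follow-up"
}
```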
Why doesn't it work? Is this a bug?
## Please share your workflow
```json
{
"nodes": [
{
"parameters": {
"options": {}
},
"id": "f8855e94-f6c2-47bf-a8c9-965b21c6e110",
"name": "Auto-fixing Output Parser",
"type": "@n8n/n8n-nodes-langchain.outputParserAutofixing",
"typeVersion": 1,
"position": [
1960,
720
]
},
{
"parameters": {
"options": {}
},
"id": "bb01fc25-9993-4a01-8e4e-fcf8a5dae554",
"name": "OpenAI Chat Model1",
"type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
"typeVersion": 1,
"position": [
1920,
920
],
"credentials": {
"openAiApi": {
"id": "yv064MBJLbwsL817",
"name": "OpenAi fix output parser"
}
}
},
{
"parameters": {
"schemaType": "manual",
"inputSchema": "{\"type\": \"object\",\n \"properties\": {\n \"comunication_update\": {\n \"type\": \"string\",\n \"description\": \"Details of the communication update\"\n },\n \"id\": {\n \"type\": \"integer\",\n \"description\": \"Unique identifier for the record\"\n },\n \"Status\": {\n \"type\": \"string\",\n \"description\": \"Current status of the record\"\n },\n \"Stage\": {\n \"type\": \"string\",\n \"description\": \"Current stage of the process\"\n }\n },\n \"required\": [\"comunication_update\", \"id\", \"Status\", \"Stage\"]\n}"
},
"id": "5cb740bb-9bcc-4b9d-934d-b448b98f5f62",
"name": "Structured Output Parser",
"type": "@n8n/n8n-nodes-langchain.outputParserStructured",
"typeVersion": 1.2,
"position": [
2080,
920
]
}
],
"connections": {
"Auto-fixing Output Parser": {
"ai_outputParser": [
[]
]
},
"OpenAI Chat Model1": {
"ai_languageModel": [
[
{
"node": "Auto-fixing Output Parser",
"type": "ai_languageModel",
"index": 0
}
]
]
},
"Structured Output Parser": {
"ai_outputParser": [
[
{
"node": "Auto-fixing Output Parser",
"type": "ai_outputParser",
"index": 0
}
]
]
}
},
"pinData": {},
"meta": {
"templateCredsSetupCompleted": true,
"instanceId": "5aa5d2a1603df28859474f270c5663bf0de5fdf9d6c795d12d01619750c46ce4"
}
}
```
## Share the output returned by the last node
<!-- If you need help with data transformations, please also share your expected output. -->
## Information on your n8n setup
- **n8n version:** 1.76.3
- **Database (default: SQLite):** Postgres
- **n8n EXECUTIONS_PROCESS setting (default: own, main):** queue
- **Running n8n via (Docker, npm, n8n cloud, desktop app):** online deployed through Docker
- **Operating system:** Windows