Replacing chat model from GPT to Ollama produces wrong format

I’m using this workflow ✨🤖Automate Multi-Platform Social Media Content Creation with AI | n8n workflow template and I’m trying to replace GPT with my local Ollama.
Upon execution I’m getting an error since apparently Ollama returns JSON while GPT returns TEXT.

How can I modify or transform Ollama’s output to use only the text part of its response?

I’ve already chosen Default in the model’s Output Format settings; the only other option is JSON.

What is the error message (if any)?

Could be related: Add a Response format for Gemini Chat Model

Information on your n8n setup

  • n8n version: 1.83.2
  • Database (default: SQLite): PostgreSQL
  • n8n EXECUTIONS_PROCESS setting (default: own, main): ??
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Raspberry Pi OS 64 bit Lite

Hey @Zohar

So to my knowledge, all models actually return text: even when they’re instructed to return JSON, they just return the response as a JSON string. The structured output parser then does two things: (1) parses it as JSON and (2) validates it against the given JSON schema/definition.
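To make those two steps concrete, here’s a minimal Python sketch of what a structured output parser does with the model’s text. The hand-rolled type check is purely illustrative; n8n uses a real JSON Schema validator internally, and the function name is my own invention.

```python
import json

def parse_and_validate(model_output: str, schema: dict) -> dict:
    # Step 1: parse the raw model text as JSON.
    data = json.loads(model_output)
    # Step 2: check each declared property against its schema type.
    type_map = {"string": str, "number": (int, float), "object": dict}
    for key, rule in schema.get("properties", {}).items():
        if not isinstance(data.get(key), type_map[rule["type"]]):
            raise ValueError("Model output doesn't fit required format")
    return data

schema = {"type": "object", "properties": {"label": {"type": "string"}}}

parse_and_validate('{"label": "tech"}', schema)   # passes both steps
# parse_and_validate('{"label": null}', schema)   # step 2 raises ValueError
```

Step 1 can only fail if the model returns malformed JSON; the “doesn’t fit required format” error comes from step 2.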

The error “Model output doesn’t fit required format” relates to the (2) step. I’ve seen this happen when the model returns empty values.

In the following example, if label is null (i.e. the AI couldn’t find a value), the error would trigger because, per the schema, label is only allowed to be a string.

{
  "type": "object",
  "properties": {
    "label": { "type": "string" }
  }
}

You could solve this by doing the following:

"label": { "type": ["string", "null"] }

// or, equivalently, using the OpenAPI 3.0 "nullable" keyword
"label": { "type": "string", "nullable": true }
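To show why the union type fixes the error, here’s a small hand-rolled check (my own sketch, not n8n’s actual validator) that mimics how JSON Schema treats `"type": ["string", "null"]` versus a plain `"type": "string"`:

```python
import json

def matches(value, allowed_types) -> bool:
    # JSON Schema allows "type" to be a single name or a list of names;
    # normalize to a list before checking.
    if isinstance(allowed_types, str):
        allowed_types = [allowed_types]
    checks = {
        "string": lambda v: isinstance(v, str),
        "null": lambda v: v is None,
    }
    return any(checks[t](value) for t in allowed_types)

label = json.loads('{"label": null}')["label"]

matches(label, "string")            # False -> "doesn't fit required format"
matches(label, ["string", "null"])  # True  -> validation passes
```

In other words, the strict schema rejects the null the model returned, while the union type accepts it and lets the workflow continue.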

Would help if you could paste the full error here.


Thank you @Jim_Le
Indeed this seems to be part of the problem. This is the message I receive:

System: Use the provided tools to research the topic based on latest information. Todays date is 2025-04-04T03:50:03.754-04:00\n\nIMPORTANT: For your response to user, you MUST use the `format_final_json_response` tool with your complete answer formatted according to the required schema. Do not attempt to format the JSON manually - always use this tool. Your response will be rejected if it is not properly formatted through this tool. Only use this tool once you are ready to provide your final answer.\nHuman: You are a **content creation AI** for workflows.diy, a leading creator of **n8n workflow...

So I added an output format: JSON.
The step is now running without warnings, but I still get the message above…

I didn’t find where to add the short JSON schema above.
