Local Llama 3.2:3b (via Ollama) failing to return JSON output

Hello!

I'm running n8n (version 1.64.3) in Docker and created an AI agent with Ollama. My goal is to prepare the inputs for a calendar node. I want a structured output, so I prompted the AI to return one:

"""
Your job is to determine the start of the event, the end of the event, and the title of the event.

You should return the following JSON object:

  • eventTitle: Title of the event
  • startDate: Event start time (in ISO format)
  • endDate: Event end time (in ISO format)

For example:
{
"eventTitle": "Test",
"startDate": "2024-11-22T10:00:00+01:00",
"endDate": "2024-11-22T11:00:00+01:00"
}
It is very important that your output is exactly the defined JSON object and not a string!
The chat input is:
{{ $json.chatInput }}
"""
I added the Structured Output Parser like this:

{
  "type": "object",
  "properties": {
    "startDate": { "type": "string" },
    "endDate": { "type": "string" },
    "eventTitle": { "type": "string" }
  }
}
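In plain Python terms, the check I expect the parser to perform looks roughly like this (a sketch of my own, not n8n internals; the helper name and the regex extraction are assumptions):

```python
import json
import re

def parse_event(raw: str) -> dict:
    """Extract the first JSON object from a model reply that may be
    wrapped in extra prose, then check the expected keys exist."""
    match = re.search(r"\{.*\}", raw, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    data = json.loads(match.group(0))
    missing = {"eventTitle", "startDate", "endDate"} - data.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    return data
```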

In the end, the parser cannot parse the AI agent's output because it is plain text. Any idea how I can force it to return the required JSON object?

Thank you very much for the help!

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @OmniusBG,

Here’s an example workflow that does this with Llama 3.2:3b.

I tweaked the prompt a bit because, with these smaller models, you need to be more explicit about what you want. Even then, it would probably still break on more complex prompts, so I’d consider using at least an 8B model for this.
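Independent of the prompt, it's also worth trying Ollama's JSON mode: setting `"format": "json"` in the API request constrains the model to emit valid JSON rather than merely asking for it. A minimal sketch of such a request payload, assuming a default local Ollama endpoint on port 11434 (the prompt wording here is illustrative, not from the workflow):

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

def build_payload(chat_input: str) -> dict:
    # "format": "json" tells Ollama to constrain decoding to valid JSON,
    # which is more reliable than prompt instructions alone.
    return {
        "model": "llama3.2:3b",
        "format": "json",
        "stream": False,
        "prompt": (
            "Return a JSON object with the keys eventTitle, startDate and "
            f"endDate (ISO 8601) for this request: {chat_input}"
        ),
    }

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=json.dumps(build_payload("Meeting tomorrow 10-11")).encode(),
#     headers={"Content-Type": "application/json"},
# )
# reply = json.loads(urllib.request.urlopen(req).read())
```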