I’m running n8n locally on a Windows workstation using Docker Compose. Ollama runs locally on the same machine (Windows host).
Ollama works fine inside workflows (e.g., “When chat message received” → “Basic LLM Chain” → “Ollama Chat Model”), but the n8n Chat (Beta) UI always fails with “Something went wrong. Please try again.” even when accessing n8n locally.
So the LLM connection is OK, but Chat (Beta) seems to trigger an agent/tooling requirement and fails. The container logs show:

```
Error during session title generation workflow execution: Error: Tools Agent requires Chat Model which supports Tools calling
Title generation failed: Error: Tools Agent requires Chat Model which supports Tools calling
```
Information on my n8n setup
- n8n version: 2.1.1
- Database: SQLite (default)
- n8n EXECUTIONS_PROCESS setting: default (not explicitly set)
- Running n8n via: Docker Compose (image n8nio/n8n:next)
- Operating system: Windows (workstation); GPU: RTX 6000 Blackwell; RAM: 96 GB
Additional details
- Ollama is running on the Windows host (not in Docker).
- The n8n container reaches Ollama via http://host.docker.internal:11434.
- extra_hosts: "host.docker.internal:host-gateway" is set in the Compose file.
- Ollama itself responds correctly to /api/chat with the same model.
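For reference, a minimal sketch of the relevant Compose configuration (the service name, port mapping, and comments are mine; only the image, extra_hosts entry, and Ollama URL come from the actual setup):

```yaml
services:
  n8n:
    image: n8nio/n8n:next
    ports:
      - "5678:5678"
    # Lets the container resolve the Windows host as host.docker.internal
    extra_hosts:
      - "host.docker.internal:host-gateway"
    # The Ollama credential inside n8n points at
    # http://host.docker.internal:11434
```

And the direct check against Ollama that succeeds (model name is a placeholder for the one used in the workflow):

```shell
curl -s http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [{"role": "user", "content": "hi"}],
  "stream": false
}'
```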
Questions
- Does Chat (Beta) always require a Tools Agent / tool calling (e.g., for session title generation)?
- Is there a way to disable session title generation, or to force a non-tools conversational mode in Chat (Beta) when using Ollama?
- Is this a known issue in 2.1.1, and is it fixed in a newer release?