Chat Hub breaks Basic LLM Chain

Describe the problem/error/question

Hey folks,

Activating the Chat Hub in the Chat Trigger also enables streaming mode, which is not compatible with the Basic LLM Chain node. This doesn’t happen with the public chat.

What is the error message (if any)?

[No response received. This could happen if streaming is enabled in the trigger but disabled in agent node(s)]

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 2.13.4
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker self hosted
  • Operating system: Ubuntu

Hi @eric-burel

What I would do here is align both sides of the configuration.
If you want to use the Chat Hub, the safest path is to replace the Basic LLM Chain with a node that supports streaming, such as the AI Agent (Tools Agent). If, on the other hand, you want to keep the Basic LLM Chain, the solution is to avoid the Chat Hub and stick with the standard chat mode, where streaming is not enforced.
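If it helps, the swap really is just the chain node for an agent node. A trimmed workflow fragment could look roughly like the one below; the node `type` strings and the `responseMode` key are written from memory of recent n8n versions, so verify them against the JSON of your own workflow (connections and the attached language-model node are omitted for brevity):

```json
{
  "nodes": [
    {
      "name": "When chat message received",
      "type": "@n8n/n8n-nodes-langchain.chatTrigger",
      "parameters": {
        "options": { "responseMode": "streaming" }
      }
    },
    {
      "name": "AI Agent",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "parameters": {}
    }
  ]
}
```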

Hi @eric-burel, I think the AI Agent node would work fine instead of the LLM chain, as the Basic LLM Chain does not support streaming.

Hi, since the Basic LLM Chain is plugged into a Chat Trigger by default, I think the best solution would be an option to disable streaming in the Chat Hub UI. This is a common issue in other frameworks too (LangChain, Mastra), because a given agent may or may not be fit for streaming.
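To make the failure mode concrete, here is a small Python sketch (nothing n8n-specific, just an analogy written for illustration): a streaming consumer paired with a node that only returns a final value waits for chunks that never arrive, which matches the "[No response received…]" message above.

```python
import inspect

def non_streaming_node(prompt):
    # Analogous to Basic LLM Chain: computes one final value, emits no chunks.
    return f"answer to: {prompt}"

def streaming_node(prompt):
    # Analogous to AI Agent: yields partial chunks as they are produced.
    for word in f"answer to: {prompt}".split():
        yield word + " "

def chat_hub(node, prompt):
    # Analogous to Chat Hub with streaming on: it only listens for chunks.
    result = node(prompt)
    if not inspect.isgenerator(result):
        # A plain return value is never surfaced; the client just times out.
        return None  # i.e. "[No response received ...]"
    return "".join(result)

print(chat_hub(streaming_node, "hi"))      # chunks arrive and are assembled
print(chat_hub(non_streaming_node, "hi"))  # None: the streaming client saw nothing
```

The same shape mismatch is why frameworks like LangChain distinguish between a single invoke call and a streaming call on the same component.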

From what I can see, streaming is expected in that setup, which fits well with nodes such as the AI Agent, but not with the Basic LLM Chain.
At the moment, though, I don’t think this is exposed as a workflow-level option, so in practice the safe path is still either to use a streaming-compatible node with the Chat Hub, or to use the Basic LLM Chain outside that setup.
If this is something you’d like to see supported, it could be worth opening a topic in the Features category so the n8n team can track it as a potential improvement.