LLM response not shown in Editor Chat, but works in Executions!

When using the AI Agent with Azure OpenAI Chat Model in n8n, the Editor Chat panel does not display any response. However, the same workflow produces correct responses in the Executions tab. No error messages are shown in the Editor.

Steps to reproduce:

  • Workflow: When chat message received → AI Agent (Azure OpenAI Chat Model) → Simple Memory.

  • Send a message in the Editor Chat panel → no response.

  • Run workflow in Executions → response is shown.

Expected: Editor Chat should display the model response.
Actual: Editor Chat stays blank, while Executions confirm the LLM replied.

Environment: n8n <version>, <cloud/self-hosted>, Azure OpenAI <model/version>

In my opinion, it may occur due to load on the server.
You can always use the "Copy to editor" button in that specific execution to view the response in Editor mode.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.