LLM response not shown in Editor Chat, but works in Executions

When using the AI Agent with Azure OpenAI Chat Model in n8n, the Editor Chat panel does not display any response. However, the same workflow produces correct responses in the Executions tab. No error messages are shown in the Editor.

Steps to reproduce:

  • Workflow: When chat message received → AI Agent (Azure OpenAI Chat Model) → Simple Memory.

  • Send a message in the Editor Chat panel → no response.

  • Run workflow in Executions → response is shown.

Expected: Editor Chat should display the model response.
Actual: Editor Chat stays blank, while Executions confirm the LLM replied.

Environment: n8n <version>, <cloud/self-hosted>, Azure OpenAI <model/version>

@Irem_GUNER This issue might be due to how the AI Agent node interacts with the Editor Chat panel in n8n. Since the Executions tab shows the correct responses, the model itself is replying, and the problem seems to be with the real-time display in the Editor Chat. You may want to check whether there are any configuration differences between the Editor Chat environment and the execution environment. Also, ensure that the Simple Memory node is properly set up to store and return the responses.
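One way to rule the Editor Chat panel in or out is to post a message straight to the Chat Trigger's webhook and see whether the workflow answers. A minimal sketch of the request body, assuming the `action`/`chatInput`/`sessionId` field names used by n8n's hosted chat widget (verify against the docs for your n8n version):

```python
import json


def build_chat_payload(message: str, session_id: str = "debug-session") -> dict:
    """Build the JSON body for a Chat Trigger webhook call.

    The field names (action, chatInput, sessionId) are assumptions based on
    n8n's hosted chat widget; confirm them against your version's docs.
    """
    return {
        "action": "sendMessage",  # action the hosted chat widget sends
        "chatInput": message,     # the user's message text
        "sessionId": session_id,  # Simple Memory keys conversation history on this
    }


payload = json.dumps(build_chat_payload("ping"))
print(payload)
# POST this body (Content-Type: application/json) to the Chat Trigger's
# webhook URL — copy the real URL from the trigger node; the path shown
# in examples like /webhook/<id>/chat varies per workflow.
```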

Let me know what you come up with.
