Using the Chat Trigger to provide status updates while the AI Agent is processing

I have a workflow with a Chat Trigger calling an AI Agent. The agent has access to several tools, and it can take a minute or more to get a response. I’d like to provide status updates in the chat while it’s processing, like:

  • Now running tool XYZ
  • Now looking up ABC

Some ideas I’m hoping for:

  • Given the sessionId, can I trigger something to return data back to the chat client?

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

You could potentially implement this using:

  1. A main workflow with Chat Trigger → AI Agent
  2. Secondary “status update” workflows that can post to the same chat session

So I would attempt to get the sessionId of the chat session and send it to the second workflow, where you can use an HTTP Request node to post the updates.
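A rough sketch of that hand-off (the webhook URL and field names here are placeholders I made up, not an existing n8n endpoint):

```typescript
// Hypothetical hand-off: the main workflow sends the chat sessionId plus a
// status message to the secondary workflow's Webhook trigger.
// The URL and payload shape are assumptions, not a built-in n8n API.
async function sendStatusUpdate(sessionId: string, message: string): Promise<void> {
  await fetch("https://your-n8n-instance/webhook/status-update", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ sessionId, message }),
  });
}

// e.g. sendStatusUpdate(sessionId, "Now running tool XYZ");
```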

If this solves your problem, feel free to mark it as the solution!

Thank you, Daniel. In the secondary “status update” workflow, given the sessionId, how do I send the HTTP POST back to the n8n chat client? It looks like the n8n chat client sends a synchronous web request to the n8n server and waits for a single response. There’s no way to address (i.e. communicate with) the n8n chat client directly to push information to it, which is what I’m trying to do.

The traditional approach (outside of n8n) would be a chat client that the server can talk to: either the chat client polling the server for updates based on the sessionId, or a socket connection between the two. If n8n has already built this into its chat client, that would be awesome.
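To illustrate the polling variant, here’s roughly what the chat client would do, assuming a hypothetical status endpoint keyed by sessionId (n8n’s chat widget doesn’t expose anything like this today):

```typescript
// Hypothetical polling loop: the chat client repeatedly asks a status endpoint
// for updates tied to its sessionId until the agent has finished.
// The /webhook/chat-status endpoint is an assumption, not part of n8n's chat client.
async function pollStatus(sessionId: string, onUpdate: (msg: string) => void): Promise<void> {
  let finished = false;
  while (!finished) {
    const res = await fetch(
      `https://your-n8n-instance/webhook/chat-status?sessionId=${encodeURIComponent(sessionId)}`
    );
    const body = (await res.json()) as { updates: string[]; finished: boolean };
    body.updates.forEach(onUpdate); // e.g. "Now running tool XYZ"
    finished = body.finished;
    if (!finished) await new Promise((r) => setTimeout(r, 2000)); // wait 2s between polls
  }
}
```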

It seems that sending status updates is a much-requested feature, but at the moment it is not supported.

The user must wait for the full agent execution to see the output/intermediate steps.

It is a feature request, though.

If this answers your question, please hit the ‘solution’ button.


Where is it a feature request? Searching the n8n feature requests for “status updates” finds nothing.

Couldn’t this be done with a custom AI tool? You would just have the agent call the tool after using any other tool and provide a status update.

@liquidsnakeblue My point is that n8n’s chat client should support streaming updates. You can see this working in every standard chat client (ChatGPT, Claude, etc.).

Most AI APIs support a streaming option. For example, I’ve built many custom chat clients with OpenAI where we stream updates to the user.

In addition, it’s critical to support streaming updates during function calling. OpenAI already provides this when calling function tools, so you can do something like this:
User: “When was my last interaction with [email protected]?”
OpenAI: “Ok, let me check your email.”
OpenAI: “I’m also going to check your Google Calendar.”
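For reference, a minimal sketch of that streaming loop with the OpenAI Node SDK (the model name, the example email address, and how you surface the updates are placeholders):

```typescript
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Minimal streaming sketch: text tokens and tool-call deltas arrive incrementally,
// so the client can show "checking your email" before the tool even runs.
const stream = await openai.chat.completions.create({
  model: "gpt-4o", // placeholder model name
  messages: [{ role: "user", content: "When was my last interaction with user@example.com?" }],
  stream: true,
});

for await (const chunk of stream) {
  const delta = chunk.choices[0]?.delta;
  if (delta?.content) process.stdout.write(delta.content); // stream text to the chat UI
  if (delta?.tool_calls) {
    // A tool call is starting; this is where a status update like
    // "Now checking your email" could be pushed to the client.
  }
}
```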


I 100% agree with you and I hope they will add this to the AI Agent node.

I was more talking about a workaround in the meantime…

I managed to get status updates by using the Call n8n Workflow tool node and having the agent send a status of what it’s up to in a custom field on that node. The sub-workflow receives the status, which I save into memory, and I added a webhook so the status can be fetched.
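Roughly, the Code node in the sub-workflow that stores the status could look like this (the field names are whatever you map on the tool node, and I’m using workflow static data purely as an illustration; it only persists when the workflow is active):

```typescript
// Sketch of a Code node ("Run Once for All Items") in the status sub-workflow.
// It stashes status messages under the sessionId in workflow static data so a
// separate Webhook node can read them back later. The "sessionId" and "status"
// field names are assumptions based on how the tool node is configured.
const staticData = $getWorkflowStaticData('global');
staticData.statuses = staticData.statuses || {};

for (const item of $input.all()) {
  const sessionId = item.json.sessionId;
  const status = item.json.status;
  staticData.statuses[sessionId] = staticData.statuses[sessionId] || [];
  staticData.statuses[sessionId].push(status);
}

return $input.all(); // pass the items through unchanged
```

The Webhook workflow then just looks up statuses for the requested sessionId from that same static data and returns them.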

