How can I make my agentic chat produce streaming output?

Hi,

I am using a simple agent with an MCP client node. I have connected the Webhook node to the agent, and the agent is connected to a ‘Respond to Webhook’ node. I want my AI agent's chat output to be streamed; right now it just sends the full output as a single REST API response. Is there any way to make this chat interaction streaming?

Hey @Abhiru_Wijesinghe !
Yes and no.

  1. Yes, it is possible: you can redirect the output to a message broker (or a database used like one) and then read and print events as they arrive. Example: Real LLM Streaming with n8n – Here’s How (with a Little Help from Supabase) - Demo Domain
  2. Check out this thread: Stream AI responses on HTTP responses, LLM chains, and AI agents nodes - #18 by Daniel_Freitas
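To make option 1 concrete, here is a minimal sketch of the broker-relay idea. This is an assumed design, not the exact workflow from the linked post: the n8n workflow appends each generated chunk as a row in a table, and a client repeatedly fetches rows it has not seen yet (tracked by a high-water-mark id) and prints them as they arrive. The table shape and field names are illustrative.

```python
# Broker-relay sketch (assumed design): the agent workflow writes chunks
# into a table; the client polls for rows newer than the last one it saw.

def fetch_new_chunks(table, last_seen_id):
    """Return chunks with id greater than last_seen_id, oldest first."""
    fresh = [row for row in table if row["id"] > last_seen_id]
    fresh.sort(key=lambda row: row["id"])
    return fresh

# Simulated "message broker" table the n8n workflow would write into.
table = [
    {"id": 1, "chunk": "Hello"},
    {"id": 2, "chunk": ", "},
    {"id": 3, "chunk": "world"},
]

last_seen = 0
answer = ""
for row in fetch_new_chunks(table, last_seen):
    answer += row["chunk"]       # append chunk in arrival order
    last_seen = row["id"]        # remember high-water mark for next poll

print(answer)     # text assembled so far
print(last_seen)  # pass this id into the next poll
```

In a real setup the polling loop (or a Supabase realtime subscription, per the linked post) replaces the in-memory list, but the high-water-mark logic is the same.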

Hey @Abhiru_Wijesinghe, it is possible now.

n8n recently launched a new streaming feature that should cover your use case.

You can check out this video to learn how to set it up.
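For the client side, here is a small sketch of consuming a streamed webhook response. The webhook URL and the exact wire format are assumptions (check the video and the n8n docs for what your version emits); the parser below handles Server-Sent-Events-style `data:` lines, a common shape for LLM streaming.

```python
# Sketch of a streaming-webhook client. The endpoint and the SSE-style
# framing are assumptions; adapt to the actual format n8n streams.

def iter_sse_data(lines):
    """Yield the payload of each 'data:' line in an SSE-style stream."""
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            yield line[len("data:"):].strip()

# Against a real endpoint you would do something like (untested sketch;
# "https://<your-n8n>/webhook/chat" is a placeholder URL):
#
#   import requests
#   with requests.get("https://<your-n8n>/webhook/chat", stream=True) as r:
#       for payload in iter_sse_data(r.iter_lines(decode_unicode=True)):
#           print(payload, end="", flush=True)

# Demo with a canned stream:
stream = ["data: Hel", "data: lo!", "", ": keep-alive comment"]
chunks = list(iter_sse_data(stream))
print(chunks)
```

Blank lines and comment lines (starting with `:`) are skipped, so only the data payloads reach your chat UI.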


Thanks, brother!

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.