bump! Streaming is pivotal for any B2C AI agent solution now.
Just by streaming, users get a much lower perceived latency, so this would be a HUGE add and create a lot of value tbh. Surprised we don't have this; lots of other solutions do now.
This would be very useful…
This would be such a game-changer, since any user expects the frontend to be responsive nowadays. Especially when building LLM workflows with tools or complex behaviour, it's painful if the user waits 30 seconds after hitting send before anything is displayed. Would be highly appreciated to enable LLM streaming wherever possible!
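For context, the UX difference described above can be sketched with a toy example — `fake_llm_stream` here is a hypothetical stand-in for real token chunks arriving over SSE or a websocket, not anything in n8n's API. The point is that the frontend shows the first words almost immediately instead of after the whole response is done:

```python
import time

def fake_llm_stream(text, delay=0.01):
    # Hypothetical stand-in for an LLM token stream: yields one word
    # at a time with a small delay, like chunks arriving from the model.
    for word in text.split():
        time.sleep(delay)
        yield word

def render_streamed(tokens):
    # Print each chunk as soon as it arrives instead of waiting for
    # the full response -- this is the "perceived latency" win.
    parts = []
    for token in tokens:
        print(token, end=" ", flush=True)
        parts.append(token)
    print()
    return " ".join(parts)

reply = render_streamed(fake_llm_stream("Streaming makes the UI feel responsive"))
```

With a blocking call, the user stares at a spinner for the full generation time; with streaming, the same total time feels far shorter because output appears token by token.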
Bumping this topic; we need more votes.
^ everything above
This is absolutely an essential feature! AI applications that don’t support streaming responses would be very disappointing, and I’m considering whether to continue using them.
This would be seriously useful for my team and me (and a lot of other people too, I'm sure!)
At this point, streaming is a necessary part of any AI integration.
Critical for any modern AI use case. When combined with the duplicate feature request (refer below), this is now the most requested feature by a significant margin.
Please prioritise this feature!
I don’t know why anyone is even wasting their time posting in here.
n8n's support is a farce when it comes to responding to client needs or anything along those lines. I've never been as frustrated by the absence of meaningful support in a paid product.