Support for Stream in HTTP Request node and OpenAI node

The idea is:

I would like to suggest adding streaming support to the HTTP Request node in n8n. This would greatly improve the experience of using n8n with APIs that stream data, such as the ChatGPT API.
I have tried the ChatGPT node, but it doesn't seem to support streaming. If I use a Code node instead, the HTTP request libraries are unavailable, which makes a workaround impossible.
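For context on what streaming support would involve: OpenAI-style streaming endpoints return server-sent events, where each chunk is a line of the form `data: {...}` carrying a small delta of the response, terminated by `data: [DONE]`. Below is a minimal sketch of parsing such chunks into text; the function name and the exact payload shape are illustrative assumptions, not n8n or OpenAI code.

```javascript
// Sketch: extract text deltas from an OpenAI-style SSE buffer.
// Assumes chunks shaped like {"choices":[{"delta":{"content":"..."}}]}
// and a terminating "data: [DONE]" line (illustrative, not official API code).
function parseSseChunk(buffer) {
  const tokens = [];
  for (const line of buffer.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // ignore comments/blank lines
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (delta) tokens.push(delta);
  }
  return tokens.join("");
}
```

A streaming-capable node would call something like this on every network chunk and emit the partial text immediately, instead of buffering the whole response.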

My use case:

To create a ChatGPT bot that can respond in real time, even before the full output is complete. This avoids having to wait for the entire response before seeing any content.

I think it would be beneficial to add this because:

I hope that both the HTTP Request node and the OpenAI node can support streaming, so that we can use streaming APIs and handle larger datasets.

Any resources to support this?

ChatGPT

PS: English is not my native language, so please let me know if there is anything in my suggestion that is unclear.

Hello, I have exactly the same streaming problem as you. Can you tell me how you did it?

This topic has also been a major issue for me. There hasn't been any news on it for months, and I'm sure many users have shifted to other solutions on the AI side. This doesn't seem too complex for the n8n developers, so I hope they will consider it.

Did you find a workaround since?

I don’t think a workaround would be easy to find. With so many chatbot builders on the market, we need a built-in function in n8n to stream assistant responses in chat.

We need this. I can help if I have some guidance, but I don’t want to try doing it without understanding why it hasn’t been done yet.

Also really need this functionality for use with OpenAI


Agree, this is very much needed - it would open up a whole new world of possibilities. I love using n8n, but the lack of streaming support, for example when using OpenAI, is a big downside.


This is mission-critical for my long-term use of n8n.


I just discovered that streaming responses are not supported. I am pretty shocked at this. I look at n8n as a leader in delivering AI Agent functionality to the market and real-life applications like chatbots. Streaming chat is a given.


Really sad to learn this isn’t available. We’re building agents; we expect real-time interactivity.


Can’t believe this still isn’t available. n8n is the best on the market, but its MOST important feature is missing, and we have to endure the real pain of Langflow / Dify and the others because streaming isn’t available :frowning:


Bumping this topic; we need more votes :smiley:


Streaming responses are absolutely critical for using n8n in production.

Anyone here? We need this ASAP.

We also really need this feature. Users have to wait way too long for AI responses.

Also wanting this feature - curious whether anyone has figured out a workaround, or whether people are using a separate product due to the lack of streaming?

This is needed for AI Agent steps too!

It would be amazing if we could have this.

It would be nice if we had a “stream” option. Bump!

Or at least the raw JSON, so we could modify it ourselves…