Streaming responses from the HTTP node (or OpenAI node) via the Respond to Webhook node

The idea is:

AFAIK, the OpenAI node in n8n (and the standard HTTP Request node) currently cannot receive a response as a stream of text, even though the OpenAI API allows it. The Respond to Webhook node in n8n also doesn't allow sending a response as a stream.
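For context, this is roughly what the streaming side of the OpenAI API looks like: when a request sets `stream: true`, the Chat Completions endpoint replies with Server-Sent Events, each `data:` line carrying a small JSON delta with the next token(s), terminated by a `data: [DONE]` sentinel. A minimal sketch of consuming that format (the `extractTokens` helper and the sample payload are illustrative, not part of n8n):

```typescript
// Sketch: parse Server-Sent Events chunks in the shape the OpenAI
// Chat Completions API emits when the request sets `stream: true`.
// Each `data:` line carries a JSON payload with a token delta;
// the stream ends with the sentinel `data: [DONE]`.
function extractTokens(sse: string): string[] {
  const tokens: string[] = [];
  for (const line of sse.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;    // skip blank/comment lines
    const payload = trimmed.slice(5).trim();       // drop the "data:" prefix
    if (payload === "[DONE]") break;               // end-of-stream sentinel
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (delta) tokens.push(delta);                 // accumulate token text
  }
  return tokens;
}

// Example: two token chunks, an empty delta, then the DONE sentinel.
const sample = [
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  'data: {"choices":[{"delta":{}}]}',
  "data: [DONE]",
].join("\n");

console.log(extractTokens(sample).join("")); // "Hello"
```

The feature request is essentially: let the Webhook / Respond to Webhook pair forward these chunks to the caller as they arrive, instead of buffering the whole completion first.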

My use case:

Use OpenAI in n8n in a custom workflow triggered via Webhook, sending the response back as a stream of text tokens.

I think it would be beneficial to add this because:

Applications that stream OpenAI responses feel much snappier from a UX perspective than apps that return the response in one block.

Any resources to support this?

Are you willing to work on this?

I’d need to learn a lot about n8n internals, but yeah, why not. :slight_smile:

Hi, I have exactly the same streaming problem as you. Can you tell me how you solved it?


Hi, is this feature supported now? Or did you find a workaround?