AFAIK, currently the OpenAI node in n8n (and also the standard HTTP Request node) cannot receive the response as a stream of text, even though the OpenAI API supports it. The Respond to Webhook node also doesn't allow sending a response back as a stream.
Use OpenAI in n8n in a custom workflow triggered via Webhook, sending the response back as a stream of text tokens.
Applications that stream OpenAI responses feel much snappier from a UX perspective than apps that return the whole response in one block.
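To illustrate what "streaming" means here: with `stream: true`, the OpenAI chat completions endpoint sends Server-Sent Events, i.e. `data: {json}` lines where each JSON payload carries a small token delta, terminated by `data: [DONE]`. Below is a minimal, self-contained sketch of parsing that format in Node.js (the sample payloads are hypothetical but match the documented shape); an n8n node would need similar handling to forward tokens as they arrive instead of buffering the whole reply:

```javascript
// Parse OpenAI-style SSE lines and collect the token deltas.
function parseSseTokens(raw) {
  const tokens = [];
  for (const line of raw.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip blank lines
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break;            // end-of-stream sentinel
    const delta = JSON.parse(payload).choices[0].delta;
    if (delta.content) tokens.push(delta.content); // accumulate token text
  }
  return tokens;
}

// Example stream chunks (shape matches chat.completions with stream: true):
const sample = [
  'data: {"choices":[{"delta":{"role":"assistant"}}]}',
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  "data: [DONE]",
].join("\n\n");

console.log(parseSseTokens(sample).join("")); // prints "Hello"
```

In a real workflow the chunks would arrive incrementally over the HTTP connection, and each token would be flushed to the webhook response as soon as it is parsed.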
I'd need to learn a lot about n8n internals. But yeah, why not.