Request for HTTP Streaming Support in Respond to Webhook Node

I am writing to ask whether HTTP streaming support could be added to the Respond to Webhook node in n8n. This feature would be particularly valuable for AI workflows and could significantly enhance what n8n workflows can deliver.

Enabling HTTP streaming in the Respond to Webhook node would offer several key benefits:

  1. Real-time AI responses: Output from AI nodes (for example, tokens from an LLM) could be streamed back to the caller as it is generated, making workflows feel faster and more responsive.

  2. Efficient data transfer: Streaming avoids buffering the entire response in memory before it is sent, reducing memory usage and improving performance for large payloads.

  3. Enhanced interoperability: Many AI services and APIs already expose streaming endpoints, so supporting HTTP streaming in n8n would make it much easier to integrate with them (see the sketch after this list).

  4. Improved user experience: End users of applications built on n8n webhooks would see output appear progressively instead of waiting for the complete response.
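To make the request more concrete, here is a minimal sketch (not n8n internals, just an illustration in Node/TypeScript) of what streaming a response back to the webhook caller could look like. It assumes Node 18+ with the global fetch API, an Express-style server, and a hypothetical upstream endpoint that streams its body (for example, SSE output from an LLM API):

```ts
// Minimal sketch, not n8n code: forward a streamed upstream response
// (e.g. token-by-token LLM output) to the original webhook caller without
// buffering it. Assumes Node 18+ (global fetch) and Express.
// UPSTREAM_URL is a hypothetical placeholder for any API that streams its body.
import express from "express";

const app = express();
const UPSTREAM_URL = "https://api.example.com/v1/stream"; // hypothetical

app.post("/webhook/ai-stream", async (req, res) => {
  const upstream = await fetch(UPSTREAM_URL, { method: "POST" });
  if (!upstream.body) {
    res.status(502).end("upstream returned no body");
    return;
  }

  // Mirror the upstream content type and flush each chunk to the caller
  // as soon as it arrives instead of waiting for the full response.
  res.setHeader(
    "Content-Type",
    upstream.headers.get("content-type") ?? "application/octet-stream",
  );

  const reader = upstream.body.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done || !value) break;
    res.write(value); // value is a Uint8Array chunk
  }
  res.end();
});

app.listen(3000);
```

Something along these lines inside the Respond to Webhook node would let clients start rendering AI output as soon as the first chunks arrive, rather than waiting for the whole workflow to finish.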

I understand that implementing HTTP streaming support may require some development effort, but I strongly believe that the benefits would be well worth the investment. It would position n8n as a leading platform for AI-driven automation and attract more users and developers to the community.

I would love to hear your thoughts on this proposal. Is this something that the n8n team would consider implementing in the future?

Thank you for your attention and for the incredible work you do in developing and maintaining n8n.

I’ve also been desperately waiting for this feature for a long time! Please, can someone advise… Thanks!

Yes please!