As a developer, I want to build complex backend workflows (AI + classical) in n8n and integrate them into a small frontend app. The frontend should be able to call a workflow via Webhook and observe each node’s output in real time (not only AI nodes) to reflect progress and partial results in the UI.
## Problem / Motivation
- Many modern UIs (especially AI apps) need progressive, token-by-token or step-by-step updates instead of a single final response.
- Today, a Webhook workflow typically returns only when the workflow finishes, which makes it hard to:
  - Show live progress ("Running X… ✓ Done")
  - Stream intermediate results (LLM chunks, partial aggregations, logs)
  - Gracefully handle/cancel long-running operations from the UI
## Proposed Solution
Add a new Respond Mode to the Webhook node: “Streaming Nodes”.
- When enabled, the webhook initializes an SSE (Server-Sent Events) response:
  - `Content-Type: text/event-stream`
  - `Cache-Control: no-cache`
  - `Connection: keep-alive`
  - Flush after each event
- During execution, n8n emits one SSE event per node (and optionally per run/item) with:
  - Node lifecycle events: `node:start`, `node:progress`, `node:end`
  - Workflow lifecycle events: `workflow:start`, `workflow:heartbeat` (optional), `workflow:end`, `workflow:error`
- Each event carries a structured payload (see the schema sketch below) with safe, redacted node output.
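To make the payload discussion concrete, here is a rough TypeScript sketch of what the per-event payloads could look like, derived from the wire example below. All field names and types are suggestions for discussion, not an existing n8n API:

```ts
// Suggested payload shapes only — field names/types are proposals, not an
// existing n8n API. Timestamps are ISO 8601 strings.
interface NodeEventBase {
  nodeId: string;
  nodeName: string;
  runIndex: number; // which run of the node (loops may run a node repeatedly)
}

interface NodeStartEvent extends NodeEventBase {
  startedAt: string;
}

interface NodeProgressEvent extends NodeEventBase {
  // e.g. an LLM token chunk; non-AI nodes could report a percentage instead
  progress: { delta?: string; percent?: number };
}

interface NodeEndEvent extends NodeEventBase {
  finishedAt: string;
  status: 'success' | 'error';
  items: number; // number of output items
  outputPreview?: unknown; // redacted/truncated node output
}

interface WorkflowStartEvent {
  workflowId: string;
  executionId: string;
  startedAt: string;
}

interface WorkflowEndEvent {
  executionId: string;
  finishedAt: string;
  status: 'success' | 'error';
}
```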
### Example (SSE wire format)
```
HTTP/1.1 200 OK
Content-Type: text/event-stream
Cache-Control: no-cache
Connection: keep-alive

event: workflow:start
data: {"workflowId":"abc123","executionId":"ex_789","startedAt":"2025-09-23T19:35:12.000Z"}

event: node:start
data: {"nodeId":"1","nodeName":"HTTP Request","runIndex":0,"startedAt":"2025-09-23T19:35:12.050Z"}

event: node:end
data: {"nodeId":"1","nodeName":"HTTP Request","runIndex":0,"finishedAt":"2025-09-23T19:35:12.320Z","status":"success","items":1,"outputPreview":{"body":{"message":"ok"}}}

event: node:start
data: {"nodeId":"2","nodeName":"LLM","runIndex":0,"startedAt":"2025-09-23T19:35:12.330Z"}

event: node:progress
data: {"nodeId":"2","nodeName":"LLM","runIndex":0,"progress":{"delta":"Hello "}}

event: node:progress
data: {"nodeId":"2","nodeName":"LLM","runIndex":0,"progress":{"delta":"world!"}}

event: node:end
data: {"nodeId":"2","nodeName":"LLM","runIndex":0,"finishedAt":"2025-09-23T19:35:13.100Z","status":"success","items":1}

event: workflow:end
data: {"executionId":"ex_789","finishedAt":"2025-09-23T19:35:13.110Z","status":"success"}
```
## Why this belongs in n8n
Streaming node outputs unlocks first-class UX for AI and long-running workflows, enabling modern app patterns (progress bars, token streams, live logs) with minimal glue code. It will make n8n a far better fit for production UI integrations.
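To illustrate the "minimal glue code" point: since `EventSource` only supports GET, a frontend would likely call the streaming webhook with `fetch` and parse frames by hand. A simplified sketch follows (real SSE parsing must also handle multi-line `data:` fields, comments, and reconnects; the URL and event handling are illustrative):

```ts
// Simplified SSE consumption — assumes single-line `data:` fields and no
// reconnect handling; event names mirror the proposal above.
async function runWorkflow(url: string): Promise<void> {
  const response = await fetch(url, { method: 'POST' });
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    const frames = buffer.split('\n\n'); // frames end with a blank line
    buffer = frames.pop() ?? ''; // keep any trailing partial frame

    for (const frame of frames) {
      const event = /^event: (.+)$/m.exec(frame)?.[1];
      const data = /^data: (.+)$/m.exec(frame)?.[1];
      if (!event || !data) continue;
      const payload = JSON.parse(data);

      switch (event) {
        case 'node:start':
          console.log(`Running ${payload.nodeName}…`);
          break;
        case 'node:progress':
          console.log('chunk:', payload.progress?.delta); // e.g. append to the UI
          break;
        case 'node:end':
          console.log(`✓ ${payload.nodeName} done`);
          break;
        case 'workflow:end':
          console.log('Workflow finished:', payload.status);
          break;
      }
    }
  }
}
```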
Thanks for considering!
If helpful, I’m happy to test a preview build and provide feedback on payload shape and performance.