Problem Summary:
When running in Regular Mode, both Test URL and Production URL work perfectly with streaming responses. However, after switching to Queue Mode (1 Main + 2 Workers), the Production URL stops working while the Test URL continues to work normally.
Environment:
- n8n version: 1.123.5
- Deployment: Kubernetes (self-hosted)
- Redis: Connected and healthy
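For context, the deployment follows the standard n8n queue-mode setup (a rough sketch; the Redis host/port values are placeholders, and the environment variable names are the ones from the n8n scaling docs, not our exact manifests):

```shell
# Shared by the main instance and all worker pods (placeholder Redis values)
export EXECUTIONS_MODE=queue
export QUEUE_BULL_REDIS_HOST=redis.example.svc
export QUEUE_BULL_REDIS_PORT=6379

# Main instance pod
n8n start

# Each of the 2 worker pods
n8n worker
```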
Behavior Comparison:

| Execution Mode | Test URL (`/webhook-test/ki-cn`) | Production URL (`/webhook/ki-cn`) |
|---|---|---|
| Regular Mode | Works with streaming | Works with streaming |
| Queue Mode | Works with streaming | Does NOT work |
The workflow uses AI Agent nodes (LangChain-based) to stream chat responses.
Questions:
- Is there a way to make Production webhook URLs work in Queue Mode with streaming?
- Are there specific configurations or environment variables needed?
- Is there a workaround or recommended architecture?
- If webhook streaming in Queue Mode is officially not supported:
  - Can you please confirm this limitation?
  - Are there any plans to support it in future versions?
  - What is the recommended architecture for scalable streaming webhooks?
What I Need:

Either:
- A solution/configuration to make Production webhooks work in Queue Mode with streaming, OR
- Official confirmation that this is not supported, so I can plan alternative architectures
Any guidance would be greatly appreciated. This is critical for our AI chatbot deployment, where streaming responses and horizontal scalability are both essential.
Thank you!