Describe the problem/error/question
Hi all,
My team is running a self-hosted n8n instance with an Enterprise License enabled.
I want to replicate all production workflow execution events into Kafka. Later, this data will be used to analyze platform usage and other metrics. The solution needs to be as reliable as possible.
Currently, I am considering two options: PostgreSQL logical replication, or n8n Log Streaming (Log streaming | n8n Docs). Log Streaming supports a generic webhook as a destination, which could be used to publish events to Kafka.
I am wondering how reliable this architecture would be. What happens if Kafka is temporarily unavailable? Will events be queued and sent once the connection to Kafka is restored, or will those events be lost?
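To make the question concrete, here is a rough sketch of the kind of bridge service I have in mind between the Log Streaming webhook and Kafka. Everything here (class names, the retry/buffer policy, the producer interface) is my own assumption, not anything n8n or Kafka provides:

```python
from collections import deque


class KafkaUnavailable(Exception):
    """Raised by the (hypothetical) producer when the broker is unreachable."""


class BufferingBridge:
    """Hypothetical webhook -> Kafka bridge.

    Buffers incoming log-streaming events in memory while the broker is
    down and flushes them in order once it comes back. Note the obvious
    caveat: an in-memory buffer is lost if the bridge itself restarts.
    """

    def __init__(self, producer, max_buffer=10_000):
        self.producer = producer  # any object with a .send(event) method
        # Bounded buffer: oldest events are dropped on overflow.
        self.buffer = deque(maxlen=max_buffer)

    def handle_webhook(self, event):
        """Entry point for each event POSTed by n8n Log Streaming."""
        self.buffer.append(event)
        self.flush()

    def flush(self):
        """Try to deliver buffered events in order; stop on broker failure."""
        while self.buffer:
            try:
                self.producer.send(self.buffer[0])
            except KafkaUnavailable:
                return False  # keep events buffered; retry on next call
            self.buffer.popleft()  # delivered, safe to drop
        return True


class FlakyProducer:
    """Test double simulating broker downtime."""

    def __init__(self):
        self.up = False
        self.delivered = []

    def send(self, event):
        if not self.up:
            raise KafkaUnavailable()
        self.delivered.append(event)
```

If Log Streaming itself does not buffer on webhook failure, something like this middle layer (or a durable variant of it) would be where the "queued and retried vs. lost" question gets decided, which is why I want to understand the built-in behavior first.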
Information on your n8n setup
- n8n version: 2.4.4
- Database (default: SQLite): PostgreSQL
- n8n EXECUTIONS_PROCESS setting (default: own, main): queue mode (with workers)
- Running n8n via (Docker, npm, n8n cloud, desktop app): self-hosted in kubernetes
- Operating system: Linux