PostgreSQL

Hi everyone,

I’ve been working with PostgreSQL in n8n workflows and wanted to share some thoughts and also ask for advice.

Recently, I integrated PostgreSQL as a data source in one of my automation pipelines, mainly to store transactional data and retrieve it for further processing.

One thing I really like is how easy it is to configure the PostgreSQL node and run custom queries directly within the workflow.

However, I’ve noticed that performance can become an issue when dealing with large datasets, especially if queries are not optimized properly.

I started using indexed columns and limiting results with WHERE and LIMIT clauses, which improved execution time significantly.
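To make that concrete, here's the kind of change I mean — table and column names are just placeholders for my setup:

```sql
-- Index the column the workflow filters on (hypothetical transactions table)
CREATE INDEX IF NOT EXISTS idx_transactions_created_at
    ON transactions (created_at);

-- Fetch only the recent rows the next node actually needs
SELECT id, amount, status
FROM transactions
WHERE created_at >= now() - interval '1 day'
ORDER BY created_at DESC
LIMIT 100;
```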

Another interesting use case I implemented was using PostgreSQL to keep track of workflow states and logs, making debugging much easier.
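Roughly, my log table looks like this (schema simplified and names changed — adapt to your own workflows; the `{{ $execution.id }}` expression is how I pass n8n's execution id into the query):

```sql
-- Minimal workflow state/log table, written to from a Postgres node
CREATE TABLE IF NOT EXISTS workflow_log (
    id           bigserial   PRIMARY KEY,
    workflow     text        NOT NULL,
    execution_id text        NOT NULL,
    status       text        NOT NULL,   -- e.g. 'started', 'success', 'error'
    detail       jsonb,
    created_at   timestamptz NOT NULL DEFAULT now()
);

-- Inserted at the start (and again at the end) of each run
INSERT INTO workflow_log (workflow, execution_id, status, detail)
VALUES ('sync-orders', '{{ $execution.id }}', 'started', '{}');
```

Filtering this table by `execution_id` makes it easy to reconstruct what a failed run actually did.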

I’m also curious about best practices for handling connections — do you usually rely on the default configuration or implement connection pooling externally?

Additionally, I’ve been thinking about combining PostgreSQL with queue-based workflows to improve scalability.
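The pattern I have in mind is a jobs table that parallel workflow executions pull from — `FOR UPDATE SKIP LOCKED` lets concurrent runs each claim a disjoint batch atomically, so no row is processed twice (table and column names are hypothetical):

```sql
-- Claim up to 10 pending jobs; concurrent executions skip
-- rows already locked by another run instead of blocking
WITH claimed AS (
    SELECT id
    FROM jobs
    WHERE status = 'pending'
    ORDER BY created_at
    LIMIT 10
    FOR UPDATE SKIP LOCKED
)
UPDATE jobs
SET status = 'processing', claimed_at = now()
FROM claimed
WHERE jobs.id = claimed.id
RETURNING jobs.*;
```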

Has anyone here tried using PostgreSQL triggers together with n8n webhooks?

I believe that could open interesting real-time automation scenarios.
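I haven't wired this up myself yet, but on the database side I imagine it would look something like the sketch below (names are made up). The open question is what sits between the `NOTIFY` and the n8n webhook, since something still has to `LISTEN` and make the HTTP call:

```sql
-- Publish a notification whenever a row is inserted
CREATE OR REPLACE FUNCTION notify_new_row() RETURNS trigger AS $$
BEGIN
    PERFORM pg_notify('new_rows', row_to_json(NEW)::text);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_notify_new_row
AFTER INSERT ON transactions
FOR EACH ROW EXECUTE FUNCTION notify_new_row();
```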

Also, how do you handle error management when a query fails inside a workflow?

Do you prefer retry logic, fallback nodes, or external monitoring?

Looking forward to hearing your experiences and recommendations!
Thanks in advance 🙌


On the connection pooling question: n8n's Postgres node opens a fresh connection per execution by default, so for high-frequency workflows an external PgBouncer is worth the overhead. On the trigger + webhook idea: Postgres has NOTIFY/LISTEN, but n8n doesn't have a native listener node, so the practical pattern is a short-interval Schedule Trigger polling for new rows with a `processed_at` column instead.
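To make the polling side concrete, a sketch of the query such a scheduled workflow could run (table and column names are assumed) — marking rows in the same statement that selects them means a slow run and the next tick can't double-process:

```sql
-- Claim and mark unprocessed rows in one statement;
-- SKIP LOCKED keeps overlapping ticks from fighting over rows
UPDATE events
SET processed_at = now()
WHERE id IN (
    SELECT id FROM events
    WHERE processed_at IS NULL
    ORDER BY id
    LIMIT 50
    FOR UPDATE SKIP LOCKED
)
RETURNING *;
```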