AI Agent tool connectivity to PostgreSQL

Hi,
I’m trying to create a simple AI SQL assistant with the AI Agent node.

  1. Is there a way to reuse the Postgres credentials I already saved, instead of defining new hardcoded ones in the agent tool to connect to the DB?
  2. Is there any specific code for Postgres connectivity (is Python or JavaScript better)?

The end goal is to take a simple query, have the agent list the tables, and run the select query itself. I could not find a template that matches that use case.
Thanks in advance.

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

My system prompt is like this:

You are a SQL assistant. You will receive requests from another AI agent,
build and run searches against the database via functions,
and return the response to the AI agent.
You can use the DB_connect tool to connect to the DB,
retrieve the DB schema, and run the SQL select.
The DB_connect tool has 2 functions:
connect_to_db to connect to the database,
and get_db_schema to retrieve the database schema.
Before running a query you must connect to the database
and retrieve the database schema.

My DB_connect tool is based on Python:

import psycopg2

def connect_to_db():
    try:
        # Establish a connection to the database
        conn = psycopg2.connect(
            host="mydbhost",        # placeholder: your DB host
            database="mydb",        # placeholder: your database name
            user="myusername",      # placeholder: your DB user
            password="mypassword"   # placeholder: your DB password
        )
        
        # Create a cursor object
        cur = conn.cursor()
        
        # Print a success message
        print("Connected to the database successfully.")
        
        return conn, cur
    
    except psycopg2.OperationalError as e:
        print(f"Failed to connect to the database: {e}")
        return None, None

def get_db_schema(conn, cur):
    try:
        # Get the list of tables in the database
        cur.execute("SELECT table_name FROM information_schema.tables WHERE table_schema='public'")
        tables = cur.fetchall()
        
        # Create a dictionary to store the schema
        schema = {}
        
        # Iterate over each table
        for table in tables:
            table_name = table[0]
            schema[table_name] = {}
            
            # Get the list of columns in the table
            cur.execute(f"SELECT column_name, data_type FROM information_schema.columns WHERE table_name='{table_name}'")
            columns = cur.fetchall()
            
            # Add the columns to the schema
            schema[table_name]['columns'] = []
            for column in columns:
                schema[table_name]['columns'].append({
                    'name': column[0],
                    'type': column[1]
                })
            
            # Get the list of indexes in the table
            cur.execute(f"SELECT indexname, indexdef FROM pg_indexes WHERE tablename='{table_name}'")
            indexes = cur.fetchall()
            
            # Add the indexes to the schema
            schema[table_name]['indexes'] = []
            for index in indexes:
                schema[table_name]['indexes'].append({
                    'name': index[0],
                    'definition': index[1]
                })
        
        return schema
    
    except psycopg2.Error as e:
        print(f"Failed to get the database schema: {e}")
        return None
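
For completeness: the system prompt tells the agent to run the SQL select, but the tool above only implements connect_to_db and get_db_schema. A minimal sketch of the missing piece and the overall call flow could look like this (run_select_query and the example table name are placeholders of mine, not part of the original tool):

def run_select_query(cur, query, params=None):
    try:
        # Execute a read-only SELECT (optionally parameterized) and return all rows
        cur.execute(query, params)
        return cur.fetchall()
    except psycopg2.Error as e:
        print(f"Failed to run the query: {e}")
        return None

# Call flow matching the system prompt: connect, fetch the schema, then query
conn, cur = connect_to_db()
if conn is not None:
    schema = get_db_schema(conn, cur)
    rows = run_select_query(cur, "SELECT * FROM my_table LIMIT 10")  # placeholder table
    cur.close()
    conn.close()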

Hey @mraniki, as I understand it, you want to use an AI SQL Agent. There should be no problem reusing the Postgres credentials you already saved for the PostgreSQL server you intend to query with the agent. No special connectivity is needed, just the same setup as in Postgres credentials | n8n Docs. The one restriction is that “the Agent node doesn’t support SSH tunnels”.
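
If you do stay with a custom Python tool instead of the built-in Postgres tool, one common way to avoid hardcoding credentials is to read them from environment variables. A minimal sketch, assuming you export the standard libpq variables (PGHOST, PGDATABASE, PGUSER, PGPASSWORD) in the environment n8n runs in; as far as I know, n8n does not expose its saved credentials to custom code:

import os
import psycopg2

def connect_to_db():
    try:
        # Connection details come from the environment instead of being hardcoded;
        # exporting PGHOST/PGDATABASE/PGUSER/PGPASSWORD is up to you (assumption)
        conn = psycopg2.connect(
            host=os.environ["PGHOST"],
            database=os.environ["PGDATABASE"],
            user=os.environ["PGUSER"],
            password=os.environ["PGPASSWORD"],
        )
        return conn, conn.cursor()
    except (KeyError, psycopg2.OperationalError) as e:
        print(f"Failed to connect to the database: {e}")
        return None, None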

Hmm, but how do I do that? I cannot connect the AI Agent tool to my Postgres node or connect the code to the Postgres node. What am I missing?

I’m encountering a similar issue, though not specific to the AI Agent tool, but rather with connecting to PostgreSQL in general.

I tried removing all my credentials and restarting my cloud instance (n8n Cloud, version 1.70.1). However, after this, I noticed that the system won’t save my credentials; it gets stuck in a loading state when attempting to save the new ones.

I’m currently investigating if this might be related to changes in n8n’s new IP whitelist settings, though I suspect it might not be the cause.

Here is the flow I have below:
Is there boilerplate code to connect to my Postgres node? Or a way to connect it to the agent? Or any agent tool that can be reused?

OK, my mistake: I did not realise that the Postgres tool was directly available to connect to the agent.

