Issue with AI Agent not reading full PostgreSQL data

Hello everyone,

I’m running into an issue with PostgreSQL in n8n that I can’t figure out.

My PostgreSQL database has about 20,000 records of sales and stock data. The problem is that my AI Agent doesn’t seem to read the entire dataset.

For example:

  • When I ask for the highest sales value, it sometimes returns a middle or lower value instead of the actual highest.

  • In other cases, it claims certain records don’t exist, even though they are clearly present in the database.

The SQL queries themselves are fine — I tested the exact same queries generated by the AI Agent directly on PostgreSQL, and they return the correct results.

I’ve tried this both with a local PostgreSQL instance and Supabase PostgreSQL, but the problem persists in both setups.

Setup details:

  • Database: PostgreSQL

  • n8n EXECUTIONS_PROCESS setting: own (default)

  • Running n8n via npm (Windows Command Prompt)

  • OS: Windows 10

Has anyone else experienced this issue, or know why the AI Agent might not be processing the full dataset?

Thanks in advance for your help!

Hey @CrizCJK, hope all is good. Welcome to the community.

I see no reason for the AI agent to read the entire database.

The AI agent chooses the tool, generates the query, and sends the query to the tool. The tool sends the query to the database, gets the results, and sends them back to the AI agent. Nowhere in this process does the AI Agent read the entire database.

Would you like to share your workflow so that we have something concrete to talk about?


Thanks for responding, @jabbson. Here is my workflow.
As additional information, I have already tried three AI models (Gemini, OpenAI, and Cohere), and none of them resolved the issue.
I hope to find a solution for this.

Nope, still a screenshot.

Sorry, I have now posted my workflow in the previous post. Thanks for your help @jabbson.

It appears that your tool is limited to fetching only 20 results,

and no filters are applied, which means the tool will fetch 20 rows and return them to the agent, which then performs whatever transformations or calculations the user requested.

I tried changing the limit to 20,000 (the same as the total number of records), and the process became very laggy.

I thought the limit would apply per query, so there would be no need to fetch all the records. For example, with the query below:

SELECT *
FROM sales_data
ORDER BY sales_amount DESC
LIMIT 1;

Are there any other alternatives? Thanks.


I found a workaround: I created a function on the Postgres side and call it with an HTTP GET node.
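For anyone else landing here, a function along these lines would do the aggregation inside Postgres so only one row ever travels back to the agent. This is only a sketch based on the query shown earlier in the thread; the table and column names (sales_data, sales_amount) come from that query, and the id column is assumed:

```sql
-- Hypothetical sketch: push the "highest sale" computation into Postgres
-- so the HTTP GET node (or an RPC call) receives a single row, not 20,000.
CREATE OR REPLACE FUNCTION top_sale()
RETURNS TABLE (id int, sales_amount numeric) AS $$
  SELECT id, sales_amount
  FROM sales_data
  ORDER BY sales_amount DESC
  LIMIT 1;
$$ LANGUAGE sql STABLE;
```

On Supabase, a function like this should be callable over REST via the rpc endpoint, which is presumably what the HTTP GET node is hitting.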


Yeah, you don’t want 20,000 rows coming back; you want to do as much as possible on the Postgres side rather than processing everything in the LLM, as it will be faster and cheaper to use Postgres’s optimized searching and filtering mechanisms. As an alternative to running pre-defined stored procedures on Postgres, I believe you could also send a query via an API, or use PostgREST.
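With PostgREST you would not even need a stored function for simple cases, since ordering and limiting can be expressed in the URL. A rough sketch, assuming a default PostgREST instance on port 3000 and the table name from earlier in the thread:

```shell
# Hypothetical: top sale via PostgREST query parameters, no function needed
curl "http://localhost:3000/sales_data?order=sales_amount.desc&limit=1"

# Hypothetical: call a saved Postgres function through the /rpc/ endpoint
curl "http://localhost:3000/rpc/top_sale"
```

Either way the filtering happens on the database side and the agent only sees the final answer.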
