Before I give up completely on n8n (Incomplete / inaccurate answer)

Hi y’all,

How do you get the AI to read all the data and actually give a whole, complete list as an answer, according to the database?

It seems like the chatbot I’ve created is not able to return complete data when asked to list items at a certain price point.

For context, the original database is a Google Sheet that has been imported into Supabase, with only one column for the object and one column for the price, for a total of 2,059 rows. So I don’t know if this is a limitation of the platform (either n8n or Supabase), of my prompt, or of OpenAI.

I have attached a couple of cases that you can take a look at to see what I mean, as well as the workflow.

I’m not from a technical background, so I have no idea what to fill in for this n8n version info. All I know is that my n8n is self-hosted (I got help from a friend).

  • n8n version: 1.79.3, maybe
  • Database (default: SQLite): Supabase with PostgreSQL as memory
  • n8n EXECUTIONS_PROCESS setting (default: own, main): not sure
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Windows 8.1

Bro, I think the main problem may be that you’re not using the optimal toolset for what you want to achieve.
If you have a database with products and prices, putting them in a vector DB and querying with AI alone will not produce exact results, as you have already seen for yourself.

Keep in mind that:

  • Vector Databases Are Designed for Similarity, Not Exact Matching
  • Product Prices Require Precision, Not Approximation

I recommend adding an SQL tool for querying your product data, as this will give much better results.
A relational database (e.g. PostgreSQL or MySQL) is better suited for storing structured product data where exact lookups are needed.
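To make that concrete, here is a minimal sketch of the kind of exact query I mean, run directly against the Postgres database that sits under your Supabase project with node-postgres. The table and column names (`products`, `object`, `price`) and the `SUPABASE_DB_URL` variable are just my guesses from your description, so adjust them to your actual schema:

```typescript
// Minimal sketch (not your exact n8n setup): exact price filtering in SQL
// against the Supabase Postgres database, using node-postgres ("pg").
import { Client } from "pg";

async function listItemsAtOrBelow(maxPrice: number) {
  // Connection string from the Supabase project settings (assumed env var).
  const client = new Client({ connectionString: process.env.SUPABASE_DB_URL });
  await client.connect();
  try {
    // The filtering happens in SQL, so every matching row is returned,
    // unlike a top-k vector similarity search that silently drops rows.
    const result = await client.query(
      "SELECT object, price FROM products WHERE price <= $1 ORDER BY price",
      [maxPrice]
    );
    return result.rows;
  } finally {
    await client.end();
  }
}
```

The same idea applies inside n8n: give the agent a database tool that runs a query like this, instead of expecting the vector store to "remember" all 2,059 rows.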

Let me know if you need support with this 🙂


Hey man, thank you for your answer.

I see the problem now. Do you think adding an extra tool such as Airtable or Google Sheets would help?

To be honest, I would not recommend using Google Sheets as a database substitute, as you may run into various problems (e.g. rate limits), especially when working with larger datasets.

Airtable might well be worth a try, as you can easily query your data in a structured way. You may need to play around a bit with using multiple tools together and/or passing the agent’s output to another LLM node for further refinement of the final answer, as sketched below.
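Outside of n8n, that "query first, then let the model phrase the answer" pattern looks roughly like this with the OpenAI Node SDK. `listItemsAtOrBelow` is the hypothetical helper from the earlier sketch, and the model name is only an example:

```typescript
// Rough sketch of passing exact query results to a second LLM step for
// formatting. The helper below is the hypothetical function from the
// previous sketch, declared here so the file type-checks on its own.
import OpenAI from "openai";

declare function listItemsAtOrBelow(
  maxPrice: number
): Promise<Array<{ object: string; price: number }>>;

async function answerPriceQuestion(maxPrice: number): Promise<string> {
  // 1. The complete, exact data comes from SQL, not from the model's memory.
  const rows = await listItemsAtOrBelow(maxPrice);

  // 2. The model only rephrases the rows it is given, so nothing is invented or dropped.
  const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: "List every item exactly as given. Do not omit or add rows." },
      { role: "user", content: `Items priced at or below ${maxPrice}:\n${JSON.stringify(rows)}` },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```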


FWIW, I have been quite pleased with NocoDB. It’s based on SQLite, which might be a disadvantage for large amounts of data, but it has the giant advantage (to me) that you can back up or move it to a different server simply by copying the files.

Regarding n8n in general, I’ve had to work through some quirks but once I get a flow working, it tends to be rock solid.

@RichardC, actually NocoDB supports various database systems, including MySQL, PostgreSQL, Microsoft SQL Server, SQLite, Amazon Aurora, and MariaDB.
I personally use PostgreSQL in my instance.

It’s a good solution for someone without a technical background. @bimabima, just ask your friend once more to help you install it alongside your n8n instance. There are tons of NocoDB guides on YouTube.


Got it, I’ll take a look at Airtable and see if it could work for 2k rows of data.

I’ll definitely contact you if I need to discuss this further.

Thanks once again mate


Thanks for the pointers, @RichardC and @Ruslan_Yanyshyn. I’ll take a look at NocoDB and PostgreSQL and see what I can come up with.

