Making Ollama Chat Model ONLY Form Answers from Browsing Vector Store

Problem:

I’m making a “chatbot” with Ollama (llama3.2) that lets the user upload a PDF file, has its contents extracted and converted to embeddings (using Nomic), and stores those embeddings in a Supabase Vector Store.

I got the Vector Store part working, but I want the chatbot to answer questions about the document EXCLUSIVELY by browsing the stored vectors: no extra fluff, hallucinations, or generic everyday answers. Just straight answers.

How can I achieve that?

Workflow:

n8n Setup

  • n8n version: 1.76.1
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker Desktop
  • Operating system: Windows 10 (n8n running locally in an Alpine Linux container)

You usually just have to be extremely explicit in the system prompt that the model may only use the provided vector store.

I would improve the system prompt considerably, and/or try providing the retrieved context in a user message instead.
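
As a starting point, here is a minimal sketch of a restrictive system prompt you could paste into the agent/chat node's system message. The exact wording and the refusal phrase are assumptions, so adjust them to your document and tone:

```
You are a document question-answering assistant.
Answer ONLY using the context retrieved from the vector store tool.
If the retrieved context does not contain the answer, reply exactly:
"I could not find that in the uploaded document."
Do not use prior knowledge, do not guess, and do not add commentary.
Quote or paraphrase only what appears in the retrieved context.
```

Lowering the sampling temperature (e.g. towards 0) in the Ollama Chat Model node's options, if your node version exposes it, should also help llama3.2 stick to these instructions instead of improvising.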