Local Chat with AI Agent - Ollama and Qdrant

Hey everyone,

I’m planning to build a chat agent using Ollama, with associated data stored in a Qdrant database.

I’m currently testing the following workflow: RAG – AI Agent.

However, I’m not receiving any data or information from the Qdrant database. In the Qdrant Vector Store node, I noticed that the fields “pageContent” and “metadata” are empty, while in the Qdrant database the field is named “text”.

My question is: Is this workflow approach correct, and how can I modify the mapping?
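Before changing any mapping, it can help to confirm what field names your collection actually stores. A minimal sketch, assuming a local Qdrant on port 6333 and a placeholder collection name (`my-collection` is not from your setup), using Qdrant's REST "scroll" endpoint to fetch a few raw points and print their payload keys:

```javascript
// Build the request body for Qdrant's points/scroll endpoint, which returns
// raw points including their payload, so the stored field names are visible.
function buildScrollRequest(limit) {
  return {
    limit: limit,        // how many points to fetch
    with_payload: true,  // include payload fields (e.g. "text")
    with_vector: false   // the vectors themselves are not needed here
  };
}

// Example usage (replace host and collection name with your own):
// const res = await fetch("http://localhost:6333/collections/my-collection/points/scroll", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildScrollRequest(3)),
// });
// const data = await res.json();
// data.result.points.forEach(p => console.log(Object.keys(p.payload)));
```

If the printed keys show only `text`, that confirms the mismatch with the `pageContent`/`metadata` fields the node expects.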

Information on your n8n setup

  • n8n version: 1.79.3 (Community Edition)
  • Database (default: SQLite): qdrant (v1.13.4)
  • n8n EXECUTIONS_PROCESS setting (default: own, main): main
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker Desktop
  • Operating system: Windows 2022 with WSL

Thx for helping!

br,
LTJ

@LTJ I'd like to know: are you using the LangChain format for your Qdrant collection, or a custom schema?

Have you tried the following? The field mapping issue occurs because n8n’s Qdrant node expects the LangChain format (pageContent and metadata), but your database uses text.

Add a Function node before the Qdrant Vector Store node to map your fields:

// Map from the 'text' field to the LangChain-expected format
return items.map(item => ({
  json: {
    query: item.json.query,
    vectors: (item.json.vectors || []).map(vec => ({
      pageContent: vec.text || "",  // map 'text' to 'pageContent'
      metadata: vec.metadata || {}  // keep existing metadata, or create an empty object
    }))
  }
}));

This converts your Qdrant data format to the structure the RAG workflow expects.
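If each Qdrant point instead arrives as its own n8n item with a flat json.text field (an assumption about your data shape, not something confirmed above), the mapping would look like this; adjust the field names to whatever your collection actually stores:

```javascript
// Sketch for a flat item shape: every incoming item carries the Qdrant
// payload directly, e.g. { text: "...", metadata: { source: "doc.pdf" } }.
function toLangChainFormat(items) {
  return items.map(item => ({
    json: {
      pageContent: item.json.text || "",  // rename 'text' -> 'pageContent'
      metadata: item.json.metadata || {}  // keep existing metadata, or start empty
    }
  }));
}

// In an n8n Code node, the last expression returns the new items:
// return toLangChainFormat(items);
```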

Hello Yo_its_praktash,

big thanks for helping and the explanation!

I created the data (PDFs) for Qdrant with vectorize.io - but I will try a custom schema - thanks for the tip!

As for adding a Function node before the Qdrant Vector Store node - unfortunately I haven’t found a way to insert this node and connect it correctly :frowning:

thx,
LTJ

1 Like