How to insert a vector store retriever module between "chat message received" and "ai agent"?

Describe the problem/error/question

Hi, I’m not sure who can help me with this problem.
How do I insert a Vector Store Retriever node between “chat message received” and “AI Agent”?

When building an AI Agent workflow, I call some MCP services, but before that I need to do some RAG-style information augmentation (adding retrieved information to the context, rather than acting on the LLM’s output).

Is there a way to integrate the Vector Store Retriever before the AI Agent? (Or alternatively, is there a way to suppress the LLM module’s output?) Thanks for any answers.

Information on your n8n setup

  • n8n version: 1.93.0
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker

Search for the vector store tool directly in the nodes list.

Then you can plug any of the available options in as a main node before your chatbot.
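If the vector store node returns the retrieved documents as items, a small Code node placed between the retriever and the AI Agent can fold them into the chat input. The sketch below is illustrative only: the helper name `buildAugmentedPrompt` and the `pageContent` field are assumptions for this example, not APIs confirmed anywhere in this thread.

```javascript
// Hypothetical context-merging logic for an n8n Code node that sits
// between a vector store retriever and the AI Agent node.
// All names here are assumptions chosen for illustration.
function buildAugmentedPrompt(userMessage, retrievedDocs) {
  // Number each retrieved chunk so the model can refer to it.
  const context = retrievedDocs
    .map((doc, i) => `[${i + 1}] ${doc.pageContent}`)
    .join('\n');
  // Prepend the retrieved context to the user's question so it reaches
  // the agent as input context, not as something the LLM must produce.
  return `Use the following context when answering:\n${context}\n\nQuestion: ${userMessage}`;
}

// Example usage with made-up documents:
const prompt = buildAugmentedPrompt('What is our refund policy?', [
  { pageContent: 'Refunds are issued within 14 days of purchase.' },
  { pageContent: 'Digital goods are non-refundable.' },
]);
console.log(prompt);
```

The AI Agent node would then receive the augmented string as its input text, which matches the goal in the question: enriching the context before the agent runs, without depending on LLM output.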


Thanks for your help, Wouter. :grin:


You can explore what other options are available, but you can find the one above by:

Clicking the Plus (add node) button → AI → Other AI Nodes → Vector Stores


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.