Using MongoDB with an LLM

Describe the problem/error/question

I'm new to using AI, especially within n8n. I'm able to get this working by loading sample data into Supabase, but we're planning on loading our data into MongoDB. With Supabase it's just an agent that builds a query for me.

What route would I take to do something similar with Mongo?

What is the error message (if any)?

Subflow:

Parent flow

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 1.34.2
  • Database (default: SQLite): none
  • n8n EXECUTIONS_PROCESS setting (default: own, main): n/a
  • Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
  • Operating system: none

I guess what I'm needing to do is use the AI Agent prompt to build a JSON query that looks something like this, and then call the MongoDB integration with it, but it looks like the node is only able to find the collection itself:

{
  "find": "clients",
  "filter": {
    "Client_First_Name__c": "Test",
    "Client_Last_Name__c": "Tester"
  },
  "projection": {
    "Client_Email__c": 1
  }
}
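
In case it helps to make this concrete: one possible approach (a sketch, not an official n8n feature) is to have the AI Agent emit a JSON command shaped like the one above, then parse it in a Code node before handing the filter to the MongoDB node. The `parseAgentQuery` helper below is hypothetical; it assumes the agent's raw text may be wrapped in markdown fences:

```javascript
// Parse the AI Agent's text output into a MongoDB-style command object.
// Agents often wrap JSON in markdown code fences, so strip those first.
function parseAgentQuery(agentOutput) {
  const cleaned = agentOutput
    .replace(/^```(?:json)?\s*/i, "")
    .replace(/\s*```$/, "")
    .trim();
  const command = JSON.parse(cleaned);
  // Fall back to empty objects so downstream nodes always get valid input.
  return {
    collection: command.find,
    filter: command.filter ?? {},
    projection: command.projection ?? {},
  };
}

// Example using the query shape from the post above.
const sample =
  '```json\n{"find":"clients","filter":{"Client_First_Name__c":"Test"},"projection":{"Client_Email__c":1}}\n```';
console.log(parseAgentQuery(sample).collection); // "clients"
```

The parsed `filter` and `projection` objects could then be referenced by expression in the MongoDB node's query fields.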

Hey @Philip_Wiggins,

The tricky bit here is that I don't think we support MongoDB as an option for the models / embeddings / memory, but in theory, if you don't want to chat with your MongoDB data, you should be able to take the output of the AI Agent and use it with the normal MongoDB node.
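
As a rough sketch of that first suggestion (the helper and field names here are assumptions, not part of n8n): a Code node between the AI Agent and the MongoDB node could check that the agent-built filter only touches fields you know exist, so a hallucinated field name fails fast instead of silently matching nothing:

```javascript
// Hypothetical whitelist check before handing an agent-built filter
// to the MongoDB node: reject any field the agent invented.
const ALLOWED_FIELDS = new Set([
  "Client_First_Name__c",
  "Client_Last_Name__c",
  "Client_Email__c",
]);

function validateFilter(filter) {
  const unknown = Object.keys(filter).filter((f) => !ALLOWED_FIELDS.has(f));
  if (unknown.length > 0) {
    throw new Error(`Agent used unknown fields: ${unknown.join(", ")}`);
  }
  return filter;
}

// Passes: both fields are in the whitelist.
validateFilter({ Client_First_Name__c: "Test", Client_Last_Name__c: "Tester" });
```

This kind of guard is optional, but it's cheap insurance when query text comes from a model rather than from code you wrote.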

The more complicated way, if you wanted to chat with the data in MongoDB, would be to make a workflow that you trigger as a tool. We cover part of this in our getting started with AI guide here: Tutorial: Build an AI workflow in n8n | n8n Docs

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.