Describe the question
I want to implement a chatbot that talks to my documents, which are stored in a Pinecone database.
When the bot gives an answer, I want it to tell me the name(s) of the document(s) it used.
I’ve created a workflow with an AI Agent and a Vector Store Tool, but the answer produced by the Vector Store Tool’s model doesn’t use the metadata in its answer.
In his review (Review: Vector Store Tool (+ Langchain Code Alternative!)), @Jim_Le said:
- Additionally, if you use the metadata as part of the response and not only for filtering, then you may lose this information in the tool’s LLM response as well.
- It’s not possible to modify the generic prompt instructions of the tool’s LLM. This means a lack of post-retrieval processing before sending the tool’s response and possibly a fix for the above.
which seems to be my problem.
@Jim_Le kindly provided an alternative using the LangChain Code node, but that node is not available in the cloud version of n8n.
LangChain Code node documentation | n8n Docs
I also tried adding the filename to every chunk as <filename>my_file.pdf</filename> and asking the AI Agent to use it, but since the agent prompt is not passed to the Vector Store Tool’s model, this information never reaches the AI Agent’s chat model.
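For clarity, here is a minimal sketch of the post-retrieval behavior I am trying to achieve: keep the source filename (from metadata) attached to each retrieved chunk so the chat model can cite it. The match structure below is purely illustrative, not the actual Pinecone/n8n response shape.

```python
# Illustrative only: assume each retrieved match carries its source
# filename in metadata, as stored at ingestion time.
matches = [
    {"text": "Revenue grew 12% in Q3.", "metadata": {"filename": "report_q3.pdf"}},
    {"text": "Hiring slowed in Q3.", "metadata": {"filename": "hr_update.pdf"}},
]

# Prefix every chunk with its source so the information survives
# into the context sent to the chat model.
context_blocks = [
    f"[source: {m['metadata']['filename']}]\n{m['text']}" for m in matches
]
prompt_context = "\n\n".join(context_blocks)

# Collect the distinct source names to append to the final answer.
sources = sorted({m["metadata"]["filename"] for m in matches})

print(prompt_context)
print("Sources:", ", ".join(sources))
```

This is exactly the post-retrieval processing step that, per the review quoted above, the Vector Store Tool does not let me customize.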
My question is: how can I get the name of the source document(s) used to produce the final answer?
Please share your workflow
Information on your n8n setup
- n8n version: 1.60.0
- Database (default: SQLite): Pinecone
- n8n EXECUTIONS_PROCESS setting (default: own, main): ?
- Running n8n via (Docker, npm, n8n cloud, desktop app): cloud
- Operating system: cloud