Describe the problem/error/question
I'm building a chatbot that pulls docs from Confluence via the API and loads them into Pinecone, using OpenAI for embeddings. It all works great, BUT if I ask the chatbot to return links to the source documents, it returns hallucinated URLs. I have a 'url' metadata field on my Pinecone vectors, but that doesn't seem to help.
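For context, the ingestion runs in a separate n8n workflow (not shown here); the sketch below just illustrates the record shape I'm aiming for in Pinecone, with the page URL stored as 'url' metadata. The embedding model (text-embedding-3-small at 1024 dimensions), the sample chunk, and the example URL are placeholders/assumptions, not the actual ingestion code:

```python
# Sketch of the intended Pinecone record shape (not the real ingestion code).
from openai import OpenAI
from pinecone import Pinecone

oai = OpenAI()
pc = Pinecone(api_key="...")
index = pc.Index("sublime-eucalyptus")

def embed(text: str) -> list[float]:
    # 1024-dim embeddings to match the Embeddings OpenAI node in the workflow;
    # assuming text-embedding-3-small here.
    resp = oai.embeddings.create(model="text-embedding-3-small",
                                 input=text, dimensions=1024)
    return resp.data[0].embedding

# Placeholder chunk: (id, text, source page URL)
chunks = [
    ("page-123-chunk-0",
     "Vehicle management lets admins add, edit and retire vehicles...",
     "https://example.atlassian.net/wiki/spaces/PROD/pages/123"),
]

index.upsert(
    vectors=[
        {
            "id": chunk_id,
            "values": embed(text),
            # the 'url' field is what I want the agent to cite
            "metadata": {"text": text, "url": page_url},
        }
        for chunk_id, text, page_url in chunks
    ],
    namespace="product-sensei",
)
```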
What is the error message (if any)?
Please share your workflow
{
  "nodes": [
    {
      "parameters": {
        "public": true,
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.chatTrigger",
      "typeVersion": 1.1,
      "position": [
        0,
        0
      ],
      "id": "67347d0b-c77d-4778-9f6b-ad902ceddb3a",
      "name": "When chat message received",
      "webhookId": "eeca15fc-1795-43be-b5da-b0b344423b2f"
    },
    {
      "parameters": {
        "options": {
          "systemMessage": "Tone:\n- Maintain a supportive, conversational and understanding tone to make the user feel valued.\n- Your responses must be concise but descriptive, ideally sufficient content to answer the question and provide context, but not overly verbose. \n\nContent\n- if asked for an overview of features, be sure to include vehicle management, driver management, along with all the other features, including those from the Admin Page. These are in your documentation separated by pages. \n\nReferencing:\n- provide the url information for any relevant documents found in the 'url' metadata field"
        }
      },
      "type": "@n8n/n8n-nodes-langchain.agent",
      "typeVersion": 2,
      "position": [
        252,
        0
      ],
      "id": "c8969691-e322-4d35-9bc7-829118f22e35",
      "name": "AI Agent"
    },
    {
      "parameters": {
        "model": {
          "__rl": true,
          "value": "o3-mini",
          "mode": "list",
          "cachedResultName": "o3-mini"
        },
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
      "typeVersion": 1.2,
      "position": [
        220,
        220
      ],
      "id": "0b5be9f5-d543-4c44-8975-7b7d0e89e12e",
      "name": "OpenAI Chat Model",
      "credentials": {
        "openAiApi": {
          "id": "fHVRzMh4bOAVFg84",
          "name": "OpenAi account"
        }
      }
    },
    {
      "parameters": {},
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "typeVersion": 1.3,
      "position": [
        340,
        220
      ],
      "id": "073f997d-b8e2-441f-b807-24ea34c61210",
      "name": "Simple Memory"
    },
    {
      "parameters": {
        "mode": "retrieve-as-tool",
        "toolName": "=confluence_product",
        "toolDescription": "Product space",
        "pineconeIndex": {
          "__rl": true,
          "value": "sublime-eucalyptus",
          "mode": "list",
          "cachedResultName": "sublime-eucalyptus"
        },
        "options": {
          "pineconeNamespace": "product-sensei"
        }
      },
      "type": "@n8n/n8n-nodes-langchain.vectorStorePinecone",
      "typeVersion": 1.2,
      "position": [
        460,
        222.5
      ],
      "id": "3f392271-7dad-4b1d-ac80-d755a05aec4a",
      "name": "Pinecone Vector Store2",
      "credentials": {
        "pineconeApi": {
          "id": "wSyNwkjPmijxVJoc",
          "name": "PineconeApi account"
        }
      }
    },
    {
      "parameters": {
        "options": {
          "dimensions": 1024
        }
      },
      "type": "@n8n/n8n-nodes-langchain.embeddingsOpenAi",
      "typeVersion": 1.2,
      "position": [
        460,
        380
      ],
      "id": "f6e299bf-ff17-4366-b82e-f2083fc3ea3e",
      "name": "Embeddings OpenAI",
      "credentials": {
        "openAiApi": {
          "id": "fHVRzMh4bOAVFg84",
          "name": "OpenAi account"
        }
      }
    }
  ],
  "connections": {
    "When chat message received": {
      "main": [
        [
          {
            "node": "AI Agent",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "OpenAI Chat Model": {
      "ai_languageModel": [
        [
          {
            "node": "AI Agent",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "Simple Memory": {
      "ai_memory": [
        [
          {
            "node": "AI Agent",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "Pinecone Vector Store2": {
      "ai_tool": [
        [
          {
            "node": "AI Agent",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "Embeddings OpenAI": {
      "ai_embedding": [
        [
          {
            "node": "Pinecone Vector Store2",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    }
  },
  "pinData": {},
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "b5b4dc2bce3c9ee12c06b94bfac732bb8870024500a0817c997fd3fa41af40bf"
  }
}
Share the output returned by the last node
Information on your n8n setup
- n8n Cloud
- version: [email protected]
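To double-check that the 'url' metadata is actually on the vectors, this is roughly how I'd verify it outside n8n (Python sketch; the embedding model and placeholder query are assumptions, index and namespace names taken from the workflow above):

```python
# Query the same index/namespace directly and print the stored url metadata.
from openai import OpenAI
from pinecone import Pinecone

oai = OpenAI()
pc = Pinecone(api_key="...")
index = pc.Index("sublime-eucalyptus")

query = "How does driver management work?"  # placeholder question
vec = oai.embeddings.create(model="text-embedding-3-small",
                            input=query, dimensions=1024).data[0].embedding

results = index.query(
    vector=vec,
    top_k=4,
    include_metadata=True,
    namespace="product-sensei",
)
for match in results.matches:
    # if the ingestion worked, each match should carry the real page URL here
    print(match.score, match.metadata.get("url"))
```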