Question and Answer Chain node is not using the Vector Store Retriever results

Describe the problem/error/question

I’m using a simple workflow with a “Question and Answer Chain” node that is connected to a Pinecone vector store and has a chat message input.
My problem is that I’m getting results from the vector store, but they don’t seem to be used as context for my chat model.

Please share your workflow

{
  "nodes": [
    {
      "parameters": {
        "model": "text-embedding-ada-002",
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.embeddingsOpenAi",
      "typeVersion": 1.2,
      "position": [120, 620],
      "id": "24a95124-29b6-4c77-95cc-49389c72935d",
      "name": "Embeddings OpenAI",
      "credentials": {
        "openAiApi": {
          "id": "SO1G7Uhoa1vePIWz",
          "name": "OpenAi account"
        }
      }
    },
    {
      "parameters": {
        "pineconeIndex": {
          "__rl": true,
          "value": "c92270a6-5826-4d82-8cae-fd95628ab1a1-requests",
          "mode": "list",
          "cachedResultName": "c92270a6-5826-4d82-8cae-fd95628ab1a1-requests"
        },
        "options": {
          "pineconeNamespace": "c92270a6-5826-4d82-8cae-fd95628ab1a1-requests"
        }
      },
      "type": "@n8n/n8n-nodes-langchain.vectorStorePinecone",
      "typeVersion": 1,
      "position": [300, 420],
      "id": "e5ad4537-89f4-4f80-b9de-bb918e47e45c",
      "name": "Pinecone Vector Store",
      "credentials": {
        "pineconeApi": {
          "id": "WqIAfFw3vpo6R6Jn",
          "name": "PineconeApi account"
        }
      }
    },
    {
      "parameters": {},
      "type": "@n8n/n8n-nodes-langchain.retrieverVectorStore",
      "typeVersion": 1,
      "position": [340, 260],
      "id": "9b484b04-bad2-419a-9ade-2641a6cfcd43",
      "name": "Vector Store Retriever"
    },
    {
      "parameters": {
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.chatTrigger",
      "typeVersion": 1.1,
      "position": [20, -20],
      "id": "806088fa-ac93-4bb1-b7ff-df360cb163b7",
      "name": "When chat message received",
      "webhookId": "a73857fd-6d06-4b06-a44b-c1b0059c2a9d"
    },
    {
      "parameters": {
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.chainRetrievalQa",
      "typeVersion": 1.4,
      "position": [240, -20],
      "id": "2e8e153c-a5c9-4d34-b053-ca1f7d782986",
      "name": "Question and Answer Chain"
    },
    {
      "parameters": {
        "model": {
          "__rl": true,
          "value": "gpt-4o",
          "mode": "list",
          "cachedResultName": "gpt-4o"
        },
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
      "typeVersion": 1.2,
      "position": [160, 200],
      "id": "6a6b17cb-40f2-4bb5-aefe-9116adc17338",
      "name": "OpenAI Chat Model",
      "credentials": {
        "openAiApi": {
          "id": "SO1G7Uhoa1vePIWz",
          "name": "OpenAi account"
        }
      }
    }
  ],
  "connections": {
    "Embeddings OpenAI": {
      "ai_embedding": [
        [
          {
            "node": "Pinecone Vector Store",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    },
    "Pinecone Vector Store": {
      "ai_vectorStore": [
        [
          {
            "node": "Vector Store Retriever",
            "type": "ai_vectorStore",
            "index": 0
          }
        ]
      ]
    },
    "Vector Store Retriever": {
      "ai_retriever": [
        [
          {
            "node": "Question and Answer Chain",
            "type": "ai_retriever",
            "index": 0
          }
        ]
      ]
    },
    "When chat message received": {
      "main": [
        [
          {
            "node": "Question and Answer Chain",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "OpenAI Chat Model": {
      "ai_languageModel": [
        [
          {
            "node": "Question and Answer Chain",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    }
  },
  "pinData": {},
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "c52aee3b038b10b2a3e35f4e2636449ac19d523af47b48b8ff439fc46ea9ac62"
  }
}

Share the output returned by the last node

Information on your n8n setup

  • n8n version: n8n cloud

It looks like your topic is missing some important information. Could you provide the following if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @simbel
I tested your workflow on my Cloud version (1.79.3) and it seems to work.
I only had to remove the “Pinecone Namespace” option, as I only have the default namespace in Pinecone.

A couple of questions:

  • which version of n8n are you on?
  • what do you see in the logs of the “Question and Answer Chain” node? You should see the data retrieved from Pinecone passing through the Vector Store and then to the model.

Hi @giulioandreini ,

Thank you for your help.

These are my n8n details:

core

  • n8nVersion: 1.79.3
  • platform: docker (cloud)
  • nodeJsVersion: 20.18.2
  • database: sqlite
  • executionMode: regular

I have results coming from Pinecone, but they are not used by the OpenAI Chat Model to answer the user’s question.

The pageContent field there is empty, and that’s where the data from Pinecone should be:


Could you try to remove the Pinecone Namespace option and see if it works?

I attach my workflow here for reference:

@giulioandreini

The problem is that in my indexed documents I didn’t put a “text” property in my metadata fields.
n8n seems to use this field for pageContent.
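
For reference, here is a minimal sketch of how a document could be upserted so the “text” metadata key is present. This assumes the official Pinecone and OpenAI Python clients; the index name, namespace, IDs and document text are placeholders, not values from my actual setup:

```python
# Sketch: upsert a document so the n8n Pinecone retriever can populate pageContent.
# Index name, namespace, IDs and document text below are hypothetical placeholders.
from openai import OpenAI
from pinecone import Pinecone

openai_client = OpenAI()                        # reads OPENAI_API_KEY from the environment
pc = Pinecone(api_key="YOUR_PINECONE_API_KEY")
index = pc.Index("my-index")                    # hypothetical index name

doc_text = "The content you want the chain to use as context."

# Embed with the same model configured in the n8n workflow.
embedding = openai_client.embeddings.create(
    model="text-embedding-ada-002",
    input=doc_text,
).data[0].embedding

index.upsert(
    vectors=[
        {
            "id": "doc-1",
            "values": embedding,
            # The "text" metadata key is what n8n maps to pageContent.
            "metadata": {"text": doc_text, "source": "example"},
        }
    ],
    namespace="my-namespace",                   # or omit to use the default namespace
)
```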

Thank you again for your help


Hi @simbel
at the moment, our Pinecone implementation only supports “text” as the textKey for the content; it is not possible to use different property names.
I created an internal issue to fix that.
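
In the meantime, one possible workaround is to copy the existing metadata field into a “text” key on the records already in the index. Below is a rough sketch assuming the official Pinecone Python client, a serverless index (needed for `Index.list()`), and a hypothetical existing “content” metadata field; adjust the names to your setup:

```python
# Sketch of a workaround: mirror an existing metadata field (hypothetical "content")
# into "text" so n8n can pick it up as pageContent. Index, namespace and field
# names are placeholders.
from pinecone import Pinecone

pc = Pinecone(api_key="YOUR_PINECONE_API_KEY")
index = pc.Index("my-index")

# Index.list() yields batches of vector IDs (serverless indexes only).
for id_batch in index.list(namespace="my-namespace"):
    fetched = index.fetch(ids=id_batch, namespace="my-namespace")
    for vec_id, vector in fetched.vectors.items():
        metadata = vector.metadata or {}
        if "content" in metadata and "text" not in metadata:
            # Add the "text" key without re-uploading the vector values.
            index.update(
                id=vec_id,
                set_metadata={"text": metadata["content"]},
                namespace="my-namespace",
            )
```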

Hello @giulioandreini, @simbel

I am having the exact same problem, but I don’t understand how you solved it.
Could you explain please?

My retriever gets results (thanks to your explanations above => I removed the namespace), but the Question and Answer Chain does not get them back and doesn’t use them.

Thanks!!

My workflow

My vector store retriever gets this

And the (no) result

Hi @edriwing
you’re probably encountering the same issue I mentioned here:

If Pinecone is returning the data in a different field, then the Question and Answer Chain will not see it.
This is something that’s in our backlog.