Has anyone else had this problem? I have a very simple workflow that uses a Message a Model OpenAI node and a connected Pinecone database, but when the OpenAI node queries the embedded OpenAI model on the Pinecone node, it just responds with "I don't know". I know the right data is in the Pinecone vector store, so I don't understand why it won't use it. I've been trying to tweak and troubleshoot this for so long but can't figure it out. Would appreciate any help!
It looks like your topic is missing some important information. Could you provide the following, if applicable?
- n8n version:
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app):
- Operating system:
Here is more information. Hopefully someone can help, I have a feeling I am close.
- n8n version: Latest - Stable
- Database (default: SQLite): Pinecone
- n8n EXECUTIONS_PROCESS setting (default: own, main): Default
- Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
- Operating system: Mac 14.4.1
- Last output:
Hey @Koga,
Can you be more specific about what it is responding with? It may be worth looking at this chat example to see if you have more luck with this sort of approach: Chat with PDF docs using AI (quoting sources) | n8n workflow template
Yes definitely!
When I call the workflow, the Pinecone Vector Store subnode outputs the correct data, but for some reason the Vector Store Tool isn't receiving that data. The embedded OpenAI model responds with "I don't know", and my Message a Model node responds the same, but sometimes makes up an answer.
Is there a way to share my executions with you?
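For anyone following along, the behavior described above matches a retrieval pipeline where the retrieval step returns an empty (or filtered-out) result set, so the model has no context to answer from. This is not n8n's internal code, just a minimal Python sketch of what a vector-store tool does conceptually, using toy 3-dimensional vectors in place of real OpenAI embeddings and hypothetical helper names (`retrieve`, `answer`):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, store, top_k=2, min_score=0.0):
    # Rank stored (vector, text) pairs by similarity and keep the top k
    # that clear the score threshold.
    scored = sorted(store, key=lambda item: cosine(query_vec, item[0]), reverse=True)
    return [text for vec, text in scored[:top_k]
            if cosine(query_vec, vec) >= min_score]

def answer(query_vec, store):
    context = retrieve(query_vec, store)
    if not context:
        # With no retrieved context, the model can only say it doesn't know.
        return "I don't know"
    return "Answer based on: " + "; ".join(context)

# Toy "embeddings" standing in for the vector store's contents.
store = [
    ([1.0, 0.0, 0.0], "Doc about billing"),
    ([0.0, 1.0, 0.0], "Doc about shipping"),
]

print(answer([0.9, 0.1, 0.0], store))  # retrieval succeeds, answer uses context
print(answer([0.9, 0.1, 0.0], []))     # retrieval returns nothing -> "I don't know"
```

The point of the sketch: if the Vector Store Tool hands the model an empty context (because of a too-strict score threshold, a namespace/index mismatch, or a mismatched embedding model between indexing and querying), "I don't know" is exactly what you'd expect, even though the data exists in the store.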