Describe the problem/error/question
My n8n workflow is designed to ingest a structured Google Doc into Pinecone for a RAG system. The workflow successfully reads the document and chunks it into 2,583 separate items, which are then passed to the Insert Into Pinecone node.
The problem is that most of these items are lost during the final step. Although 2,583 items go in, the Embeddings Google Gemini sub-node reports processing only 13 items, and only 384 records are ultimately saved to my Pinecone namespace. The workflow completes without any visible error, yet the vast majority of the data is never inserted. I suspect it might be related to how sub-nodes handle multiple items, but the partial success is confusing.
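As a workaround I am considering splitting the items into fixed-size batches before the Pinecone insert (e.g. with a Code node feeding a loop). This is just a sketch of the batching logic in plain JavaScript, not something I have confirmed fixes the issue; the batch size of 100 and the item shape are assumptions:

```javascript
// Hypothetical helper: split the chunk items into fixed-size batches so
// each insert call receives a bounded number of items. In an actual n8n
// Code node the input would come from $input.all(); here we simulate it.
function toBatches(items, batchSize = 100) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// Simulated input: 2,583 items shaped like n8n items ({ json: ... }).
const items = Array.from({ length: 2583 }, (_, i) => ({ json: { id: i } }));
const batches = toBatches(items);

// 2,583 items at a batch size of 100 gives 26 batches: 25 full batches
// of 100 and a final batch of 83.
console.log(batches.length, batches[batches.length - 1].length);
```

If batching like this makes all 2,583 items land in Pinecone, that would point at the insert node (or its sub-nodes) mishandling large multi-item inputs rather than at the chunking step.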
What is the error message (if any)? No error message.
Please share your workflow
Share the output returned by the last node
The last node is the Pinecone Vector Store node with 2,584 items, so its output is too long to paste here.
Information on your n8n setup
- n8n version: 1.114.4
- Database (default: SQLite): SQLite
- n8n EXECUTIONS_PROCESS setting (default: own, main): own
- Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
- Operating system: Ubuntu