Troubleshooting Pinecone Partial Insert: 2583 Items In, Only 384 Records Out

Describe the problem/error/question

My n8n workflow is designed to ingest a structured Google Doc into Pinecone for a RAG system. The workflow successfully reads the document and chunks it into 2,583 separate items, which are then passed to the Insert Into Pinecone node.

The problem is that a massive number of these items are lost during the final step. Although 2,583 items go in, the Embeddings Google Gemini sub-node only processes 13 items, and only 384 records are ultimately saved to my Pinecone namespace. The workflow completes without a visible error, but it fails to insert the vast majority of the data. I suspect it might be related to how sub-nodes handle multiple items, but the partial success is confusing.

What is the error message (if any)? No error message.

Please share your workflow

Share the output returned by the last node

The last node is the Pinecone Vector Store node with 2,584 items, so its output is too long to paste here.

Information on your n8n setup

  • n8n version: 1.114.4
  • Database (default: SQLite): SQLite
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Ubuntu

It is likely that your workflow is only processing 13 items because the combined tokens of those items hit the model’s maximum token limit for a single request, and the remaining items are silently ignored rather than raising an error.

To ensure all items are processed, you should loop over your items and send them to the embedding node in smaller batches or one at a time, staying well below the token limit for each request.
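To make the batching idea concrete, here is a minimal sketch in Python of how chunks can be grouped so each embedding request stays under a token budget. The `estimate_tokens` heuristic (~4 characters per token) and the 8,000-token budget are assumptions for illustration, not values taken from the Gemini API; in n8n this grouping would be done by the Loop Over Items node rather than custom code.

```python
def estimate_tokens(text):
    # Rough heuristic: ~4 characters per token for English text.
    return max(1, len(text) // 4)

def batch_by_tokens(chunks, max_tokens_per_request=8000):
    """Yield lists of chunks whose combined estimated tokens stay under the budget."""
    batch, batch_tokens = [], 0
    for chunk in chunks:
        t = estimate_tokens(chunk)
        if batch and batch_tokens + t > max_tokens_per_request:
            yield batch
            batch, batch_tokens = [], 0
        batch.append(chunk)
        batch_tokens += t
    if batch:
        yield batch

# Example: 2,583 small chunks get split into many safe batches,
# and no chunk is silently dropped along the way.
chunks = ["some chunk text " * 50] * 2583   # ~200 estimated tokens each
batches = list(batch_by_tokens(chunks))
```

The key property is that every input chunk lands in exactly one batch, so partial inserts like the 2,583-in / 384-out behavior above cannot happen at this step.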


Thank you so much for your suggestion; that’s very helpful! I want to implement your advice of using a loop, but I’m struggling with exactly where to place it and how to connect it within my existing workflow to solve the issue.

Can you provide an example of how the loop should be integrated directly into my workflow?

Try this

Connect the output of Aggregate to the input of Loop Over Items, then connect the "loop" output of Loop Over Items to your embedding/insert step. Wire the last node of that processing branch back into Loop Over Items so the next batch starts, and use the "done" output for anything that should run after all batches finish. The Loop Over Items node will handle the iteration.
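In the exported workflow JSON, the Loop Over Items node is of type `n8n-nodes-base.splitInBatches`. A rough fragment (not a complete workflow export; the `batchSize` of 200 is an assumed value you would tune to stay under the token limit) looks like this:

```json
{
  "name": "Loop Over Items",
  "type": "n8n-nodes-base.splitInBatches",
  "parameters": {
    "batchSize": 200
  }
}
```

Smaller batch sizes mean more requests but keep each embedding call well below the model's per-request token limit.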


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.