Getting an out-of-memory error in a Pinecone-based RAG Agent workflow

Hi All,

I have created a RAG Agent workflow that imports PDF files from Google Drive and saves them in a Pinecone vector database. It worked well for the first 60-70 PDF files, but when I tried to import more, I got this error: ‘Length too large, found 6223128 Bytes and allowed limit is 4194304’. To work around it, I reduced the ‘Embedding Batch Size’ from 200 to 50, which got the next 14 files through, but after that not even a single file is processed. I also tried 30 and 25, but neither worked.
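
In case it helps to show what I think is going on: each 3072-dimension embedding (index config below) is about 12 KB of raw float data, and noticeably more once it is serialized together with its metadata and chunk text, so a batch of 200 chunks apparently exceeds the 4194304-byte limit from the error. Below is a rough TypeScript sketch of what I imagine "batch by payload size instead of by count" would look like using the @pinecone-database/pinecone client. This is only my guess at a workaround, not what the n8n Pinecone node actually does, and the size estimate is approximate:

```typescript
// Rough sketch: keep each Pinecone upsert request under the ~4 MiB limit
// reported in the error by batching on estimated payload size, not record count.
import { Pinecone, type PineconeRecord } from '@pinecone-database/pinecone';

const MAX_REQUEST_BYTES = 4 * 1024 * 1024; // 4194304, the limit from the error message

// Approximate serialized size of one record (3072 floats + metadata/chunk text).
const recordBytes = (r: PineconeRecord) => Buffer.byteLength(JSON.stringify(r));

function batchBySize(records: PineconeRecord[], maxBytes = MAX_REQUEST_BYTES): PineconeRecord[][] {
  const batches: PineconeRecord[][] = [];
  let current: PineconeRecord[] = [];
  let currentBytes = 0;
  for (const record of records) {
    const size = recordBytes(record);
    // Start a new batch whenever adding this record would push us over the limit.
    if (current.length > 0 && currentBytes + size > maxBytes) {
      batches.push(current);
      current = [];
      currentBytes = 0;
    }
    current.push(record);
    currentBytes += size;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}

async function upsertInBatches(records: PineconeRecord[]): Promise<void> {
  const pc = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });
  const index = pc.index('n8nproject'); // my existing index
  for (const batch of batchBySize(records)) {
    await index.upsert(batch); // each request should now stay under the size limit
  }
}
```

Is something along these lines possible in a Code node, or does the Pinecone Vector Store node already handle this internally?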

Existing Pinecone Vector DB config -

Index name - n8nproject
Dimensions - 3072
Metric - cosine
Capacity mode - Serverless
Type - Dense

Embedding Model - Text-Embedding-3-Large
Total records in the index so far - 3336

  • n8n version: 1.85.4
  • Database (default: SQLite): default
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
  • Operating system: Windows 11 Pro

https://jaitltus.app.n8n.cloud/workflow/hOGEFaHvwV0SuoY4

I'm not sure how to get the workflow code to share it here.