How to change OpenAI embedding dimensions to 256?

Describe the problem/error/question

I have a Qdrant collection with 256-dimensional vectors. The embeddings were generated with the “text-embedding-3-large” model and the dimensions parameter set to 256.
Now I’m building a workflow in n8n with an agent that uses this collection as its knowledge base.
I attached the “Embeddings OpenAI” sub-node to the “Qdrant Vector Store” node as the embedding. How can I set it to 256 dimensions instead of the default 3072?

What is the error message (if any)?

This is the error I receive from the “Qdrant Vector Store” node:
Bad Request
Error cause:

```json
{
  "headers": {},
  "url": "http://XXX.XXX.XXX.XXX:6333/collections/my-collection-name/points/search",
  "status": 400,
  "statusText": "Bad Request",
  "data": {
    "status": {
      "error": "Wrong input: Vector dimension error: expected dim: 256, got 3072"
    },
    "time": 0.003645948
  }
}
```
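For anyone who wants to double-check the stored side, the expected dimension can be read from the collection configuration via the Qdrant REST API. A minimal sketch in Python (the host and collection name are the same placeholders as in the error above):

```python
import requests

# Ask Qdrant for the collection configuration; the configured vector size
# must match the dimension the embeddings were created with (256 here).
resp = requests.get("http://XXX.XXX.XXX.XXX:6333/collections/my-collection-name")
resp.raise_for_status()
params = resp.json()["result"]["config"]["params"]

# For a single unnamed vector this prints e.g. {'size': 256, 'distance': 'Cosine'};
# collections with named vectors nest this one level deeper.
print(params["vectors"])
```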

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 1.60.1
  • Database (default: SQLite): Postgres
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Ubuntu 22.04.5

Hi @Andrea_Ruffini

You will have to use the same model for retrieval that was used to create the embeddings so that the dimensions match. That means you will need to select the text-embedding-3-large model in the Qdrant Vector Store’s embeddings sub-node as well.

Hi @ria,

Thank you for your reply. The OpenAI text-embedding-3-large model actually allows the dimensions parameter to be set to different values (e.g. 256, 1536, 3072), not just a single fixed size. In my case the embeddings were generated with the text-embedding-3-large model and dimensions set to 256.

It would be very helpful to be able to set this parameter in the Embeddings OpenAI node as well, to make it more flexible.
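For reference, this is the kind of call that produces the 256-dimensional vectors in the first place. A minimal sketch with the OpenAI Python client (the input text is just a placeholder):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# text-embedding-3-large returns 3072 dimensions by default; the optional
# "dimensions" parameter shortens the vector, here down to 256.
response = client.embeddings.create(
    model="text-embedding-3-large",
    input="example document text",
    dimensions=256,
)
print(len(response.data[0].embedding))  # 256
```

The retrieval side has to use the same dimensions value, otherwise the vector store rejects the query exactly as in the error above.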

Did you find a solution?

I have the same problem with my Postgres PGVector Store. The data was embedded by a different backend (I’m currently trying to switch to n8n) with dimension 1536, but n8n expects 4096. I really don’t want to re-index all the data.
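In case it helps anyone debugging the same mismatch: the dimension of data that is already stored can be checked with pgvector’s vector_dims() function. A rough sketch (table and column names are just examples, adjust them to your schema):

```python
import psycopg2

# Connect to the Postgres instance that holds the pgvector data.
conn = psycopg2.connect("postgresql://user:password@localhost:5432/mydb")
with conn, conn.cursor() as cur:
    # vector_dims() is provided by the pgvector extension and returns the
    # dimensionality of a stored vector.
    cur.execute("SELECT vector_dims(embedding) FROM documents LIMIT 1;")
    print(cur.fetchone()[0])  # e.g. 1536
```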

Hi guys, I’ve raised this as a feature request. We’ll see what we can do.
Looking at the OpenAI API, this should be doable at least for the text-embedding-3 models.
https://platform.openai.com/docs/api-reference/embeddings/create#embeddings-create-dimensions


A new version of n8n has been released that includes GitHub PR 11773.