OpenAI Embeddings node shows no embedding models / 400 error with text-embedding-3-small (self-hosted Docker)

Hi everyone,

I’m struggling with the OpenAI Embeddings node in my self-hosted n8n instance (Docker, latest image, OpenAI Embeddings Node v1.2).


Issue

  • In the Embeddings node, no embedding models (like text-embedding-ada-002 or text-embedding-3-small) appear in the dropdown; only chat/completion models such as GPT-4, GPT-4o, etc. are listed.

  • If I manually enter text-embedding-3-small as the model, I get the error:
    400 status code (no body)

  • My OpenAI API key is valid and works with other OpenAI nodes.

  • Pinecone is correctly connected and working.

  • There is no “Use Response API” option in the Embeddings node.

  • I’m running n8n as a Docker container (not via docker-compose).


What I’ve tried

  • Pulled the latest n8n image and restarted the container multiple times (docker pull n8nio/n8n:latest, etc.).

  • Confirmed that the Embeddings node is present in the workflow.

  • Checked if other embedding models are available (none are).

  • Tested my API key directly with OpenAI (it works).

  • Tried various credentials and settings combinations.
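One more check that might help narrow this down: you can ask the OpenAI API directly which models your key can see. If the embedding models show up here but not in the n8n dropdown, the problem is on the n8n side rather than with the key. This is a sketch; it assumes `OPENAI_API_KEY` is exported in your shell and that `jq` is installed.

```shell
# List all models visible to this API key, then filter for embedding models.
# If text-embedding-ada-002 / text-embedding-3-small appear here but not in
# the n8n dropdown, the key is fine and the issue is in the node itself.
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  | jq -r '.data[].id' | grep -i embedding
```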


Additional Observations / UI Behavior

  • When I first open the Embeddings node, the input on the left side (e.g., from the Markdown node) is displayed correctly.

  • After closing and reopening the node, or after running the workflow, the error occurs and the input on the left side is no longer visible (see attached screenshots).

  • This is confusing, as the input is initially passed through correctly.


Questions

  • Is this behavior known?

  • Are there limitations for embedding models in the Embeddings node for self-hosted/Docker setups?

  • Is there a way to get embedding models to show in the dropdown?

  • Is this a setup issue or a general bug?

  • Is there a workaround to use embeddings in n8n and send them to Pinecone?

Looks like this might be related to the OpenAI Embeddings node version or API compatibility. Since you’re using the latest n8n Docker image, try explicitly setting the `OPENAI_API_VERSION` environment variable to `2023-05-15` in your Docker run command. This API version should support the embedding models correctly.
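For reference, one way to pass such a variable when you run the container directly (not via docker-compose) looks like this. This is only a sketch: the container name, port, and volume below are the defaults from the n8n docs, so adjust them to match your existing setup before recreating the container.

```shell
# Stop and remove the old container, then recreate it with the extra
# environment variable. Volume/port names are the documented defaults;
# replace them with whatever your current container uses.
docker stop n8n && docker rm n8n
docker run -d --name n8n \
  -p 5678:5678 \
  -e OPENAI_API_VERSION=2023-05-15 \
  -v n8n_data:/home/node/.n8n \
  n8nio/n8n:latest
```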

If that doesn’t work, you could also try manually entering `text-embedding-ada-002` as the model name, as it’s widely supported and might bypass the dropdown issue. Let me know if this helps!

Thank you, achamm! Setting the model manually to "text-embedding-ada-002" did the trick.


You're welcome! If you'd like, feel free to mark one of our replies as the solution. Have a great day!