OpenRouter Embeddings Model Node

The idea is:

Creating an Embedding Model Node for OpenRouter

My use case:

I use all my language models through OpenRouter. It already has a chat model node, and it makes my life so much easier by letting me switch between vendors and models without needing several credentials.
OpenRouter announced that they support embedding models as of today, and for this reason an “OpenRouter Embeddings Model” node would be very useful now.

I think it would be beneficial to add this because:

This way, OpenRouter users won’t have to maintain a second credential and/or a second set of API keys with vendors such as OpenAI or Cohere, but can instead route their embedding calls through the OpenRouter service.
This is a follow-up to this topic: How to use embedding models with OpenRouter - #2 by n8n

Are you willing to work on this?

I don’t have the skills for this, but it should be a fairly simple adaptation of the current nodes.

I support this idea. As a temporary workaround, you can use the OpenAI Embeddings node, provide your OpenRouter API key, and set the Base URL to https://openrouter.ai/api/v1
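For anyone curious what the workaround amounts to under the hood, here is a minimal sketch of the OpenAI-compatible embeddings request that ends up being sent to OpenRouter. The API key and model id are placeholders, and the helper name is mine, not part of n8n or OpenRouter:

```python
import json
import urllib.request

# Placeholder credentials and model id -- substitute your own.
OPENROUTER_API_KEY = "sk-or-..."
BASE_URL = "https://openrouter.ai/api/v1"  # the API root, not the /embeddings path

def build_embeddings_request(text, model="openai/text-embedding-3-small"):
    """Build an OpenAI-compatible embeddings request aimed at OpenRouter."""
    url = f"{BASE_URL}/embeddings"  # the client appends the endpoint path itself
    payload = {"model": model, "input": text}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {OPENROUTER_API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually call the API (requires a valid key):
# with urllib.request.urlopen(build_embeddings_request("hello world")) as resp:
#     embedding = json.load(resp)["data"][0]["embedding"]
```

This is just what the OpenAI Embeddings node does for you once the Base URL and credential point at OpenRouter.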


The workaround isn’t working for me. We still need an OpenRouter embeddings node as a proper solution.

This workaround does not seem to work for me either. Although I’m using https://openrouter.ai/api/v1/embeddings as my base URL (and can see available embeddings models correctly), I end up receiving a MODEL_NOT_FOUND error (which seems to originate from the underlying LangChain library).

It works when you strip the endpoint back to https://openrouter.ai/api/v1, even though you lose the convenience of picking an embedding model from the drop-down list. Instead, you have to specify the desired embedding model manually (as an expression; in my case, “openai/text-embedding-3-small”), and there you go.
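A sketch of why the Base URL matters, assuming the underlying client simply appends the /embeddings path to whatever base you configure (this is a hypothetical helper illustrating the path-joining behavior, not the actual LangChain code):

```python
def effective_endpoint(base_url: str) -> str:
    """What the client actually calls: base URL + the /embeddings path."""
    return base_url.rstrip("/") + "/embeddings"

# Correct: base URL is the API root, so the final path is right.
assert effective_endpoint("https://openrouter.ai/api/v1") == \
    "https://openrouter.ai/api/v1/embeddings"

# Wrong: including /embeddings in the base URL doubles the path segment,
# which would hit a nonexistent route -- plausibly the source of the
# MODEL_NOT_FOUND error mentioned above.
assert effective_endpoint("https://openrouter.ai/api/v1/embeddings") == \
    "https://openrouter.ai/api/v1/embeddings/embeddings"
```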

Try setting the endpoint to https://openrouter.ai/api/v1 and specify the desired embedding model manually.