The idea is:
Add DeepInfra language model and text embedding sub-nodes, similar to the existing Ollama and OpenAI sub-nodes. Alternatively, enable DeepInfra credentials to be used with the OpenAI Chat Model node, since DeepInfra exposes an OpenAI-compatible API (see the sketch below).
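To illustrate the compatibility point, here is a minimal sketch using the `openai` npm package with only the base URL and API key swapped out. The base URL and model ID below are assumptions taken from DeepInfra's docs at the time of writing, not something n8n ships today:

```ts
import OpenAI from 'openai';

// Assumption: this is DeepInfra's OpenAI-compatible endpoint; check the
// DeepInfra docs for the current URL and available model IDs.
const client = new OpenAI({
  apiKey: process.env.DEEPINFRA_API_KEY,
  baseURL: 'https://api.deepinfra.com/v1/openai',
});

// Example chat completion against an illustrative model ID.
const completion = await client.chat.completions.create({
  model: 'meta-llama/Meta-Llama-3-8B-Instruct',
  messages: [{ role: 'user', content: 'Hello from n8n!' }],
});

console.log(completion.choices[0].message.content);
```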
My use case:
I would like access to an LLM/text-embedding endpoint provider that is competitive on both price and performance.
I think it would be beneficial to add this because:
DeepInfra offers one of the best price-to-performance ratios among LLM endpoint providers. Supporting it would give people more options when choosing production-level LLM and embedding endpoint providers.
Any resources to support this?
DeepInfra already offers an OpenAI-compatible API. All that's needed is an n8n credential type that points at DeepInfra's endpoint, and we could start using it right away (a rough credential sketch follows below).
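As a rough illustration of how small the change could be, here is a hypothetical credential type following the shape of n8n's existing credential classes. The class name, property names, and default base URL are assumptions for illustration, not a final design:

```ts
import type { ICredentialType, INodeProperties } from 'n8n-workflow';

// Hypothetical DeepInfra credential, modeled on existing n8n API credentials.
export class DeepInfraApi implements ICredentialType {
  name = 'deepInfraApi';
  displayName = 'DeepInfra API';
  documentationUrl = 'https://deepinfra.com/docs';
  properties: INodeProperties[] = [
    {
      displayName: 'API Key',
      name: 'apiKey',
      type: 'string',
      typeOptions: { password: true },
      default: '',
    },
    {
      // Assumed default; kept editable in case the endpoint changes.
      displayName: 'Base URL',
      name: 'url',
      type: 'string',
      default: 'https://api.deepinfra.com/v1/openai',
    },
  ];
}
```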
Are you willing to work on this?
Yes. Happy to test.