Serverless LLM providers? (quick addition)

The idea is:

A node that’s capable of integrating with external OpenAI-compatible serverless LLM providers, such as Predibase.

My use case:

I want to connect not just to the OpenAI API itself but also to other OpenAI-compatible serverless LLM providers like the ones above, since I want to use fine-tuned models hosted there.

I think it would be beneficial to add this because:

It’s extremely quick to add: the API structure is identical to OpenAI’s except for the base URL.
With a single “OpenAI-compatible” credential, you could then use basically ANY OpenAI-compatible serverless LLM provider. That’s like 10 birds with one stone (or pebble).
It would also open up open-source models beyond Hugging Face.
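To illustrate the point above: a minimal sketch (using only the standard library, not any specific node implementation) showing that a chat-completions request to an OpenAI-compatible provider differs from one to OpenAI only in the base URL. The Predibase base URL here is an assumption for illustration, not a verified endpoint:

```python
import json
from urllib.request import Request

# Base URLs are the only per-provider difference (the non-OpenAI URL
# below is a hypothetical example, not a confirmed endpoint).
OPENAI_BASE = "https://api.openai.com/v1"
OTHER_PROVIDER_BASE = "https://serving.example-provider.com/v1"  # assumed

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> Request:
    """Build a POST /chat/completions request; everything except base_url
    is identical across OpenAI-compatible providers."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Same code path, different provider: only the first argument changes.
req = build_chat_request(OTHER_PROVIDER_BASE, "sk-...", "my-fine-tuned-model", "Hello")
print(req.full_url)  # → https://serving.example-provider.com/v1/chat/completions
```

This is why a generic “OpenAI-compatible” credential (API key + base URL) would be enough to cover all such providers at once.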

Any resources to support this?

See the “Get started with Inference” section.

Are you willing to work on this?

Absolutely. I’m an engineer myself.