Serverless LLM providers? (quick addition)

The idea is:

A node that can integrate with external OpenAI-compatible serverless LLM providers, such as together.ai, mystic.ai, and Predibase.

My use case:

I want to connect not just to the OpenAI API itself, but also to other OpenAI-compatible serverless LLM providers like those above, so I can use the fine-tuned models hosted there.

I think it would be beneficial to add this because:

It’s extremely quick to add: the API structure is identical to OpenAI’s except for the base URL.
All we need is an “OpenAI-compatible” credential, and then you could use essentially ANY OpenAI-compatible serverless LLM provider. That’s like 10 birds with one stone (or pebble).
It would also open up open-source models beyond Hugging Face.
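To illustrate the “only the base URL differs” point, here is a minimal sketch of how the same OpenAI-style chat completion request targets different providers. The provider URLs, API keys, and model names below are illustrative assumptions, not verified endpoints; the request is built but not sent.

```python
# Sketch: an OpenAI-compatible chat completion request. Swapping providers
# only changes base_url; the payload and headers stay the same.
# NOTE: URLs, keys, and model names here are placeholder examples.
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build (but don't send) an OpenAI-compatible chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Only base_url (and credentials/model) differ between providers:
openai_req = build_chat_request(
    "https://api.openai.com/v1", "sk-...", "gpt-3.5-turbo", "Hi")
together_req = build_chat_request(
    "https://api.together.xyz/v1", "tok-...", "my-fine-tuned-model", "Hi")
```

So a single credential type holding `{base_url, api_key}` would be enough to cover all of these providers.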

Any resources to support this?

See the “Get started with Inference” section.

Are you willing to work on this?

Absolutely. I’m an engineer myself.

I’m just adding that I’d love to have options beyond OpenAI. As OpenAI grows, and we learn of the abuses they will surely commit with our data, I’d like to be able to connect to self-hosted, open-source options, etc.

I just discovered n8n and have been using Together.ai for a few weeks due to its cost-effectiveness. I can’t believe there is no generic OpenAI-compatible node?

Must I hand-craft everything using JavaScript or something? And would that then work and interoperate with the other AI nodes?