LocalAI is a project that runs LLMs locally and exposes an OpenAI-compatible API. The implementation could either add a new LocalAI node that is almost identical to the existing OpenAI one, or add an "OpenAI endpoint" setting that defaults to the OpenAI URI and lets the user change it to another URI (such as a LocalAI instance hosted locally).
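A minimal sketch of the second option, with hypothetical names (`DEFAULT_OPENAI_URI`, `chat_endpoint` are illustrative, not from either codebase): because LocalAI mirrors the OpenAI request paths, a single configurable base URI is enough to switch backends.

```python
# Hypothetical sketch of an "OpenAI endpoint" setting that defaults to the
# official OpenAI base URI but can be repointed at a LocalAI server.
DEFAULT_OPENAI_URI = "https://api.openai.com/v1"

def chat_endpoint(base_uri: str = DEFAULT_OPENAI_URI) -> str:
    """Build the chat-completions URL for whichever backend is configured."""
    return base_uri.rstrip("/") + "/chat/completions"

# Default: requests go to OpenAI.
print(chat_endpoint())
# Overridden: the same code talks to a local LocalAI server instead
# (LocalAI listens on port 8080 by default).
print(chat_endpoint("http://localhost:8080/v1"))
```

Since the API surface is compatible, no other request or response handling would need to change between the two backends.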
I would like to run LLMs locally using LocalAI, which supports many LLM engine backends. It is an alternative to Ollama that supports a variety of other LLMs.
https://localai.io/ has plenty of documentation
Sorry, I don’t have enough time.