The idea is:
We should be able to set the endpoint in the Mistral Cloud credentials, and this endpoint should then be used by the Mistral Cloud Chat Model and Embeddings Mistral Cloud nodes. This should be relatively trivial since MistralAIEmbeddings has an endpoint parameter which is not currently being used.
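As a rough sketch of how this could be wired up (the field names and the default URL below are illustrative assumptions, not n8n's actual credential schema):

```typescript
// Hypothetical shape of the Mistral Cloud credentials with an
// optional endpoint field (names are illustrative, not n8n's real schema).
interface MistralCloudCredentials {
  apiKey: string;
  endpoint?: string; // e.g. a self-hosted Mistral-API-compatible server
}

// Resolve the base URL the Chat Model / Embeddings nodes should use:
// fall back to the public Mistral cloud API when no endpoint is set.
function resolveEndpoint(creds: MistralCloudCredentials): string {
  return creds.endpoint?.trim() || "https://api.mistral.ai";
}

// Cloud default when the field is left empty:
console.log(resolveEndpoint({ apiKey: "sk-..." }));
// Self-hosted deployment:
console.log(resolveEndpoint({ apiKey: "sk-...", endpoint: "http://localhost:8080" }));
```

The resolved URL would then be passed through to the underlying MistralAIEmbeddings / chat client instead of the hard-coded cloud default.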
My use case:
We have a self-hosted Mistral model that is reachable via the Mistral API, so it would work out of the box with the n8n Mistral Cloud nodes if only they had an endpoint configuration option.
I think it would be beneficial to add this because:
It would let people use self-hosted, Mistral-API-compatible models with the Mistral Cloud nodes and, consequently, with the rest of the n8n LLM ecosystem.
Any resources to support this?
Are you willing to work on this?
Yes, I would.
I made a PR that should add this functionality:
Hello, why is the MistralAI model not compatible with the "AI Agent" node?
It could be that we just have not got around to adding that yet.