The idea is:
We should be able to set the endpoint in the Mistral Cloud credentials, and this endpoint should then be used by the Mistral Cloud Chat Model and Embeddings Mistral Cloud nodes.
This should be relatively trivial, since both ChatMistralAI and MistralAIEmbeddings already have an endpoint parameter that is not currently being used.
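For illustration, here is a minimal sketch of how that endpoint could be passed in, assuming the @langchain/mistralai package; the URL and model name are placeholders, not values taken from the n8n source:

import { ChatMistralAI } from "@langchain/mistralai";

// Point the chat model at a self-hosted, Mistral-API-compatible server.
// The endpoint URL below is a placeholder.
const chatModel = new ChatMistralAI({
  apiKey: process.env.MISTRAL_API_KEY,
  endpoint: "https://mistral.internal.example.com",
  modelName: "mistral-small",
});

const response = await chatModel.invoke("Hello from a self-hosted Mistral!");
console.log(response.content);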
My use case:
We have a self-hosted Mistral model that is reachable via the Mistral API, so it would work out of the box with the n8n Mistral Cloud nodes if only they had an endpoint configuration option.
I think it would be beneficial to add this because:
It would make it possible for people to use self-hosted, Mistral-API-compatible models with the Mistral Cloud nodes, and consequently with the n8n LLM ecosystem.
Any resources to support this?
// Excerpt from ChatMistralAI (LangChain): the constructor already accepts
// and stores an endpoint field, alongside the other model options.
constructor(fields?: ChatMistralAIInput) {
  super(fields ?? {});
  const apiKey = fields?.apiKey ?? getEnvironmentVariable("MISTRAL_API_KEY");
  if (!apiKey) {
    throw new Error(
      "API key MISTRAL_API_KEY is missing for MistralAI, but it is required."
    );
  }
  this.apiKey = apiKey;
  this.streaming = fields?.streaming ?? this.streaming;
  this.endpoint = fields?.endpoint;
  this.temperature = fields?.temperature ?? this.temperature;
  this.topP = fields?.topP ?? this.topP;
  this.maxTokens = fields?.maxTokens ?? this.maxTokens;
  this.safeMode = fields?.safeMode ?? this.safeMode;
  this.randomSeed = fields?.randomSeed ?? this.randomSeed;
  this.modelName = fields?.modelName ?? this.modelName;
}

_llmType() {
  return "mistral_ai";
}
// Excerpt from MistralAIEmbeddings (LangChain): it likewise declares and
// stores an endpoint field.
endpoint?: string;

constructor(fields?: Partial<MistralAIEmbeddingsParams>) {
  super(fields ?? {});
  const apiKey = fields?.apiKey ?? getEnvironmentVariable("MISTRAL_API_KEY");
  if (!apiKey) {
    throw new Error("API key missing for MistralAI, but it is required.");
  }
  this.apiKey = apiKey;
  this.endpoint = fields?.endpoint;
  this.modelName = fields?.modelName ?? this.modelName;
  this.encodingFormat = fields?.encodingFormat ?? this.encodingFormat;
  this.batchSize = fields?.batchSize ?? this.batchSize;
  this.stripNewLines = fields?.stripNewLines ?? this.stripNewLines;
}
/**
 * Method to generate embeddings for an array of documents. Splits the
 * documents into batches and makes requests to the MistralAI API to
 * generate embeddings.
 */
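A similar sketch for the embeddings side, again assuming @langchain/mistralai and a placeholder URL:

import { MistralAIEmbeddings } from "@langchain/mistralai";

// The constructor above stores fields.endpoint, so a self-hosted server
// could be targeted the same way as for the chat model.
const embeddings = new MistralAIEmbeddings({
  apiKey: process.env.MISTRAL_API_KEY,
  endpoint: "https://mistral.internal.example.com",
  modelName: "mistral-embed",
});

// embedDocuments batches the input texts and returns one vector per document.
const vectors = await embeddings.embedDocuments(["first document", "second document"]);
console.log(vectors.length); // 2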
Are you willing to work on this?
Yes, I would.
Made a PR that should add this functionality:
n8n-io:master ← mprytoluk:mistral, opened 12 Jan 2024, 01:47 PM UTC
Add endpoint parameter on Mistral Cloud API credentials, and support for endpoint configuration on ChatMistralAI and Embeddings Mistral Cloud nodes
## Summary
This PR makes it possible to set the endpoint in the Mistral Cloud credentials; the configured endpoint is then used by the Mistral Cloud Chat Model and Embeddings Mistral Cloud nodes.
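For context, here is a rough sketch of what the credential-side change could look like; the class, property names, and defaults here are illustrative assumptions, not the exact n8n source:

import type { ICredentialType, INodeProperties } from 'n8n-workflow';

// Hypothetical credential definition with an added Endpoint field.
export class MistralCloudApi implements ICredentialType {
  name = 'mistralCloudApi';
  displayName = 'Mistral Cloud API';
  properties: INodeProperties[] = [
    {
      displayName: 'API Key',
      name: 'apiKey',
      type: 'string',
      typeOptions: { password: true },
      default: '',
    },
    {
      // New field: lets the Chat Model and Embeddings nodes target a
      // self-hosted, Mistral-API-compatible server.
      displayName: 'Endpoint',
      name: 'endpoint',
      type: 'string',
      default: 'https://api.mistral.ai',
    },
  ];
}

// In the node code, the configured endpoint would then be forwarded to the
// constructor, e.g.:
// new ChatMistralAI({ apiKey: credentials.apiKey, endpoint: credentials.endpoint, ... })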
## Related tickets and issues
https://community.n8n.io/t/endpoint-option-in-mistral-cloud-nodes/35192
## Review / Merge checklist
- [ ] PR title and summary are descriptive. **Remember, the title automatically goes into the changelog. Use `(no-changelog)` otherwise.** ([conventions](https://github.com/n8n-io/n8n/blob/master/.github/pull_request_title_conventions.md))
- [ ] [Docs updated](https://github.com/n8n-io/n8n-docs) or follow-up ticket created.
- [ ] Tests included.
> A bug is not considered fixed, unless a test is added to prevent it from happening again.
> A feature is not complete without tests.
Hello, why is the MistralAI model not compatible with the "AI Agent" node?
Jon
January 16, 2024, 9:39am
Hey @LucBerge,
It could be that we just have not got around to adding that yet.
because they are not developers