/ (Mistral AI)

Hello community,

I’ve just discovered Groq, and I’m a bit puzzled. Both of these services use Mistral as the basis for their functionality.

Would it be possible to develop these nodes or, better still, to allow setting the target URL of the API in the Mistral node?

Thank you for reading and have a nice day!

I support this request for two new LLM provider nodes

Thanks in advance!


You can already use them with a Cloudflare account via the HTTP Request node:

curl -X POST \
  "https://api.cloudflare.com/client/v4/accounts/{account-id}/ai/run/@cf/mistral/mistral-7b-instruct-v0.1" \
  -H "Authorization: Bearer {api-token}" \
  -H "Content-Type: application/json" \
  -d '{ "prompt": "What is grouped query attention", "stream": true }'

API Response: { "response": "Grouped query attention is a technique used in natural language processing (NLP) and machine learning to improve the performance of models…" }

Hey there!

I’m also using Groq, right now through an HTTP Request node, but I would love to have it included as a Model. It should be the same as Mistral Cloud, just with another baseUrl. Or, if you can provide the instructions to create it, I’m in.
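For anyone who wants to try this before a dedicated node exists, here is a minimal sketch of what the HTTP Request node would send. Groq exposes an OpenAI-compatible chat completions endpoint; the model ID and {api-key} below are placeholders, so check Groq's own documentation for the current model names:

```shell
# Sketch of a chat completion request against Groq's
# OpenAI-compatible API. {api-key} is a placeholder for
# your Groq API key; the model ID may change over time.
curl -X POST \
  "https://api.groq.com/openai/v1/chat/completions" \
  -H "Authorization: Bearer {api-key}" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mixtral-8x7b-32768",
    "messages": [
      { "role": "user", "content": "What is grouped query attention?" }
    ]
  }'
```

In the HTTP Request node this maps to a POST with the same URL, headers, and JSON body; the reply follows the OpenAI response shape (choices[0].message.content).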


Super interested in that too!


Any progress with this?
Right now Groq is the fastest and cheapest API service for LLMs, and it already supports Gemma; with the recent GemmaCoder launch, maybe they will support that too.

I think creating a Model node for Groq should be quite easy for those who have created others… I hope it is released soon.
Thank you.
Amazing project, best people.

Would you mind sharing an example with us?

That’s super awesome; but if I’m not mistaken, Perplexity adds the little touch of having access to the Internet, which we don’t get by using Mistral directly.