I want to use the Hugging Face Inference API as the chat model in AI agents. I was trying to use the OpenAI Chat Model with Hugging Face credentials and the base URL https://api-inference.huggingface.co/v1, with the model meta-llama/Llama-3.3-70B-Instruct, but it is not working.
I tried using the OpenAI Chat Model with other providers (Hyperbolic, DeepSeek) and they work fine.
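Outside of the agent, the call I am trying to reproduce looks roughly like this with the OpenAI Python client (just a sketch; HF_TOKEN stands in for my actual Hugging Face token):

```python
from openai import OpenAI

# Same credentials and base URL I configured in the OpenAI Chat Model node.
client = OpenAI(
    base_url="https://api-inference.huggingface.co/v1",
    api_key="HF_TOKEN",  # placeholder for my Hugging Face access token
)

# Simple chat completion against the Llama 3.3 70B Instruct model.
response = client.chat.completions.create(
    model="meta-llama/Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```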
What is the error message (if any)?
It does not give any response, and eventually the whole request times out.
Hi @Vikas_Kumar, I tried this a bit, and it seems that with the Hugging Face Inference API you need to use a different URL, one that includes the model. I found this out by clicking the “View code” button, and then “cURL”.
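Roughly, the only thing that changes is the base URL; with the OpenAI Python client it would look something like this (the exact path is my reading of the cURL snippet, so double-check it against what “View code” shows you):

```python
from openai import OpenAI

# Base URL that includes the model, as suggested by the "View code" -> cURL snippet.
# The exact path below is my assumption; adjust it to match your own snippet.
client = OpenAI(
    base_url="https://api-inference.huggingface.co/models/meta-llama/Llama-3.3-70B-Instruct/v1",
    api_key="HF_TOKEN",  # placeholder for your Hugging Face access token
)

# The chat.completions.create(...) call itself stays the same as before.
response = client.chat.completions.create(
    model="meta-llama/Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```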
It worked, but it is super slow and sometimes just doesn't give a response (I am on the Pro plan on Hugging Face, and the model works well in the Hugging Face playground).
Did you face any issue like that?