I’m trying to use the AI Agent module, but the LLM input only lets me choose from a preset list of LLMs. Is there a way to connect my own LLM via an HTTP call and feed it into the module instead?
What is the error message (if any)?
There’s no error message; the module just doesn’t allow a non-predefined LLM to be connected.
Please share your workflow
A simple AI Agent that takes my chat message as input. Instead of using one of the built-in LLM options, I’d like to connect my own model through an API call via the HTTP Request module.
(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)
Thanks for posting here and welcome to the community!
We have just launched our locally run AI Starter Kit, which lets you use custom models as well (at least from the Ollama library).
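Since the Starter Kit runs models through Ollama, one route for a truly custom model is to load your own weights into Ollama and then pick that model in the workflow. A rough sketch (the model name and GGUF path are placeholders, and this assumes a local Ollama install):

```shell
# Describe the custom model in a Modelfile (path is a placeholder)
echo 'FROM ./my-model.gguf' > Modelfile

# Register it with the local Ollama instance under a name of your choice
ollama create my-custom-llm -f Modelfile

# Quick smoke test from the CLI before wiring it into the workflow
ollama run my-custom-llm "Hello"
```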
At the moment the HTTP Request node isn’t designed to be used as a sub-node that defines a model. However, this sounds like a pretty cool feature request, so feel free to submit it and give it a vote!
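In the meantime, a possible workaround: the model sub-nodes that do exist generally speak an OpenAI-style chat-completions protocol, so if you can wrap your own LLM behind a small compatible HTTP endpoint, a node that accepts a custom base URL may be able to talk to it. A minimal sketch of the response shape such an endpoint would need to return (the `run_my_llm` stub and the exact field set are assumptions, not a complete implementation of the spec):

```python
import json

def run_my_llm(prompt: str) -> str:
    # Hypothetical stub: replace with a call into your own model.
    return "Echo: " + prompt

def chat_completion_response(body: dict) -> dict:
    """Build a minimal OpenAI-style chat completion payload."""
    # Take the most recent user message from the request body.
    user_msg = next(
        (m["content"] for m in reversed(body.get("messages", []))
         if m.get("role") == "user"),
        "",
    )
    return {
        "object": "chat.completion",
        "model": body.get("model", "my-custom-llm"),
        "choices": [{
            "index": 0,
            "message": {"role": "assistant",
                        "content": run_my_llm(user_msg)},
            "finish_reason": "stop",
        }],
    }

# Example request body, as a chat-completions client would send it.
request = {"model": "my-custom-llm",
           "messages": [{"role": "user", "content": "Hello"}]}
print(json.dumps(chat_completion_response(request), indent=2))
```

You would serve this from any small HTTP framework at a `/v1/chat/completions`-style path; the key point is that the JSON shape matches what the model sub-node expects to parse.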