How to connect custom LLM to the AI agent module

Describe the problem/error/question

I’m trying to use the AI Agent module, but in the LLM input I can only choose from a preset list of LLMs. Is there a way to connect my own LLM via an HTTP call and feed it into the module instead?

What is the error message (if any)?

There is no error message; the module simply doesn’t allow a non-predefined LLM to be connected.

Please share your workflow

A simple AI Agent that takes my chat message as input. Instead of using one of the built-in LLMs, I’d like to connect my own model via an API call through the HTTP Request module (roughly the kind of call sketched below).

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)
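For illustration only, here is a minimal sketch of the kind of HTTP call being described. The endpoint URL, model name, auth header, and response shape are all placeholders for whatever the custom LLM actually exposes; nothing here is an n8n API.

```typescript
// Hypothetical sketch of calling a custom LLM over HTTP.
// The URL, model name, credential, and response shape are placeholders;
// substitute whatever your own model's API actually expects.
async function callCustomLlm(userMessage: string): Promise<string> {
  const response = await fetch("https://my-llm.example.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.CUSTOM_LLM_API_KEY}`, // placeholder credential
    },
    body: JSON.stringify({
      model: "my-custom-model", // placeholder model identifier
      messages: [{ role: "user", content: userMessage }],
    }),
  });

  if (!response.ok) {
    throw new Error(`LLM request failed: ${response.status} ${response.statusText}`);
  }

  // Assumes an OpenAI-style response body; adjust to the real API.
  const data = await response.json();
  return data.choices[0].message.content;
}
```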

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @simonator

Thanks for posting here and welcome to the community! :partying_face:

We have just launched our locally run AI Starter Kit, which would allow you to use custom models as well (at least those from the Ollama library).
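For reference, models run through Ollama expose a plain HTTP API on localhost, so once the Starter Kit is up they can be reached with a request like the sketch below; the model name is an assumption and stands for whichever model has been pulled.

```typescript
// Minimal sketch of calling a locally running Ollama model over HTTP.
// Assumes Ollama is listening on its default port (11434) and that the
// model named below has already been pulled (e.g. with `ollama pull`).
async function askOllama(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // placeholder: any locally pulled model
      messages: [{ role: "user", content: prompt }],
      stream: false, // return a single JSON response instead of a stream
    }),
  });

  const data = await response.json();
  return data.message.content; // non-streaming /api/chat response shape
}
```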

At the moment the HTTP Request node is not designed to be used as a sub-node to define a model. However, this sounds like a pretty cool feature request - feel free to add it and give it a vote :wink:

Thanks, ria, for your reply, and I appreciate the warm welcome :smiley:

Unfortunately, the AI Starter Kit might not be the solution in my case, as I can’t run these models locally. I’ve just submitted a feature request!

Thanks again for your feedback!

