How to connect a custom LLM (for example, an Oracle Cloud Infrastructure Generative AI model such as OCI Cohere Command R 08_2024 v1.7) to the AI Agent node (model, memory, tools)
Describe the problem/error/question
I’m trying to use the AI Agent node, but I’ve run into an issue with LLM model selection: I can only choose from a pre-configured list of models. I’m wondering whether there’s a way to connect this agent to a model hosted in the cloud, specifically the OCI Cohere Command R 08_2024 v1.7 (on-demand) model, which is available in Oracle Cloud Infrastructure.
So far, I’ve managed to do this by making an HTTP call to OCI (via Execute Command), passing in input data from the Chat Trigger and receiving a response from the model. However, I can’t find a way to wire this into the AI Agent node so that it uses this cloud model instead of the pre-set ones.
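For reference, the raw call against OCI Generative AI can be sketched in Python. The endpoint path, payload schema, model ID, and region below are assumptions based on OCI's Generative AI inference API (version 20231130); verify them against your own tenancy before use:

```python
# Hypothetical region, model ID, and compartment OCID; adjust for your tenancy.
REGION = "eu-frankfurt-1"
ENDPOINT = f"https://inference.generativeai.{REGION}.oci.oraclecloud.com/20231130/actions/chat"
MODEL_ID = "cohere.command-r-08-2024"            # assumed on-demand model ID
COMPARTMENT_ID = "ocid1.compartment.oc1..xxxx"   # placeholder OCID

def build_chat_payload(message: str, max_tokens: int = 600) -> dict:
    """Build the request body for an on-demand Cohere chat call (assumed schema)."""
    return {
        "compartmentId": COMPARTMENT_ID,
        "servingMode": {"servingType": "ON_DEMAND", "modelId": MODEL_ID},
        "chatRequest": {
            "apiFormat": "COHERE",
            "message": message,
            "maxTokens": max_tokens,
        },
    }

def chat(message: str) -> dict:
    """Send the signed request. Requires the 'oci' and 'requests' packages and a
    configured ~/.oci/config; imports are local so nothing runs at import time."""
    import oci
    import requests

    config = oci.config.from_file()  # reads ~/.oci/config
    signer = oci.signer.Signer(
        tenancy=config["tenancy"],
        user=config["user"],
        fingerprint=config["fingerprint"],
        private_key_file_location=config["key_file"],
    )
    resp = requests.post(ENDPOINT, json=build_chat_payload(message), auth=signer)
    resp.raise_for_status()
    return resp.json()
```

The same payload can be produced inside n8n (e.g. in a Set or Code node) and posted by the HTTP Request node, with the OCI request signing handled by whatever mechanism you already use for your Execute Command call.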
Has anyone faced a similar issue, or does anyone know how to integrate an external AI model from Oracle Cloud (or another cloud provider) with this type of AI agent? I’d appreciate any insights or suggestions!
What is the error message (if any)?
There is no error message as such; I simply cannot connect the Cohere/Llama model available in OCI to the AI Agent node in n8n.
Please share your workflow
A simple AI agent workflow that takes my chat message as input, passes it to the HTTP Request node, and returns a response.
But I would like to connect the AI Agent node to the model available in Oracle Cloud Infrastructure, because at the moment only the built-in models in the list can be selected.
The Custom LLM isn’t a pre-built node in n8n, you have to simulate it using the HTTP Request node. Basically, you’re building your own model connection manually.
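Once the HTTP Request node returns the OCI response, you still need to pull the model’s generated text out of it (for example, in a Code node) before handing it back to the rest of the workflow. A minimal sketch of that extraction, assuming the Cohere chat result nests the answer under `chatResponse.text` (inspect a real response from your tenancy to confirm):

```python
def extract_text(oci_response: dict) -> str:
    """Pull the generated text out of an OCI Generative AI chat result.
    The 'chatResponse'/'text' keys are an assumed shape; verify against
    an actual response before relying on them."""
    chat_response = oci_response.get("chatResponse", {})
    text = chat_response.get("text")
    if text is None:
        raise ValueError(f"Unexpected response shape: {list(oci_response)}")
    return text

# Example with a mocked response of the assumed shape:
sample = {
    "modelId": "cohere.command-r-08-2024",
    "chatResponse": {"apiFormat": "COHERE", "text": "Hello from OCI!"},
}
```

Keeping the extraction in one small function makes it easy to adjust if the real response shape turns out to differ.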
Thank you very much for your help. Unfortunately I am new to n8n, as I’m coming over from make.com. Could you help me with a screenshot for clarification, please?