Hey community,
I’ve been playing with n8n for a while, and the ChatTrigger feature is fantastic. However, in my version (1.27) only a few built-in LLMs are available (OpenAI, Google, Mistral). I’m wondering how to add other new LLMs like Grok-1 to my workflow. Is there a way/tutorial to wrap another model as a ChatModel node?
It looks like your topic is missing some important information. Could you provide the following, if applicable?
- n8n version:
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app):
- Operating system:
- n8n version: 1.27.3
- Database (default: SQLite): SQLite
- n8n EXECUTIONS_PROCESS setting (default: own, main): own
- Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
- Operating system: Ubuntu
Hey @zeech,
We are adding new LLMs fairly often, so it is always worth checking the release notes. If you want to add something that is missing, though, you could create a community node, using the existing nodes as an example.
Thanks Jon!
I’m also looking forward to other LLMs being supported.
Since I have played with different LLMs for a while, I came up with a simple idea and wonder whether the community could provide a generic LLM node implemented as an HTTP request. That way we could trigger any LLM with an API call (base URL, headers, body, etc., just like the HTTP Request node). As far as I know, any LLM can be called through this approach.
Besides that, I have played with the Agent capabilities of n8n, and I am also looking for the same HTTP flexibility in the Tool node.
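To illustrate the idea, here is a minimal sketch of what such a generic "HTTP LLM" call boils down to: assembling a request from a base URL, headers, and body, exactly the inputs an HTTP Request node takes. The endpoint path and payload schema below follow the common OpenAI-compatible convention; the URL, key, and model name are placeholders, and none of this reflects n8n internals.

```python
import json
from urllib import request

def build_llm_request(base_url: str, api_key: str, model: str, prompt: str) -> request.Request:
    """Assemble a POST request for an (assumed) OpenAI-compatible
    /chat/completions endpoint from generic HTTP-node-style inputs."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url=f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Build the request only; no network call happens until urlopen().
req = build_llm_request("https://example.com/v1", "my-api-key", "grok-1", "Hello!")
```

Since providers that expose an OpenAI-compatible API differ only in base URL, key, and model name, the same three parameters would be all a generic node needs to surface.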
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.