How to use a custom AI model with the AI Agent node?

Hi everyone,

I’m developing a custom AI chat model node called Cloudflare Chat Model. It works perfectly with nodes like Basic LLM Chain and Summarization Chain, but I’m running into issues when trying to use it with the AI Agent node.

While reviewing the source code, I found the following filter in the Agent.node.ts file:

specialInputs = [
	{
		type: 'ai_languageModel',
		filter: {
			nodes: [
				'@n8n/n8n-nodes-langchain.lmChatAnthropic',
				'@n8n/n8n-nodes-langchain.lmChatAwsBedrock',
				'@n8n/n8n-nodes-langchain.lmChatGroq',
				'@n8n/n8n-nodes-langchain.lmChatOllama',
				'@n8n/n8n-nodes-langchain.lmChatOpenAi',
				'@n8n/n8n-nodes-langchain.lmChatGoogleGemini',
				'@n8n/n8n-nodes-langchain.lmChatGoogleVertex',
				'@n8n/n8n-nodes-langchain.lmChatMistralCloud',
				'@n8n/n8n-nodes-langchain.lmChatAzureOpenAi',
				'@n8n/n8n-nodes-langchain.lmChatDeepSeek',
				'@n8n/n8n-nodes-langchain.lmChatOpenRouter',
				'@n8n/n8n-nodes-langchain.lmChatXAiGrok',
			],
		},
	},
];

It looks like the AI Agent node only accepts a predefined list of model nodes.
My question is: how can I register my custom model node so that it becomes compatible with the AI Agent? Is there a recommended way to extend this list, or to expose a custom node so that it works seamlessly?
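
For illustration, the only workaround I can see right now would be to maintain a local fork of the Agent node and append my node’s type name to that filter. The name below is just a placeholder for my node’s actual <package>.<node> identifier:

specialInputs = [
	{
		type: 'ai_languageModel',
		filter: {
			nodes: [
				// ...the built-in chat model nodes listed above...
				'n8n-nodes-cloudflare-chat.lmChatCloudflare', // placeholder name for my custom node
			],
		},
	},
];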

Thanks a lot in advance!

I am also running into the same issue when trying to connect my custom chat model node. It would be great if we had a solution to this. Thanks.

Interested as well, as I’m looking to use a custom chat model.

Doing something similar with NVIDIA NIM; interested to know how to attach a custom model to the Agent node as well.

Hi! Unfortunately, we have to filter the models in the Agent node like this because we can only allow chat-completion models; the more traditional completion models will not work with the Agent node.
There is no mechanism yet to filter connections by capability, which is what we would need to solve this properly.
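
To illustrate the distinction, here is a minimal LangChain JS sketch (using @langchain/openai purely as an example, not something specific to n8n). The Agent needs a chat model that works on message lists; a model that only exposes the plain string-prompt completion interface will not work:

import { ChatOpenAI, OpenAI } from '@langchain/openai';
import { HumanMessage } from '@langchain/core/messages';

// Chat-completion model: takes a list of messages and returns a message.
// This is the interface the Agent node relies on.
const chatModel = new ChatOpenAI({ model: 'gpt-4o-mini' });
const chatReply = await chatModel.invoke([new HumanMessage('Hello!')]);

// Traditional completion model: takes a raw prompt string and returns a string.
// Models exposed only through this interface cannot drive the Agent node.
const completionModel = new OpenAI({ model: 'gpt-3.5-turbo-instruct' });
const completionText = await completionModel.invoke('Hello!');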
