Adding custom Chat Models or dynamic Credential ID

Hi everyone,

Any idea how to overcome this strange limitation?

Oleg seems to have hardcoded which LLM Chat Model nodes can be used by the AI Agent node, which means you cannot create a custom LLM node and link it to the “Chat Model” input, because it will never pass this filter.
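For context, here is a minimal sketch of what a custom chat-model node description might look like, assuming n8n’s convention that AI sub-nodes expose the `ai_languageModel` connection type (the field names here are illustrative, not a verified n8n API surface):

```typescript
// Hypothetical node description for a custom chat model sub-node.
// Assumption: n8n AI sub-nodes declare the 'ai_languageModel' output type
// to connect to the AI Agent's "Chat Model" port.
const description = {
  displayName: "My Custom Chat Model",
  name: "myCustomChatModel",
  group: ["transform"],
  version: 1,
  inputs: [],
  // Declaring this output type is what normally makes a node attachable
  // to the Agent's "Chat Model" input...
  outputs: ["ai_languageModel"],
  credentials: [{ name: "myCustomApi", required: true }],
  properties: [],
};

// ...but, per the issue above, the Agent additionally filters by a
// hardcoded list of allowed node names, so even a node with the correct
// output type is rejected.
console.log(description.outputs[0]);
```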

Re-creating the entire agent as a custom node seems nearly impossible due to the number of dependencies it has, or it would make the custom node package unnecessarily bloated.

The reason I was trying to make a custom Chat Model is so that I can pass a custom credential_id, instead of the credential that is hardcoded and baked into a workflow at creation time. A dynamic credential_id would let different teams/people/customers reuse the same workflow (multi-tenancy), especially when each user only has a personal token.

I’d appreciate any feedback.

So far I’ve created a custom chat model and credential type, but I cannot link it to the AI Agent node. I’ve also started drafting a PR to allow credential_ids to be set via an expression, but I’m not sure this is the direction the n8n team wants to take in general, so I wanted to have a discussion first.

n8n version: docker.n8n.io/n8nio/n8n latest
Database (default: SQLite): SQLite
n8n EXECUTIONS_PROCESS setting (default: own, main): default
Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
Operating system: macOS 15.3.1

Thanks


I am looking to do basically the exact same thing (get an external credential from a MySQL DB before I call a chat completion via a chat model).

I’ve gone as far as creating a new n8n workflow to act as the “chat model” and return an OpenAI-compatible response. I made a “fake” OpenAI credential in n8n with the proper base URL pointing at that workflow.

I get the “Cannot read properties of undefined (reading ‘content’)” error in the main flow’s AI Agent.
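That error suggests the Agent is looking up `choices[0].message.content` and `message` is missing from the response your workflow returns. As a sanity check, here is the minimal shape of a standard OpenAI chat completions response (field values are placeholders):

```typescript
// Minimal OpenAI-compatible /v1/chat/completions response shape.
// All values here are illustrative placeholders.
const response = {
  id: "chatcmpl-123",
  object: "chat.completion",
  created: 1700000000,
  model: "my-proxy-model",
  choices: [
    {
      index: 0,
      // The nested `message` object is the part that must exist:
      // clients read choices[0].message.content. If the workflow
      // returns e.g. a `text` or `delta` field instead, that lookup
      // fails with "Cannot read properties of undefined (reading 'content')".
      message: { role: "assistant", content: "Hello!" },
      finish_reason: "stop",
    },
  ],
  usage: { prompt_tokens: 1, completion_tokens: 1, total_tokens: 2 },
};

console.log(response.choices[0].message.content);
```

Worth comparing against what your “chat model” workflow actually returns, including the `object` field and the `choices` array nesting.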

Has anyone found a suitable workaround or option?