I want to integrate my custom chat model node with n8n. This custom chat model class supports multiple providers, such as OpenAI and Gemini, and is hosted as a separate private REST API application. I need to connect this private application to n8n as a chat model.
Is it possible to use n8n's standard custom node creation process to link this custom node? I have seen sample PRs suggesting that modifications are required in both the nodes-base and nodes-langchain packages.
Yes, it’s possible to integrate your private REST API with n8n as a custom chat model node. You don’t necessarily have to modify the nodes-base or nodes-langchain packages unless you want your node merged into the official n8n repo. For your own setup, you can:
- Create a custom n8n node following the standard custom node development process.
- Inside the node, make API calls to your private chat model service and handle inputs/outputs in the same JSON structure n8n expects.
If you want the node to behave like the built-in AI/LLM nodes (with full LangChain support), then yes, extending nodes-langchain may be required. But for private use, a standalone custom node is enough.
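To make the standalone-node approach concrete, here is a minimal TypeScript sketch of the request/response mapping such a node's `execute()` would perform. Everything about the private API is an assumption for illustration: the `/v1/chat` route, the `model`/`messages` request shape, the `reply` response field, and the `prompt` input field are hypothetical placeholders you would replace with your service's actual contract. The `Item` interface is a simplified stand-in for n8n's `INodeExecutionData`, so the sketch stays self-contained.

```typescript
// Simplified stand-in for n8n's INodeExecutionData: each item carries a
// `json` payload between nodes.
interface Item {
  json: Record<string, unknown>;
}

// Request body our hypothetical private chat API accepts. The shape is
// an assumption, not a real n8n or provider contract.
interface ChatRequest {
  model: string; // e.g. "openai:gpt-4o" or "gemini:..." (assumed naming)
  messages: { role: string; content: string }[];
}

// Build the API payload from an incoming n8n item. Reading a `prompt`
// field is an assumption; adjust to match your workflow's data.
function buildChatRequest(item: Item, model: string): ChatRequest {
  const prompt = String(item.json.prompt ?? "");
  return { model, messages: [{ role: "user", content: prompt }] };
}

// Map the API's reply back into an n8n-style item so downstream nodes
// receive ordinary JSON.
function toItem(reply: string): Item {
  return { json: { reply } };
}

// Inside a real node's execute(), you would loop over getInputData()
// and call your service per item; fetch is built into Node 18+.
async function callChatModel(
  baseUrl: string,
  req: ChatRequest,
  apiKey: string,
): Promise<Item> {
  const res = await fetch(`${baseUrl}/v1/chat`, {
    // "/v1/chat" is a hypothetical route on your private service.
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`Chat API error: ${res.status}`);
  const data = (await res.json()) as { reply: string };
  return toItem(data.reply);
}
```

In a real node you would wrap this in a class implementing `INodeType` from the n8n-workflow package, and store the API key via an n8n credentials type rather than passing it as a parameter.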
I can help you decide the best approach based on whether you want this integration for private use only or as a public contribution.