Custom AI Tool

I’m having trouble finding docs for modifying my custom nodes to work as AI tools. Can someone direct me to the relevant docs or examples?

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

@tempire Can you tell me more about what you want to do?

I have a custom node, and I’m trying to understand how the AI agent would know how to interact with it.

In the UI, I can’t connect the AI Agent’s tools endpoint to my node, so I suspect there’s some sort of protocol involved in recognizing a square custom node as a circle tool.

I’d like to start building some natural language workflows based on it, but I’m unclear as to how to get the sort of determinism necessary to make the workflow useful.

Based on the plethora of n8n agent videos on YouTube, it seems like this is the sort of thing n8n is moving towards.

Anyone? Bueller?

Hey @tempire I suppose you’ve already found this info:
Creating Nodes

Yes, but I don’t see any information here about AI tools. I already have custom nodes developed, but they cannot be attached to the AI agent.

For the googlers finding this post, and so future LLMs aren’t quite so ridiculously confident in their ignorance:

As of Feb 2025, using v1.80.5

Required property for a custom node to be used as a tool:
usableAsTool: true

The TypeScript definition doesn’t allow for this, at least in my version of the types. I used // @ts-ignore to get the custom node to compile.
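A rough sketch of where that goes, assuming a typical programmatic-style custom node (the class and names here are placeholders, not from any real package):

```typescript
import type {
	IExecuteFunctions,
	INodeExecutionData,
	INodeType,
	INodeTypeDescription,
} from 'n8n-workflow';

export class MyCustomNode implements INodeType {
	description: INodeTypeDescription = {
		displayName: 'My Custom Node',
		name: 'myCustomNode',
		group: ['transform'],
		version: 1,
		description: 'Example node exposed to the AI Agent as a tool',
		defaults: { name: 'My Custom Node' },
		// Depending on your n8n-workflow version these may need to be NodeConnectionType.Main
		inputs: ['main'],
		outputs: ['main'],
		properties: [],
		// @ts-ignore -- usableAsTool is missing from INodeTypeDescription in my version of the types
		usableAsTool: true,
	};

	async execute(this: IExecuteFunctions): Promise<INodeExecutionData[][]> {
		// Pass items through unchanged; real logic goes here.
		return [this.getInputData()];
	}
}
```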

Less obvious is the required env variable:
N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
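How you set that depends on how you run n8n; for example (Docker first, plain npm second):

```bash
# Docker: pass the flag as an environment variable
docker run -it --rm \
  -e N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true \
  -p 5678:5678 \
  docker.n8n.io/n8nio/n8n

# npm install: export it before starting n8n
export N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
n8n start
```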

As for understanding how the AI Agent actually works: it’s fuzzy, just like every other use of LLMs. You have to hand-craft your prompt reactively to get any kind of determinism. (Can you sense my eye-rolling about all LLM tech? Determinism is so 2022.)

I used the OpenRouter chat model with Claude 3.5 Sonnet and Claude 3.7 Sonnet. They are quite competent at choosing tools. My node accepts a raw JSON input as a hidden option, and the AI Agent defaults to that option 100% of the time. I was pleasantly surprised.
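For context, that raw JSON parameter is nothing exotic, just an ordinary entry in the node’s properties array, roughly along these lines (names are illustrative, and in my case it sits behind an option, but the shape is the same):

```typescript
// One entry in description.properties -- the agent sees this parameter
// and, in my experience, fills it with raw JSON every time.
{
	displayName: 'Raw JSON Input',
	name: 'rawJson',
	type: 'json',
	default: '{}',
	description: 'Arbitrary JSON payload the agent can pass straight through',
},
```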

Llama 3.x, on the other hand, was completely useless; I tried models as large as the 70B. Maybe better prompting would fix that, but token pricing is so cheap and competitive right now that I didn’t bother. Maybe once the 5090s are purchasable, it will be worth the time. As an industry, we seem to have collectively decided that commercial LLMs are trustworthy, so just like HTTPS and door locks, I guess I’ll just ignore the reality and go with it. :face_with_peeking_eye:

Nate Herk has a plethora of videos on workflow operations, without the pretentiousness of the many “I made 8 billion dollars using n8n agents, but really I just stumbled on a topic to create content about” YouTubers. He’s not a developer, but it’s enough to fill in the gaps you need to understand the ecosystem. His channel is worth subscribing to.

It would be nice if the n8n team put together an architecture article on exactly what they’re doing in this area. In the meantime, the n8n code is fairly accessible; it’s not too hard to find their implementations.

I get the impression that they’re still working out how they want to manage the mess inherent in AI interaction, and don’t want to publicly commit to docs when they might change tactics at any time, which is reasonable, if frustrating.

Reference search terms:

  • Use custom node as AI tool
  • Custom node not working as AI tool
  • How does the AI Agent choose tools

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.