Anthropic has published a new specification that lets AI discover available tools via their Model Context Protocol.
Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external data sources and tools. Whether you’re building an AI-powered IDE, enhancing a chat interface, or creating custom AI workflows, MCP provides a standardized way to connect LLMs with the context they need.
Let n8n:
consume MCP servers via JSON-RPC
act as an MCP server for other workflows or other consumers
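To make the first point concrete: MCP tool discovery is plain JSON-RPC 2.0 underneath. A minimal sketch of what an MCP client (e.g. an n8n node) would send and receive — the tool name `search_docs` and its schema here are made up for illustration, not from any real server:

```python
import json

# MCP uses JSON-RPC 2.0; a client discovers a server's tools by sending
# a "tools/list" request (method name per the MCP specification).
def make_tools_list_request(request_id: int) -> str:
    """Serialize a JSON-RPC 2.0 request asking the server to list its tools."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    })

# Example of the kind of response a server might send back.
# The tool itself ("search_docs") is hypothetical.
example_response = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "search_docs",
                "description": "Search internal docs",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                },
            },
        ]
    },
})

def extract_tool_names(response: str) -> list[str]:
    """Pull the advertised tool names out of a tools/list response."""
    return [tool["name"] for tool in json.loads(response)["result"]["tools"]]

print(extract_tool_names(example_response))  # ['search_docs']
```

The point is that the client doesn't need to know the tools ahead of time — it asks the server at runtime, so an n8n AI node could populate its tool list dynamically.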
My use case:
I think it would be beneficial to add this because:
Instead of tools needing to be known when a workflow is built, the set of tools available to a consumer can grow as the context provider evolves.
This means that if n8n AI nodes could leverage MCP, both the efficiency and the range of problems solvable with AI could be progressively enhanced.
In the future, I imagine that instead of building all the “tools” itself, n8n will interact with “tools” via MCP. That way it can leverage the existing work of others.
Let me share another idea related to MCP, aside from consuming MCP servers as tools.
If an n8n workflow can act as an MCP server, we can integrate our own custom AI-powered tools with n8n. Sometimes it would be great to build a custom AI-powered app for internal company use, and being able to leverage n8n workflows from such custom AI tools outside the platform would be awesome.
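A rough sketch of what the server side could look like: a dispatcher that answers the two core MCP requests (`tools/list` and `tools/call`) and forwards calls to a workflow. Everything n8n-specific here is hypothetical — `run_workflow` is a stand-in for triggering an actual workflow execution, and `run_my_workflow` is an invented tool name:

```python
import json

def run_workflow(arguments: dict) -> str:
    # Hypothetical stand-in for triggering an n8n workflow execution
    # and collecting its output.
    return f"workflow ran with {arguments}"

def handle_request(raw: str) -> str:
    """Answer a JSON-RPC 2.0 request the way a minimal MCP server would."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        # Advertise the workflow as a single callable tool.
        result = {"tools": [{
            "name": "run_my_workflow",
            "description": "Execute a custom n8n workflow",
            "inputSchema": {"type": "object"},
        }]}
    elif req["method"] == "tools/call":
        # Forward the call's arguments to the workflow and wrap the output.
        output = run_workflow(req["params"].get("arguments", {}))
        result = {"content": [{"type": "text", "text": output}]}
    else:
        # Standard JSON-RPC error for unknown methods.
        return json.dumps({"jsonrpc": "2.0", "id": req.get("id"),
                           "error": {"code": -32601, "message": "Method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
```

With something like this, any MCP-aware client (a chat app, an IDE, another workflow) could list and invoke internal n8n workflows without knowing anything about n8n itself.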
The other benefit to n8n is that this opens up the entire world of existing MCP servers as AI plugins for n8n, without having to write any code beyond the MCP client integration. Check out directories of MCP servers like https://mcp.so/ and Open-Source MCP servers | Glama.