It would help if there was a node for:
It would help if the existing Hugging Face Inference Model node was enhanced to support function calling (tools).
This is critical for enabling the node to work seamlessly with the AI Agent node, unlocking the ability to build advanced, multi-step AI workflows and agentic automations using the vast ecosystem of open-source models available on Hugging Face.
Proposed Solution
The implementation would involve the following changes:
- Add tools Parameter:
  - UI: Implement a "Fixed or Dynamic List" field in the node's UI.
  - Functionality: Users can define available tools in the required JSON format, compatible with the Hugging Face API specification. This ensures full backward compatibility, as existing workflows will continue to function without change if the field is empty.
  - Example Tool Definition:

  ```json
  {
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "get_current_weather",
          "description": "Get the current weather for a specific location",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "The city and state, e.g., San Francisco, CA"
              }
            },
            "required": ["location"]
          }
        }
      }
    ]
  }
  ```
- Add tool_choice Parameter:
  - UI: Add a string or options field under "Advanced Options."
  - Functionality: Allow users to control the model's tool usage, with options like auto (default), none, required, or forcing a specific function.
- Handle API Response:
  - Update the node's execution logic to parse the API response for a tool_calls object.
  - If tool_calls is present, format it as structured JSON in the node's output, making it easily consumable by subsequent nodes.
  - Example tool_calls Output:

  ```json
  {
    "tool_calls": [
      {
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "get_current_weather",
          "arguments": "{\"location\": \"San Francisco\"}"
        }
      }
    ]
  }
  ```
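To make the response-handling step concrete, here is a minimal sketch of the parsing logic described above. All names (the interfaces and the extractToolCalls helper) are illustrative assumptions, not existing n8n internals; only the tool_calls shape comes from the API examples above:

```typescript
// Hypothetical helper illustrating the proposed response handling.
// The interfaces mirror the OpenAI-compatible message shape that the
// Hugging Face chat completions API returns; the names are assumptions.

interface ToolCall {
  id: string;
  type: "function";
  function: { name: string; arguments: string };
}

interface ChatCompletionMessage {
  content?: string | null;
  tool_calls?: ToolCall[];
}

// Return structured tool calls when present, parsing the JSON-encoded
// arguments string; otherwise return null so that existing text-only
// workflows are left completely untouched.
function extractToolCalls(message: ChatCompletionMessage) {
  if (!message.tool_calls || message.tool_calls.length === 0) {
    return null;
  }
  return message.tool_calls.map((call) => ({
    id: call.id,
    name: call.function.name,
    arguments: JSON.parse(call.function.arguments) as Record<string, unknown>,
  }));
}
```

Returning null for tool-free responses is what preserves backward compatibility: downstream nodes only ever see a new output field when the model actually requested a tool call.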
My use case:
Currently, the “Hugging Face Inference Model” node cannot be used as a “Chat Model” within the “AI Agent” node because it lacks support for tools. This is a significant limitation that prevents users from building multi-agent workflows with powerful and cost-effective open-source models and creating sophisticated automations that require interaction with external APIs.
This forces users into complex workarounds, like using the “HTTP Request” node. Direct integration is a far more robust and scalable solution for the entire community.
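To illustrate what the workaround entails, here is a rough sketch of the request body an "HTTP Request" node has to assemble by hand to use tools. The function name and the auto default are my own assumptions for illustration; the body shape follows the OpenAI-compatible chat completions format:

```typescript
// Hypothetical helper showing the payload a manual "HTTP Request"
// workaround must build. buildChatRequest is an illustrative name,
// not an n8n or Hugging Face API.
function buildChatRequest(
  model: string,
  prompt: string,
  tools?: object[],
): Record<string, unknown> {
  const body: Record<string, unknown> = {
    model,
    messages: [{ role: "user", content: prompt }],
  };
  // Attach tool fields only when tools are defined, so requests
  // without tools stay identical to today's behavior.
  if (tools && tools.length > 0) {
    body.tools = tools;
    body.tool_choice = "auto"; // assumed default, mirroring the proposal
  }
  return body;
}
```

Every piece of this (plus authentication, response parsing, and the agent loop) currently has to be wired up manually per workflow, which is exactly what native node support would eliminate.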
Benefits
- For Users: Seamlessly build powerful AI agents using open-source Hugging Face models and create cost-effective multi-agent automations.
- For n8n: Strengthens n8n's position as a leading platform for open-source AI automation and attracts developers seeking flexible, vendor-neutral AI solutions.
- For the Ecosystem: Promotes the adoption and practical application of open-source AI.
Any resources to support this?
Yes. There is clear community demand for using open-source models with the AI Agent node, and the API feature is covered by official documentation.
Technical References:
- Hugging Face Function Calling Guide: https://huggingface.co/docs/inference-providers/en/guides/function-calling
Are you willing to work on this?
Yes. I have mapped out a detailed implementation and contribution plan and am willing to contribute a Pull Request for this feature. I am opening this topic to get feedback from the maintainers and the community on the proposed approach before starting development.
To ensure a high-quality contribution, I have also considered the following testing plan:
- Backward Compatibility: Existing workflows that do not use the tools parameter must function without any changes.
- Edge Cases: The node should gracefully handle cases where a model does not support function calling.
- Multi-Model Testing: The implementation will be tested against several models known to support function calling (e.g., Llama-3.1 and Gemma-2).
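The edge-case item above could look something like this sketch: turning a provider rejection of the tools parameter into an actionable message instead of a raw HTTP failure. The status/message heuristic is an assumption about typical provider responses, not a documented contract:

```typescript
// Sketch of graceful degradation for models without tool support.
// The 400 + message-text heuristic is an assumption; real providers
// vary, and the final implementation would need per-provider checks.
function describeToolSupportError(
  status: number,
  apiMessage: string,
): string | null {
  if (status === 400 && /tool/i.test(apiMessage)) {
    return (
      "The selected model does not appear to support function calling. " +
      "Clear the tools parameter or choose a tool-capable model."
    );
  }
  // Unrelated errors fall through to the node's normal error handling.
  return null;
}
```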