Hi,
my company and I are really excited to be testing out n8n to make an educated decision on whether it is the tool we move forward with into the era of agentic AI. As part of that, we are trying to run our local Llama 3.3 Super 49B with the n8n AI Agent and have run into a problem:
Describe the problem/error/question
I have an on-prem hosted NVIDIA NIM container running Llama 3.3 Super 49B:
I set everything up, connected it all to an AI Agent, and tried to chat. Worked like a charm.
But the next step is to let the AI Agent use tools. So I changed the agent type to “Tools Agent” and added a simple Webex tool that sends a message to a room.
THE PROBLEM: the LLM correctly identifies the tool and decides to use it, but the tool never actually gets called. Instead, the tool call is output as plain text in the chat:
This is also visible in the executed nodes, which show that the Webex tool is never used:
Please share your workflow
Share the output returned by the last node
The AI Agent then outputs:
[{
  "output": "[{\"name\": \"Create_a_message_in_Webex_by_Cisco\", \"arguments\": {\"Text\": \"Hi Test\"}}]"
}]
Expected output
We would expect the local LLM to be able to use tools from the n8n Tools Agent and to get back to the user with the output once the tool calls have finished.
Just FYI: with GPT models (via the OpenAI API), for example, this exact workflow works.
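
To help narrow this down, here is a minimal sketch of how we can query the NIM endpoint directly (outside of n8n) to check whether it returns structured tool_calls or only emits the call as plain text. The host, port, and model id below are placeholders/assumptions for our local deployment, not values taken from the workflow:

```python
# Minimal sketch: ask the OpenAI-compatible NIM endpoint to call a tool and see
# whether it responds with structured tool_calls or with the call embedded in content.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local NIM address/port
    api_key="not-needed",                 # local NIM typically needs no API key
)

tools = [{
    "type": "function",
    "function": {
        "name": "Create_a_message_in_Webex_by_Cisco",
        "description": "Send a message to a Webex room",
        "parameters": {
            "type": "object",
            "properties": {"Text": {"type": "string"}},
            "required": ["Text"],
        },
    },
}]

resp = client.chat.completions.create(
    model="llama-3.3-nemotron-super-49b",  # placeholder model id, check /v1/models
    messages=[{"role": "user", "content": "Send 'Hi Test' to the Webex room."}],
    tools=tools,
)

msg = resp.choices[0].message
# If tool calling works end to end, tool_calls is populated; if it is None and the
# call only shows up as text in content, that matches the behaviour seen in n8n.
print("tool_calls:", msg.tool_calls)
print("content:", msg.content)
```

If tool_calls comes back empty and the call only appears in content, that would point at the model/endpoint rather than the n8n agent node.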
Looking forward to reading your ideas to get this going,
Flo
Information on your n8n setup
- n8n version: 1.100.0
- Database (default: SQLite): default
- n8n EXECUTIONS_PROCESS setting (default: own, main): default
- Running n8n via (Docker, npm, n8n cloud, desktop app): docker-compose self-hosted
- Operating system: