Hello, I tried connecting a local LLM to n8n this week, but unfortunately it didn't work as expected. I attempted to integrate it both via Ollama and as an OpenAI-compatible endpoint, yet I consistently ran into problems with function calling: either the model couldn't call any function at all, or it only ever called one function and never a second one, even when I explicitly requested it in the prompt.

I spent the most time trying to get the new Mistral Small 3.1 running. With vLLM I couldn't get tool calling to work at all, and with a q4-quantized Ollama version it only ever called one function. I would appreciate it if someone with experience in this area could get in touch.
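To narrow down whether the problem is n8n or the model/server, it can help to test tool calling directly against the OpenAI-compatible endpoint, outside of n8n. Below is a minimal sketch of such a request payload; the model name, the `get_weather` tool, and the endpoint URL in the comments are placeholders, not anything from my actual setup.

```python
import json

def build_tool_call_request(model: str) -> dict:
    """Build a minimal OpenAI-style /v1/chat/completions payload
    with one tool, to test function calling outside n8n."""
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                # A prompt that should trigger two separate tool calls.
                "content": "What is the weather in Berlin, and then in Paris?",
            }
        ],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical example tool
                    "description": "Return the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        "tool_choice": "auto",
    }

if __name__ == "__main__":
    payload = build_tool_call_request("mistral-small3.1")
    # POST this JSON to your local endpoint, e.g.
    # http://localhost:11434/v1/chat/completions for Ollama,
    # or the vLLM server's /v1/chat/completions.
    print(json.dumps(payload, indent=2))
```

If the raw response here also never contains a second `tool_calls` entry, the limitation is in the model or the serving layer rather than in the n8n node configuration.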
Information on your n8n setup
- n8n version: 1.83.2
- Database (default: SQLite):
- Running n8n via: Docker