Ollama and ToolAgent

Does the Tools Agent support Ollama? The docs don’t list it as supported. I know Ollama added Tools support and I see you can add an Ollama Chat Model to a Tools Agent and it doesn’t error, but it’s not working reliably for me. I get random and weird results.

If not, are there any plans to support it soon?

  • n8n version: 1.66
  • Database (default: SQLite): Postgres
  • n8n EXECUTIONS_PROCESS setting (default: own, main): default
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Windows 11
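
For reference, a quick way to confirm that the model itself returns tool calls at all, independent of n8n, is to call Ollama's chat API directly with a tool definition. A minimal sketch; the model name, URL, and the tool are placeholders for whatever you run locally:

```python
# Minimal check (outside n8n) that the local model emits tool calls at all.
# Assumes Ollama listens on localhost:11434 and a tool-capable model is pulled.
import json
import requests

payload = {
    "model": "llama3.2",  # placeholder: any tool-capable model you have pulled
    "stream": False,
    "messages": [{"role": "user", "content": "What time is it right now?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_current_datetime",
            "description": "Return the current date and time",
            "parameters": {"type": "object", "properties": {}},
        },
    }],
}

resp = requests.post("http://localhost:11434/api/chat", json=payload, timeout=120)
resp.raise_for_status()
message = resp.json()["message"]

# A tool-capable model should answer here with "tool_calls" instead of plain text.
print(json.dumps(message, indent=2))
```

If no `tool_calls` ever shows up in that response, the model (or its chat template) is the problem rather than the Tools Agent node.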

I have it working, but the model needs to support tools. Llama 3.2 supports them. I have tested it with an HTTP API call tool and a function that returns the current date and time.

At the moment I can't seem to get the model to send the tool any input. I have a post open about it, but no answer so far.
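
A way to see whether the model fills in tool arguments at all, outside n8n, is the official `ollama` Python client. A rough sketch; the weather tool and its `city` parameter are made up just for this test:

```python
# Rough sketch: does llama3.2 actually populate tool arguments?
# Requires the official client: pip install ollama (0.4+ for attribute access).
import ollama

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool, only used to inspect arguments
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string", "description": "City name"}},
            "required": ["city"],
        },
    },
}]

response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=tools,
)

for call in response.message.tool_calls or []:
    # If the model behaves, arguments should contain {"city": "Berlin"} here.
    print(call.function.name, call.function.arguments)
```

If the arguments come back empty here too, it's the model (or the prompt) dropping the input rather than n8n.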

I’ve been using Llama 3.2 as well and have seen similar problems. Good to know it's not just me having issues.

Any word on when n8n will officially support the Ollama Chat Model with the Tools Agent?

I seem to have the same issue. I'm using the Ollama Chat Model to interact with a Qdrant DB; however, it seems the chat model doesn't pass the user's question on to the Qdrant vector store to search for answers. It then returns strange answers, or doesn't respond to the user's chat questions at all.
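
One way to narrow this down is to query Qdrant directly with an embedding from Ollama, taking the agent out of the picture entirely. A rough sketch; the collection name and the embedding model are assumptions, so swap in whatever your setup uses:

```python
# Sanity-check the retrieval path on its own: embed the question with Ollama,
# then search Qdrant over REST. Assumes default local ports and that the
# collection was built with the same embedding model ("nomic-embed-text" here).
import requests

question = "What does the onboarding document say about VPN access?"  # example question

# 1) Embed the question with Ollama
emb = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": question},
    timeout=60,
).json()["embedding"]

# 2) Search the Qdrant collection with that vector
hits = requests.post(
    "http://localhost:6333/collections/my_docs/points/search",  # "my_docs" is a placeholder
    json={"vector": emb, "limit": 3, "with_payload": True},
    timeout=60,
).json()["result"]

for hit in hits:
    print(round(hit["score"], 3), hit["payload"])
```

If the right chunks come back here, the retrieval side is fine and the failure is in the agent step, which matches what others are seeing with the Tools Agent and Ollama.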

I have the same problem. Have you found a solution?

It also reads data from vector storage incorrectly.

Unfortunately, no.

I am also facing the same issue.

Currently I am building a RAG system with Ollama and the Qdrant vector store, and I am getting a proper response from Retrieve Documents.

But after that, when it goes to the primary Ollama Chat Model, it gives a random response or a "no data found" response…

If I attach the OpenAI Chat Model instead, it works as expected.
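
One stopgap that keeps everything local is to skip the agent's tool-call step and paste the retrieved chunks straight into the prompt for the Ollama model (classic non-agentic RAG). A rough sketch; the chunks are hard-coded stand-ins for whatever Retrieve Documents returns:

```python
# Workaround sketch: plain RAG without tool calling. Small local models tend to
# handle "context pasted into the prompt" better than deciding to call a retriever tool.
import requests

question = "What is the refund policy?"  # example user question
chunks = [                               # stand-ins for the retrieved documents
    "Refunds are possible within 30 days of purchase.",
    "Refunds are issued to the original payment method.",
]

prompt = (
    "Answer the question using only the context below.\n\n"
    "Context:\n" + "\n---\n".join(chunks) + "\n\n"
    f"Question: {question}"
)

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3.2", "prompt": prompt, "stream": False},
    timeout=120,
).json()

print(resp["response"])
```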

Same issue here. I tried with a local n8n setup connected to a local Ollama instance running the llama3.2 model. The model does not use the tool as instructed in the system prompt.

The same prompt with an OpenAI model works fine.
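
If you want to compare the two backends with an identical request, Ollama exposes an OpenAI-compatible endpoint under `/v1`, so the same client call can be pointed at either one. A rough sketch; the tool and system prompt are placeholders:

```python
# Same request shape against both backends: change base_url and model to hit OpenAI instead.
# Ollama serves an OpenAI-compatible API under /v1; the api_key just needs to be non-empty.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_datetime",
        "description": "Return the current date and time",
        "parameters": {"type": "object", "properties": {}},
    },
}]

completion = client.chat.completions.create(
    model="llama3.2",
    messages=[
        {"role": "system", "content": "Always use the provided tools to answer."},
        {"role": "user", "content": "What time is it?"},
    ],
    tools=tools,
)

# May come back as a list of tool calls, or None when the model ignores the tools,
# which matches what we're seeing in n8n.
print(completion.choices[0].message.tool_calls)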

I’ve given up on using n8n with Ollama. It’s clear they don’t really want to spend any time on it. I’ve had much more luck with other agent frameworks. My favorite right now is PydanticAI. I miss the easy low-code approach of n8n, but it's really not that big of a jump to a Python framework.
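
For anyone curious what that jump looks like, this is roughly the shape of a PydanticAI agent driving a local Ollama model through its OpenAI-compatible endpoint. The model/provider wiring has changed across pydantic-ai releases, so treat the exact imports and keyword arguments as assumptions to check against the version you install:

```python
# Rough sketch of a PydanticAI agent backed by local Ollama (OpenAI-compatible API).
# NOTE: older pydantic-ai releases take base_url/api_key on OpenAIModel directly
# instead of a provider object, and expose the result as result.data.
from datetime import datetime

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

model = OpenAIModel(
    "llama3.2",
    provider=OpenAIProvider(base_url="http://localhost:11434/v1", api_key="ollama"),
)

agent = Agent(model, system_prompt="Use your tools to answer questions about time.")


@agent.tool_plain
def get_current_datetime() -> str:
    """Return the current date and time as an ISO 8601 string."""
    return datetime.now().isoformat()


result = agent.run_sync("What time is it right now?")
print(result.output)
```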

Same problem here, and I applied the same solution as you. I tried different Ollama models running n8n 1.82.1 locally, but it fails. According to some videos on YouTube, it worked in the past! I hope the n8n team works on this…

This is super disappointing. I really wish Ollama had more support here, as there's so much that could be done!

Hi! I really hope that one day n8n supports the Tools Agent with Ollama. Meanwhile, I'll give you another option for building agents with Ollama and tool calls: use FlowiseAI. It can be deployed locally too, and it's a product like n8n and Langflow. I tested all three, and the only one that works for tool calls with Ollama is Flowise. Important: you must use the "Conversational Agent" and NOT the "Tool Agent". All my tests were with qwen2.5:7b calling tools like GetCurrentDatetime and Composio (I send emails through my Gmail account in the workflow with Ollama). Lastly, if you like coding, you can also use OpenWebUI to develop tools (or use the community ones) and use them in its chat (it also works with Ollama and qwen2.5:7b).
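
To make the OpenWebUI suggestion concrete: its tools are plain Python files exposing a `Tools` class whose typed, documented methods become callable by the model. A minimal sketch of the kind of tool mentioned above; check the Open WebUI docs for the optional metadata header and valves, which vary by version:

```python
"""Minimal Open WebUI tool sketch: add it under Workspace -> Tools and enable it for a chat."""
from datetime import datetime


class Tools:
    def __init__(self):
        pass

    def get_current_datetime(self) -> str:
        """Return the current date and time as an ISO 8601 string."""
        return datetime.now().isoformat()
```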

Wait…what?!

I've been investing my time in learning n8n under the assumption that this would be possible, so that I could have fully local AI agents…

Why would n8n not support tool calls with Ollama when other solutions clearly can?

Is this really so?