I'm not able to use tools on my local DeepSeek

Hi,

I'm running a DeepSeek R1 Distilled Qwen model on a server with vLLM, and a self-hosted n8n to use it.

In normal cases it works well, but when I add some tools it doesn't work.

I've enabled it in the vLLM configuration, and when I tried sending a request using curl (or HTTP) it responds with a tool call, but in an AI Agent it doesn't work at all.
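For context, here is a minimal sketch of the OpenAI-style request body that a tool-enabled client would POST to vLLM's /v1/chat/completions endpoint. The `calculator` tool schema is an assumption reconstructed from the response shown further down, not the exact payload n8n sends:

```python
import json

# Hypothetical OpenAI-compatible chat request with one attached tool,
# matching the `calculator` call visible in the vLLM response below.
payload = {
    "model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-32B",
    "messages": [{"role": "user", "content": "What is 5*8?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "calculator",
                "description": "Evaluate a basic arithmetic expression.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "expression": {"type": "string"},
                    },
                    "required": ["expression"],
                },
            },
        }
    ],
}

# This is the JSON body that would go over the wire.
print(json.dumps(payload, indent=2))
```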

Here is a little example:

The yellow triangle says:

None of your tools were used in this run. Try giving your tools clearer names and descriptions to help the AI

Doing some research, if I send an HTTP request directly to /chat/completions, it returns this:

[
  {
    "id": "chatcmpl-4e5101d98a85408ea0d033ec740528ee",
    "object": "chat.completion",
    "created": 1755006486,
    "model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-32B",
    "choices": [
      {
        "index": 0,
        "message": {
          "role": "assistant",
          "reasoning_content": null,
          "content": "",
          "tool_calls": [
            {
              "id": "chatcmpl-tool-d9449369ad674ff8808b99f8acb230a5",
              "type": "function",
              "function": {
                "name": "calculator",
                "arguments": "{\"expression\": \"5*8\"}"
              }
            }
          ]
        },
        "logprobs": null,
        "finish_reason": "stop",
        "stop_reason": null
      }
    ],
    "usage": {
      "prompt_tokens": 322,
      "total_tokens": 342,
      "completion_tokens": 20,
      "prompt_tokens_details": null
    },
    "prompt_logprobs": null,
    "kv_transfer_params": null
  }
]

Hey @ServiciosIT

  • Your tool is connected to the memory connector instead of Tools
  • Your tool name is Calculator1, which you probably want to change to Calculator, since that is the name you told the LLM the tool has.

Hi @jabbson

Thanks for your reply. It's just a visual bug; here is another example in a new workflow:

I was reading the Hugging Face page for this model and it doesn't appear to be tool-capable at all. Not every LLM is able to call tools.
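One practical consequence: a model without tool support typically answers in plain text even when tools are attached, so a client should check for `tool_calls` before dispatching anything. A hedged sketch of that guard (the helper name is my own, not an n8n or vLLM API):

```python
def extract_tool_calls(choice: dict) -> list:
    """Return the tool calls from one chat-completion choice,
    or an empty list when the model answered with plain text only."""
    return choice.get("message", {}).get("tool_calls") or []

# A tool-capable model returns structured calls...
with_tools = {
    "message": {
        "tool_calls": [
            {"type": "function",
             "function": {"name": "calculator", "arguments": "{}"}}
        ]
    }
}
# ...while a model lacking tool support just writes text.
text_only = {"message": {"role": "assistant", "content": "5*8 = 40"}}

print(len(extract_tool_calls(with_tools)))  # 1
print(len(extract_tool_calls(text_only)))   # 0
```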

I've been running DeepSeek-R1 Distilled Qwen via vLLM and n8n; it works well without tools, but the AI Agent tool integration fails, likely due to inference-server or prompt misconfiguration. Consider running DeepSeek locally via LM Studio for an easier setup, full offline control, and smoother tool-based workflows.

I found a github issue where they talk about that in exactly that distilled model: [Usage]: How to use DeepSeek-R1-0528-Qwen3-8B with function call · Issue #19001 · vllm-project/vllm · GitHub

I've installed a smaller version of the same distilled model and hit the same trouble.
Finally I installed Qwen3 directly (Qwen3-30B-A3B-Thinking) and it works well.

I think I previously mentioned that the first model doesn’t show tool usage as one of its capabilities.

When you changed the model, you changed it to the one that does have such capability.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.