Ollama Chat Model options missing in Summarization Chain

Hello,

I’ve set up a local Ollama server running Gemma3:27b, and I’m experimenting with the Summarization Chain to summarize large documents. I’m trying out both the Ollama Model and the Ollama Chat Model nodes attached to the Summarization Chain, and on both nodes I’ve set a few options such as temperature and context length. After every test run I open the attached model node to inspect its input and output. What I’ve noticed is that with the Ollama Chat Model node the options I’ve set are not sent along with the request, while with the Ollama Model node they are.

For example, this is the JSON input to the Ollama Chat Model node:

[
  {
    "messages": [
      "..."
    ],
    "estimatedTokens": 11700,
    "options": {
      "lc": 1,
      "type": "not_implemented",
      "id": [
        "langchain",
        "chat_models",
        "ollama",
        "ChatOllama"
      ]
    }
  }
]
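
If I’m reading this right, that `options` blob isn’t my options at all: `lc: 1, type: "not_implemented"` is what LangChain emits when it serializes an object it can’t represent, so it looks like the node is logging a placeholder for the ChatOllama instance itself, and the temperature and context length I configured never show up in the payload.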

and this is the JSON input to the Ollama Model node:

[
  {
    "messages": [
      "..."
    ],
    "estimatedTokens": 11700,
    "options": {
      "base_url": "http://host.containers.internal:11434",
      "model": "gemma3:27b",
      "temperature": 0.5,
      "num_ctx": 64000
    }
  }
]
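
For comparison, this is roughly how I’d expect my settings to reach the model if the Chat node passed them through. This is only a minimal standalone sketch using the @langchain/ollama package directly (TypeScript); the actual wiring inside n8n may well differ:

import { ChatOllama } from "@langchain/ollama";

// The same options I set on the node, passed straight to the constructor.
const model = new ChatOllama({
  baseUrl: "http://host.containers.internal:11434",
  model: "gemma3:27b",
  temperature: 0.5,
  numCtx: 64000, // maps to Ollama's num_ctx option
});

// Quick check that the options actually take effect.
const res = await model.invoke("Summarize: ...");
console.log(res.content);

Running something like this directly against my Ollama server, the options do apply, which is why I suspect the issue is in how the Chat Model node forwards them.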

Is this intentional? Should we use the Ollama Chat Model node at all?

Could you copy and paste your workflow into a reply here so we can understand the full context of your setup?

Thanks