Mistral Bedrock models bug: [ERROR: This model doesn't support tool use in streaming mode.]

Describe the problem/error/question

Hi,

When using the AI Agent node with the AWS Bedrock Chat Model (model: Mistral Large), I get an error message.

Note that it works with other Chat Models such as OpenAI or Ollama.

What is the error message (if any)?

[ERROR: This model doesn't support tool use in streaming mode.]

Information on your n8n setup

  • n8n version: 1.77.3
  • Database (default: SQLite): default
  • n8n EXECUTIONS_PROCESS setting (default: own, main): default
  • Running n8n via (Docker, npm, n8n cloud, desktop app): docker
  • Operating system: ubuntu 24

Hi, how and where do you pass the streaming parameter, so I can test it here?

It doesn't seem possible to set this parameter; streaming appears to be enabled by default.
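For anyone hitting this outside n8n: the Bedrock Runtime API exposes both a blocking `Converse` operation and a streaming `ConverseStream` operation, and some models reject requests that combine tool definitions with streaming. One workaround (a sketch, not the n8n implementation; the capability flag is an assumption you'd need to check against the AWS documentation for your specific model) is to fall back to the non-streaming call whenever tools are configured:

```python
# Sketch: pick the Bedrock Runtime operation to call, falling back to
# the blocking "converse" call when the request includes tools but the
# model does not support tool use while streaming.
# Whether a given model streams tool use is an assumption here -- verify
# it in the AWS Bedrock model documentation.

def pick_bedrock_operation(uses_tools: bool, streams_tool_use: bool) -> str:
    """Return the bedrock-runtime operation name to use.

    uses_tools: the request carries a toolConfig
    streams_tool_use: the model supports tool use in streaming mode
    """
    if uses_tools and not streams_tool_use:
        # Streaming would fail with:
        # "This model doesn't support tool use in streaming mode."
        return "converse"
    return "converse_stream"
```

For example, `pick_bedrock_operation(uses_tools=True, streams_tool_use=False)` returns `"converse"`, so an agent with tools attached would use the blocking API and avoid the error, at the cost of losing token-by-token output.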

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.