MCP-atlassian error: Received tool input did not match expected schema

Describe the problem/error/question

The mcp-atlassian MCP server cannot be used from n8n and throws the error below.
Tool listing works, though.

If I use the MCP server with external debug tools (e.g. the MCP Inspector), it works fine, and I can query Jira tickets successfully.

What is the error message (if any)?

{
  "errorMessage": "Bad request - please check your parameters",
  "errorDetails": {
    "rawErrorMessage": [
      "400 status code (no body)"
    ]
  },
  "n8nDetails": {
    "nodeName": "AI Agent",
    "nodeType": "@n8n/n8n-nodes-langchain.agent",
    "nodeVersion": 2,
    "time": "25.7.2025, 15:48:03",
    "n8nVersion": "1.95.3 (Self Hosted)",
    "binaryDataMode": "default",
    "stackTrace": [
      "NodeOperationError: Bad request - please check your parameters",
      "    at /usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_f35e7d377a7fe4d08dc2766706b5dbff/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/agents/ToolsAgent/V2/execute.ts:127:12",
      "    at Array.forEach (<anonymous>)",
      "    at ExecuteContext.toolsAgentExecute (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_f35e7d377a7fe4d08dc2766706b5dbff/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/agents/ToolsAgent/V2/execute.ts:116:16)",
      "    at processTicksAndRejections (node:internal/process/task_queues:95:5)",
      "    at ExecuteContext.execute (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_f35e7d377a7fe4d08dc2766706b5dbff/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/V2/AgentV2.node.ts:167:10)",
      "    at WorkflowExecute.runNode (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/n8n-core@[email protected][email protected][email protected]_/node_modules/n8n-core/src/execution-engine/workflow-execute.ts:1185:9)",
      "    at /usr/local/lib/node_modules/n8n/node_modules/.pnpm/n8n-core@[email protected][email protected][email protected]_/node_modules/n8n-core/src/execution-engine/workflow-execute.ts:1534:27",
      "    at /usr/local/lib/node_modules/n8n/node_modules/.pnpm/n8n-core@[email protected][email protected][email protected]_/node_modules/n8n-core/src/execution-engine/workflow-execute.ts:2098:11"
    ]
  }
}

Please share your workflow

Information on your n8n setup

  • n8n version: 1.95.3 (Self Hosted)
  • Database (default: SQLite): SQLite
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Ubuntu

Hey @AdaptiveThinking hope all is well, welcome to the community.

It means that the remote MCP server didn’t like the input it received from your client. Apparently, it received “no body”, and it isn’t happy about it.

Thank you.
The problem is that “the client” is the n8n-integrated MCP Client node, which means two things:

  1. I don’t think I can change what it sends.
  2. I cannot see what is sent or returned between n8n and the server.

Is there a way to solve 2, so I can get to the root of the problem?
As I said, when I use the MCP server without n8n, there are no “no body” messages.
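For point 2, one way to see the traffic without any n8n changes is to put a small logging reverse proxy between n8n and the MCP server and point the MCP Client node at the proxy. This is a sketch, not an n8n feature: the upstream URL and port are assumptions, and it only handles plain JSON-over-POST transports (an SSE/streaming MCP transport would not pass through this naive forwarder).

```python
"""Sketch of a logging reverse proxy between n8n and an MCP server.

Point the MCP Client node at http://localhost:8808 instead of the real
server URL; every request and response body is printed to the console.
UPSTREAM is a placeholder -- replace it with your mcp-atlassian endpoint.
"""
import json
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "http://localhost:9000"  # hypothetical mcp-atlassian address


def pretty(raw: bytes) -> str:
    """Pretty-print a JSON body; fall back to raw text if it isn't JSON."""
    try:
        return json.dumps(json.loads(raw), indent=2)
    except ValueError:
        return raw.decode("utf-8", errors="replace")


class LoggingProxy(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print(">>> n8n ->", self.path)
        print(pretty(body))

        req = urllib.request.Request(
            UPSTREAM + self.path,
            data=body,
            headers={"Content-Type": self.headers.get(
                "Content-Type", "application/json")},
        )
        try:
            with urllib.request.urlopen(req) as resp:
                status, payload = resp.status, resp.read()
        except urllib.error.HTTPError as err:  # e.g. the 400 in question
            status, payload = err.code, err.read()
        print("<<< server", status)
        print(pretty(payload) or "(no body)")

        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)


def serve(port: int = 8808) -> None:
    HTTPServer(("localhost", port), LoggingProxy).serve_forever()

# Run with: serve()
```

If the proxy shows n8n sending a well-formed `tools/call` request, the problem is on the server side; if the body is empty or malformed, it points back at the client node.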

I would probably start by updating n8n; 1.95.3 is about 20 releases behind the current version. I haven’t checked, but there is a good chance something has been changed or fixed since then.

So I updated to 1.103.2. What changed is that the same error now shows up in the agent’s OpenAI Model sub-node. If I disable the MCP Client, everything works.

The error is still “400 - no body”.

{
  "errorMessage": "Bad request - please check your parameters",
  "errorDescription": "400 status code (no body)",
  "errorDetails": {
    "rawErrorMessage": [
      "400 status code (no body)"
    ],
    "httpCode": "400"
  },
  "n8nDetails": {
    "nodeName": "OpenAI Model",
    "nodeType": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
    "nodeVersion": 1,
    "time": "7/28/2025, 12:24:56 PM",
    "n8nVersion": "1.103.2 (Self Hosted)",
    "binaryDataMode": "default",
    "stackTrace": [
      "NodeApiError: Bad request - please check your parameters",
      "    at Object.onFailedAttempt (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_5dabdc5b9e72a2f1cc0986522789e88b/node_modules/@n8n/n8n-nodes-langchain/nodes/llms/n8nLlmFailedAttemptHandler.ts:26:21)",
      "    at RetryOperation._fn (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/[email protected]/node_modules/p-retry/index.js:67:20)",
      "    at processTicksAndRejections (node:internal/process/task_queues:105:5)"
    ]
  }
}

Context: there is no OpenAI behind that node, but an OpenAI-compatible system.
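That detail matters: once an MCP tool is attached, n8n adds a `tools` array to the chat completion request, and some OpenAI-compatible backends reject that field with a bare 400. A hedged way to check is to send such a request to the backend directly, outside n8n. The URL, model name, and tool definition below are all made-up placeholders, not the real mcp-atlassian schema.

```python
import json
import urllib.request


def tool_call_payload(model: str) -> dict:
    """Build a minimal OpenAI-style chat completion request that includes
    a tool definition -- roughly the shape sent once a tool is attached.
    The tool itself is a hypothetical example."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": "What is JIRA-123 about?"}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "jira_get_issue",  # hypothetical tool name
                "description": "Fetch a Jira issue by key",
                "parameters": {
                    "type": "object",
                    "properties": {"issue_key": {"type": "string"}},
                    "required": ["issue_key"],
                },
            },
        }],
    }


def probe(url: str, model: str) -> None:
    """POST the payload; a 400 here reproduces the error outside n8n."""
    req = urllib.request.Request(
        url,
        data=json.dumps(tool_call_payload(model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())

# Example (assumed endpoint and model):
# probe("http://localhost:8000/v1/chat/completions", "my-local-model")
```

If the backend returns 400 for this request but succeeds once the `tools` key is removed, the backend simply doesn’t support tool calling, which matches the linked thread below.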

This seems to be the same issue as Locally hosted LLM is not able to call tools - #4 by fwasmeier.

Replying because I got a mail about this thread: it has not been solved yet.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.