Problem using new MCP tools in the Agent node [Error: 422]

Problem

I'm working on a test workflow with an AI Agent and a self-developed MCP server (SSE).
The MCP server follows the fastmcp standard, which is why the response text looks like this:
{
  "type": "text",
  "text": "Material: FG126 Plant: 1310 Loc: 131A Qty: 100 PC"
}

This message is then automatically used in the LLM step, like this:

Because of the quotes, the JSON format becomes invalid. It's strange that the flow even checks this.
I think the LLM should accept any text, but here it doesn't.
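To show what I mean (just a rough sketch, not my actual workflow): when the raw tool result is pasted into the message JSON, its quotes break the payload, while proper escaping would keep it valid.

// Rough sketch of the quoting problem (not my actual workflow code).
// The tool result object contains double quotes, so when its raw text is pasted
// into a JSON message template the payload becomes invalid, while JSON.stringify()
// would escape the quotes correctly.
const toolResult = '{"type": "text", "text": "Material: FG126 Plant: 1310 Loc: 131A Qty: 100 PC"}';

// Broken: the quotes inside toolResult terminate the message string too early
const broken = '{"messages": ["AI: ' + toolResult + '"]}';
try {
  JSON.parse(broken);
} catch (e) {
  console.log('Invalid JSON:', e.message);
}

// Valid: the embedded quotes are escaped
const valid = JSON.stringify({ messages: ['AI: ' + toolResult] });
console.log(JSON.parse(valid).messages[0].endsWith('PC"}')); // true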

Has anyone run into the same issue? Maybe it's because of fastmcp… but so far nothing I've tried works.

Thank you.


Hello SheinRuiYang.
Hope you are well!

Let’s understand this error.

Probable cause of error 422:

It is very likely that the node sending this data to the service (e.g. an HTTP Request node posting the messages array to an API) is configured incorrectly. When mapping the AI node output to the HTTP request body, instead of mapping the plain string from the AI message's text field to the corresponding text field of the outgoing request, it is mapping the internal structure (which contains arrays/nested objects) into a field that expects only a string.
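As a quick illustration of that mismatch (a minimal sketch; the field names are assumptions, not taken from your workflow):

// Minimal sketch of the schema mismatch (field names are assumptions).
// A string field in the request body must receive a string, not the content object.
const aiOutput = { type: 'text', text: 'Material: FG126 Plant: 1310 Loc: 131A Qty: 100 PC' };

const wrongBody = { messages: [aiOutput] };               // structured object where a string is expected -> 422
const rightBody = { messages: [`AI: ${aiOutput.text}`] }; // plain string, matches the expected schema

console.log(JSON.stringify(wrongBody));
console.log(JSON.stringify(rightBody));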

One way to deal with this specific scenario dictated by fastmcp is to use a Function node to parse the string and extract the data.

This JavaScript code below for a Function node will extract the tool data (Material, Plant, Loc, Qty) in the specific format shown in your image:

// Access the output of the previous node, which contains the messages array
const messages = items[0].json.messages;

// Find the AI message
const aiMessage = messages.find(msg => msg.role === 'AI');

let extractedData = {};

if (aiMessage && aiMessage.text) {
  const messageText = aiMessage.text;

  // Search for the string that marks the beginning of the tool data in the specific format
  const toolTextStartIdentifier = '"text": "';
  const startIndex = messageText.indexOf(toolTextStartIdentifier);

  if (startIndex !== -1) {
    const dataStringStart = startIndex + toolTextStartIdentifier.length;
    // Search for the string that marks the end of the tool data (the closing double quote)
    const dataStringEndIdentifier = '"';
    const endIndex = messageText.indexOf(dataStringEndIdentifier, dataStringStart);

    if (endIndex !== -1) {
      // Extract the string containing the data (e.g., "Material: FG126 Plant: 1310 ...")
      const rawDataString = messageText.substring(dataStringStart, endIndex);

      // Parse the rawDataString to extract key-value pairs.
      // Splitting each token on ": " would never produce a pair here, because the
      // space separator is already consumed by the first split. Instead, pair each
      // "Key:" token with the value token(s) that follow it.
      const parts = rawDataString.split(' '); // Split by spaces
      for (let i = 0; i < parts.length - 1; i++) {
        if (parts[i].endsWith(':')) {
          const key = parts[i].slice(0, -1); // "Material:" -> "Material"
          let value = parts[i + 1];
          // Keep a trailing unit (e.g. "PC" after "100") when the next token is not another key
          if (i + 2 < parts.length && !parts[i + 2].endsWith(':')) {
            value += ' ' + parts[i + 2];
          }
          extractedData[key] = value;
        }
      }
    } else {
      console.warn("Could not find the end delimiter for the tool data string.");
    }
  } else {
    console.warn("Could not find the start pattern for the tool data string.");
  }
} else {
  console.warn("AI message or its text field not found in the input item.");
}

// Return the extracted data as the output of the Function node
// If no data was extracted, it will return an empty object or with an error flag if preferred
return [{ json: extractedData }];
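With the sample text from your first post, the Function node should output something like { "Material": "FG126", "Plant": "1310", "Loc": "131A", "Qty": "100 PC" }, and those plain string fields can then be mapped safely into the request body.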

Stay safe! I hope this helps in some way.


It’s about the new MCP client tool.


In the second step I clearly receive the message from the MCP server. But when this message automatically goes back to the agent and is passed to the LLM, the content is concatenated, and because of the quotes the JSON format becomes invalid. So far I don't know how to deal with it, because it relies completely on the AI Agent node.


I simplified my example, but at the moment I cannot stop fastmcp from producing output like this. That is where I'm looking for a solution:
{
  "type": "text",
  "text": "{"formatted_text": "Stock Overview for Material: FG126 for Plant 1310\n------------------------------------------------------------------------------------------\nPlant |SLoc|Batch |Type | Unrestricted | Quality Insp | Blocked | Returns | Unit\n------------------------------------------------------------------------------------------\n1310 |131A| |01 | 100.000| 0.000| 0.000| 0.000|PC\n------------------------------------------------------------------------------------------\nTotal Unrestricted Stock: 100.000 PC"}"
}
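If I could intercept the tool output myself, unwrapping the nested JSON would look roughly like this (just a sketch for a Code node; inside the AI Agent node I have no place to run it):

// Rough sketch: unwrap the nested JSON string that fastmcp puts into the "text" field.
// `toolResult` stands in for the tool output object shown above.
const toolResult = {
  type: 'text',
  text: '{"formatted_text": "Stock Overview for Material: FG126 for Plant 1310\\nTotal Unrestricted Stock: 100.000 PC"}'
};

const inner = JSON.parse(toolResult.text); // the text field itself contains JSON
const plainText = inner.formatted_text;    // plain string, safe to hand to the LLM

return [{ json: { text: plainText } }];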

I don't think it's only my example. I also tried a public MCP server from AMAP and got the same error.


It's definitely a problem inside the AI Agent node.

When I try to use your code to handle the text, the agent doesn't know that it should first run the MCP Client and then the Code Tool, so it somehow doesn't work. :frowning:

This object ({ "type": "text", "text": … }) is being interpreted as an array or complex structure, while the API expects messages[3] to be just a plain string. The format being sent therefore violates the schema the API expects.
So, adjust the messages field to this format:

"messages": [
    "System: You are a helpful assistant.",
    "Human: 查一下FG126在工厂1310下的库存",
    "AI: Tool: Stock Overview for Material: FG126 for Plant 1310\n--------------------------------------------------------------------\nPlant │ SLoc │ Batch │ Type │ Unrestricted │ Quality Insp │ Blocked │ Returns │ Unit\n--------------------------------------------------------------------\n1310 │ 131A│ 01 │ 100.000 │ 0.000 │ 0.000 │ PC\n--------------------------------------------------------------------\nTotal Unrestricted Stock: 100.000 PC"
]

All Tool content has been transformed into a single string.
The classic JSON format content has been simplified by eliminating embedded object formatting.
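As a rough sketch of that transformation in a Code node (the role/text field names are assumptions about the agent output, not a confirmed schema):

// Rough sketch: flatten each message into a "Role: text" string before the HTTP request.
// The role/text field names are assumptions about the agent output shape.
const messages = items[0].json.messages || [];

const flattened = messages.map(msg => {
  // Serialize structured content instead of passing the object through
  const body = typeof msg.text === 'object' ? JSON.stringify(msg.text) : String(msg.text ?? '');
  return `${msg.role}: ${body}`;
});

return [{ json: { messages: flattened } }];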

Error Reasons & Generalized Solution:
API Expected Format: The DeepSeek Chat API expects the messages field to be a list of simple strings, not arrays or nested JSON. If you send something more complex, like an embedded object, it fails.

How the Problem Happened: Most likely, some node in your n8n flow processed the information by creating nested JSON. This can happen, for example, when extracting structured data from other APIs or databases and passing the result directly to the DeepSeek Chat API.

See if this solution solves your problem.

Big hug.

I’m here to help you with whatever you need.


Hello, could you please kindly mark my previous post as the solution (blue box with check mark) so that this ongoing discussion does not distract others who want to find out the answer to the original question. Thanks

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.