Issue: “Received tool input did not match expected schema” when calling an HTTP tool with ‘Let Model Specify Entire Body’ enabled
Hi everyone,
I’m encountering an issue when using the AI Agent node in n8n with an HTTP Request tool that connects to the exa.ai /search API endpoint.
The error message I receive is:
Problem in node ‘AI Agent’: Received tool input did not match expected schema
What I’ve confirmed:
- The HTTP Request node is configured as a POST request with the correct URL: https://api.exa.ai/search
- I’m using Generic Credential (Header Auth) with a valid API key
- Send Body is enabled and set to Let Model Specify Entire Body
- The JSON input passed from the AI Agent is valid (verified with a LangGraph trace)
LangGraph trace confirms this JSON was properly generated and passed into the tool call, yet n8n throws a schema mismatch error internally.
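For reference, the body the agent generates is a plain Exa /search payload along these lines (the query text here is just one of my test inputs, not anything the API requires):

```json
{
  "query": "history of #HalaMadrid"
}
```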
My Workflow
Troubleshooting attempts so far:
- Verified the JSON structure matches the official Exa.ai API documentation
- LangGraph trace confirms the tool call input is valid JSON
My Questions:
- Does the n8n AI Agent expect a specific schema format for tool body input beyond plain JSON when setting ‘Let Model Specify Entire Body’?
- Is there a hidden requirement or structure validation that the Agent performs?
Any examples or suggestions would be greatly appreciated!
My n8n Setup
- n8n version: 1.82.1
- Database: SQLite (default)
- EXECUTIONS_PROCESS: own, main
- Deployment: Docker
- OS: Ubuntu 22.04
Thank you so much for any help or clarification! 
Could you confirm the JSON it generated works with a real request?
Yes, this JSON works with a real request.
Try building the body using the fields below (with the optional options) and see if that helps, as that would define a schema behind the scenes:
When I set “Value Provided: By Model”, the tool actually works as expected.
I inspected the tool’s input and saw that it looked like this:
```json
{
  "query": {
    "query": "history of #HalaMadrid"
  }
}
```
However, what I expected was simply:
```json
{
  "query": "history of #HalaMadrid"
}
```
So I suspect that when using “Specify Body: Let Model Specify Entire Body”, the expected JSON structure might also be wrapped in a similar way.
To test this, I modified the LLM’s output format accordingly (as shown in the LangGraph trace screenshot below), but I still received the same error:
Received tool input did not match expected schema
Awesome! So that worked? I am glad!
That is odd, though.
But if my response helped, I would appreciate it if you could mark it as the solution.
Hey! In my case, the problem was that the AI Assistant was sending a JSON object but n8n was expecting a string!

n8n expects a string by default after you select **Defined automatically by the model**. In the background, it seems like n8n uses the `$fromAI` function. The 3rd parameter of that function is the expected type for the property. In my case, that type was always string. See the gif below: it changes back to string even if I set it to json.

So, the model sends a JSON object and n8n expects a string. Therefore, I was getting Received tool input did not match expected schema.

The solution was to remove **Defined automatically by the model** and use an expression to set the value of the input as `{{ $fromAI('inputName', 'input description', 'json') }}`. I wasted many hours on this. I hope it helps you save some time.
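To make that concrete, the body field of the HTTP Request tool can be switched to an expression roughly like the following; the name 'searchBody' and the description are placeholders I made up, not anything n8n requires:

```
{{ $fromAI('searchBody', 'Complete JSON body for the Exa /search request', 'json') }}
```

With the third parameter set to 'json', the object the model produces matches the expected type instead of being rejected as a non-string.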

I’m working on a pretty similar workflow to the one shared above by @leiserson: an AI agent with an OpenAI gpt-4.1-mini model. The workflow worked perfectly on n8n version 1.88.
Then I updated to the latest version, 1.90.2, and started getting the error. I switched to a different OpenAI model (gpt-4.1-nano) and the error disappeared.
Hi~ thanks for sharing your experience! I tried running the workflow I shared above again, and in my current n8n version (1.90.1) the ‘Let Model Specify Entire Body’ option has been removed from the HTTP tool node.
Now only the ‘Using JSON’ and ‘Using Fields Below’ options are left. So I tried the ‘Using JSON’ option to let the model construct the entire request body, and so far it seems to be working properly.
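In case it helps anyone else, this is the kind of thing I would try in the ‘Using JSON’ field if you want to keep the structure fixed and only let the model fill in the query; the parameter name and description are purely illustrative:

```json
{
  "query": "{{ $fromAI('query', 'The search query text', 'string') }}"
}
```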
Thank you for figuring this out - I wasted a fair amount of time trying to chase this error!
Could you help me with this in the workflow? How do I configure a single field called data with those parameters so that the incoming data is automatically sorted by the suffix, or how would that work? Could you explain it to me in more detail? Thank you!