Hey! In my case, the problem was that the AI Assistant was sending a JSON object but n8n was expecting a string!
n8n expects a string by default once you select **Defined automatically by the model**. Behind the scenes, n8n appears to use the `$fromAI` function, whose third parameter is the expected type for that property. In my case, that type was always forced to `string`. See the gif below: it reverts to `string` even if I set it to `json`.
So the model sends a JSON object while n8n expects a string, which is why I was getting `Received tool input did not match expected schema`.
The solution was to stop using **Defined automatically by the model** and instead set the input's value with an expression: `{{ $fromAI('inputName', 'input description', 'json') }}`. I wasted many hours on this; I hope it saves you some time.
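To make the type mismatch concrete, here is a small standalone sketch in plain Python (not n8n code; the field name `query` and the sample payloads are made up for illustration). It mimics a tool schema that declares a field as `string` and shows why a payload where the model sent a nested JSON object fails that check, while a plain string passes:

```python
import json

def matches_string_schema(payload: str, field: str) -> bool:
    """Mimic a schema check that requires `field` to be a plain string,
    like a tool parameter declared with type 'string'."""
    value = json.loads(payload)[field]
    return isinstance(value, str)

# Model sends a JSON object for the field -> fails a 'string' schema.
model_output_object = '{"query": {"city": "Berlin"}}'
# Model sends a plain string for the field -> passes.
model_output_string = '{"query": "Berlin"}'

print(matches_string_schema(model_output_object, "query"))  # False
print(matches_string_schema(model_output_string, "query"))  # True
```

This is the same shape of failure as the n8n error: the declared type says `string`, the model produces an object, and the validation rejects the tool input.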
I'm working on a workflow very similar to the one shared above by @leiserson: an AI agent with an OpenAI gpt-4.1-mini model. The workflow worked perfectly on n8n version 1.88.
Then I updated to the latest version, 1.90.2, and started getting the error. I switched to a different OpenAI model (gpt-4.1-nano) and the error disappeared.
Hi~ thanks for sharing your experience! I tried running the workflow I shared above again, and in my current n8n version (1.90.1) the ‘Let Model Specify Entire Body’ option has already been removed from the HTTP tool node.
Now only the ‘Using JSON’ and ‘Using Fields Below’ options remain. So I tried the ‘Using JSON’ option to let the model construct the entire request body, and so far it seems to be working properly.
Could you help me with this in the flow module? How do I configure a single field called data with those parameters, and does the incoming data get sorted automatically by the suffix, or how does that work? Could you explain it in more detail? Thank you!