I’m currently using xAI (Grok) as my AI agent node, and it was working fine until earlier today.
Now it’s stopped working, and I’m seeing the following error:
Error in sub-node 'xAI Grok Chat Model'
Argument not supported: stream_options
Has anyone else run into a similar issue?
Any idea how I can disable the stream_options parameter in n8n?
Update: I tried calling the same Grok API directly with curl, with streaming turned on. It works, but the same request fails in the 'AI Agent' node, which is strange. Any idea why this is happening?
curl https://api.x.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <xai-api-key>" \
  -d '{
    "messages": [
      {
        "role": "system",
        "content": "You are a test assistant."
      },
      {
        "role": "user",
        "content": "Testing. Just say hi and hello world and nothing else."
      }
    ],
    "model": "grok-2-latest",
    "stream": true,
    "temperature": 0
  }'
The fact that the cURL test works suggests the issue is specific to n8n's implementation rather than to the xAI API itself.
Hey @bartv ,
Thanks for your kind reply.
The issue started out of nowhere on the morning of April 8 (JST), without any updates or changes to the n8n instance.
BTW, for your reference, my environment and ticket info are here.
Thanks again for your attention. @bartv
As mentioned in the thread, testing the Grok API directly using a curl command works fine when the stream option is enabled.
However, it seems that Grok may have changed how these HTTP requests are handled (they may have deprecated the stream_options field).
It also appears that the n8n instance adds a stream_options field to the outgoing HTTP request it sends to the xAI endpoint (most likely in the body payload), and this is what causes the issue. Unfortunately, this is something n8n users currently can't configure through the web interface.
Additional Info for Reference
To further illustrate the issue, I was able to reproduce it by sending a direct curl request with a stream_options field in the payload. As expected, the API returns the following error, which is exactly the same as the error shown in the n8n UI.
{"code":"400","error":"Argument not supported: stream_options"}
You can reproduce this with the following curl command (the stream_options field is what triggers the error):
curl https://api.x.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <xai-api-key>" \
  -d '{
    "messages": [
      {
        "role": "system",
        "content": "You are a test assistant."
      },
      {
        "role": "user",
        "content": "Testing. Just say hi and hello world and nothing else."
      }
    ],
    "model": "grok-2-latest",
    "stream_options": {"include_usage": true},
    "temperature": 0
  }'
As mentioned earlier, even though the issue is reproducible this way, I haven't found a workaround yet, since as far as I know n8n doesn't let users modify the raw request body from either the web interface or the Docker config.
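In the meantime, if you just need Grok responses inside a workflow, one possible stopgap is to bypass the chat-model sub-node and call the xAI endpoint directly (for example from a Code node or a small external script), mirroring the working curl request above. A rough TypeScript sketch, assuming a Node 18+ runtime where fetch is available; the API key env var and prompt are placeholders:

// Direct call to the xAI chat completions endpoint, mirroring the working curl above.
// No stream_options field is sent, so the request is accepted.
// Assumes Node 18+ (global fetch) and an XAI_API_KEY environment variable.
const response = await fetch("https://api.x.ai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.XAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "grok-2-latest",
    messages: [
      { role: "system", content: "You are a test assistant." },
      { role: "user", content: "Testing. Just say hi and hello world and nothing else." },
    ],
    temperature: 0,
    stream: false, // non-streaming keeps parsing simple and avoids stream_options entirely
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);

You obviously lose the AI Agent's tool calling this way, but it at least keeps Grok usable until the node is fixed.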
Interestingly, this issue only affects the AI Agent node, not the Basic LLM Chain node. This suggests that the stream_options field is only being added by the AI Agent node.
Update:
Did a bit more digging: it turns out stream_options is an OpenAI API parameter (docs here).
But it looks like xAI doesn't support it (maybe it was only there for legacy compatibility with OpenAI). It seems n8n injected the parameter into the xAI request (only in the AI Agent node) by mistake, which is probably why the error showed up when xAI fully dropped the field on April 8 (JST).
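For what it's worth, here is where I suspect the field comes from. The AI Agent node appears to be built on LangChain's @langchain/openai client (an assumption on my part), and that client has a streamUsage option which, when enabled (the default in recent versions), adds stream_options: { include_usage: true } to streaming requests. A minimal TypeScript sketch of that behaviour, with the key and model as placeholders; turning streamUsage off makes the field go away:

// Sketch only: assumes the AI Agent node wraps LangChain's ChatOpenAI client.
// With streamUsage enabled (the default in recent @langchain/openai versions),
// streaming calls include stream_options: { include_usage: true } in the request
// body, which xAI now rejects. Setting streamUsage: false omits the field.
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  model: "grok-2-latest",
  apiKey: process.env.XAI_API_KEY,                   // placeholder env var
  configuration: { baseURL: "https://api.x.ai/v1" }, // point the OpenAI client at xAI
  streamUsage: false,                                // avoids stream_options in the payload
});

// With stream_options no longer sent, streaming works against the xAI endpoint.
const stream = await model.stream("Testing. Just say hi and hello world and nothing else.");
for await (const chunk of stream) {
  process.stdout.write(String(chunk.content));
}

If that is really what the node does internally, exposing something like streamUsage in the node settings (or simply disabling it for xAI) would fix this.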
I’ve been having the same problem since April 8th.
I use the OpenAI Chat Model node to authenticate with the Grok API (API key, base URL https://api.x.ai/v1). Yesterday morning, some chats went unanswered, and I found the error Argument not supported: stream_options in the n8n execution logs.
A dedicated xAI Chat Model node for the AI Agent would solve this issue. Grok is the most capable AI I've found for human-like customer service on WhatsApp.
If anyone solves this error, please share the solution here so we can help each other.