I’m using the OpenAI Chat Model node to connect to an OpenAI-compatible API (specifically, a model that requires a custom `thinking` parameter), like this:
`extra_body={"thinking": {"type": "auto"}}`
When using LangChain, this is handled cleanly by passing the custom parameters via the `extra_body` argument.
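For context, what `extra_body` does is merge the extra fields into the top level of the JSON request body sent to the chat-completions endpoint. A minimal sketch of that merge (the model name and message are illustrative, not from my actual setup):

```python
import json

# Base chat-completions payload (model name is illustrative)
payload = {
    "model": "some-thinking-model",
    "messages": [{"role": "user", "content": "Hello"}],
}

# The extra_body fields get merged into the top level of the request body,
# so the provider sees "thinking" as a sibling of "model" and "messages".
extra_body = {"thinking": {"type": "auto"}}
payload.update(extra_body)

print(json.dumps(payload, indent=2))
```

So whatever the n8n node does, the provider ultimately needs to receive a request body shaped like the output above.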
My question is: What is the correct way to add this custom thinking parameter to the request body when using the n8n OpenAI Chat Model node?