O3-deep-research in n8n - is it possible?

Has anybody managed to get the latest o3-deep-research model working within n8n? I'd love to use it, both for web research and MCP integration, but it currently doesn't look like it's natively supported. Has anybody found a workaround to get it running?

If anybody else is trying to do this, it is possible using the HTTP Request node, since the AI nodes don't currently support the /v1/responses endpoint. It just involves passing the OpenAI API key as Bearer authentication and sending the o3-deep-research properties as a JSON body, as shown below. Hopefully it will be supported more natively in the future, but it works now with a bit of effort. As it's a long-running model, you'll also need to run it in background mode and then poll the API in a loop to check the status.

```json
{
  "model": "o3-deep-research",
  "background": true,
  "input": [
    {
      "role": "user",
      "content": [
        {
          "type": "input_text",
          "text": "{{ $('When chat message received').item.json.chatInput }}"
        }
      ]
    }
  ],
  "text": {},
  "reasoning": {
    "summary": "auto"
  },
  "tools": [
    {
      "type": "web_search_preview",
      "user_location": {
        "type": "approximate"
      },
      "search_context_size": "medium"
    }
  ],
  "store": true
}
```
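For anyone wanting to see the background-mode polling loop in one place, here's a minimal sketch of the same flow outside n8n, using only the Python standard library. It assumes the Responses API status values (`queued`, `in_progress`, `completed`, `failed`, `cancelled`, `incomplete`) and the `GET /v1/responses/{id}` endpoint; the function names and timing parameters are just illustrative, so adapt them to your own setup.

```python
import json
import time
import urllib.request

API_URL = "https://api.openai.com/v1/responses"  # Responses API base endpoint


def poll_until_done(fetch_status, interval_s=15.0, max_attempts=120):
    """Call fetch_status() repeatedly until it returns a terminal status.

    fetch_status is any zero-argument callable returning one of the
    Responses API status strings. This mirrors the loop an n8n workflow
    would run with a Wait node between HTTP Request nodes.
    """
    terminal = {"completed", "failed", "cancelled", "incomplete"}
    for _ in range(max_attempts):
        status = fetch_status()
        if status in terminal:
            return status
        time.sleep(interval_s)
    raise TimeoutError("response did not finish within max_attempts")


def fetch_response_status(response_id, api_key):
    # GET /v1/responses/{id} — the same call the polling loop makes in n8n.
    req = urllib.request.Request(
        f"{API_URL}/{response_id}",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["status"]
```

In n8n you'd get the same effect with an HTTP Request node (GET on the response ID), an If node checking the `status` field, and a Wait node looping back until the status is terminal.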
