How to use an OpenAI Tool (a custom MCP in OpenAI) via AI Agent

I'm using the new Tools feature in the OpenAI API to expose my MCP server to OpenAI. It works great in the OpenAI Playground: I save a prompt that my tool is added to, and when I run prompts against it, the LLM calls my tool and includes my dataset in the response.

OpenAI provides code examples for calling saved prompts via their API using the syntax below. My issue is that there doesn't seem to be a way with the n8n OpenAI node to set these values on the request. I have also tried (a) adding the prompt ID to the system prompt and (b) specifying it as part of the main prompt, with no luck; when I trace the MCP server configured in my OpenAI environment (cloud), I don't see any calls to it.

Has anyone worked out how to do this? Here's the example OpenAI provides when you save the prompt (which contains my MCP tool definition as well as the default values like model, temperature, etc.) and how to reference it as part of the call:

curl https://api.openai.com/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "prompt": {
      "id": "pmpt_hiddenID",
      "version": "1"
    }
  }'
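
For comparison, here's a minimal sketch of the same call using the official openai Node SDK; the prompt ID is the placeholder from the curl example above, and the client is assumed to pick up OPENAI_API_KEY from the environment:

import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Call the Responses API with the saved prompt; the prompt itself already
// carries the MCP tool definition, model, temperature, etc.
const response = await openai.responses.create({
  prompt: {
    id: 'pmpt_hiddenID',
    version: '1'
  },
  input: 'your user message here'
});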


So, I was able to get this to work with the following:

import OpenAI from 'openai';

const openai = new OpenAI(); // uses OPENAI_API_KEY from the environment

const response = await openai.responses.create({
  model: 'gpt-4.1',
  tools: [
    {
      // Attach the MCP server directly to the request instead of via a saved prompt
      type: 'mcp',
      server_label: 'YYY',
      server_url: 'https://YYY/mcp',
      allowed_tools: [
        'find_AAA',
        'find_YYY'
      ],
      require_approval: 'never'
    }
  ],
  input: userInput
});
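
If you need to pass the answer downstream, the Responses API result in the Node SDK exposes the final text via a convenience property:

// The model's final text output from the Responses API call above
console.log(response.output_text);

Note that with this approach the MCP tool is defined inline on the request, so the saved prompt (and the values stored in it) isn't used for this call.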