OpenAI node - How to use the new Endpoint?

Is it possible to choose between the Responses API and the Chat Completions API?

OpenAI introduced the Responses API a few months ago, but the node still seems to be using the old Chat Completions API. [https://platform.openai.com/docs/guides/migrate-to-responses]

I tried restricting my API key to the Responses API only, to see whether the call would still pass. It did not.
I also tried modifying the endpoint in the credential, but I guess that is not the place to do it.

Why aren’t these major nodes updated on the same day an API change ships?

The Gemini node is getting outdated too (it still uses PaLM, which is deprecated).

So this is a feature request to update your major nodes.

Is there an easy way to choose the endpoint on the user side?

What’s the point of using n8n if I need to code the API calls myself to stay on the up-to-date versions?

Responses benefits

The Responses API contains several benefits over Chat Completions:

  • Better performance: Using reasoning models, like GPT-5, with Responses will result in better model intelligence when compared to Chat Completions. Our internal evals reveal a 3% improvement in SWE-bench with the same prompt and setup.

  • Agentic by default: The Responses API is an agentic loop, allowing the model to call multiple tools, like web_search, image_generation, file_search, code_interpreter, remote MCP servers, as well as your own custom functions, within the span of one API request.

  • Lower costs: Results in lower costs due to improved cache utilization (40% to 80% improvement when compared to Chat Completions in internal tests).

  • Stateful context: Use store: true to maintain state across turns, preserving reasoning and tool context from one turn to the next.

  • Flexible inputs: Pass a string with input or a list of messages; use instructions for system-level guidance.

  • Encrypted reasoning: Opt out of statefulness while still benefiting from advanced reasoning.

  • Future-proof: Designed to support upcoming models.
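To make the migration concrete, here is a minimal sketch of how the request body changes between the two endpoints, based on the migration guide linked above. The helper function names and the model string are my own placeholders; the field names (`messages`, `instructions`, `input`, `store`) come from the guide.

```python
# Sketch: the payload shape each endpoint expects, side by side.
# Field names follow OpenAI's migrate-to-responses guide; the model
# string and helper names are placeholders, not part of any SDK.

def chat_completions_body(system: str, user: str) -> dict:
    """Old endpoint: POST /v1/chat/completions."""
    return {
        "model": "gpt-4.1",
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
    }

def responses_body(system: str, user: str) -> dict:
    """New endpoint: POST /v1/responses."""
    return {
        "model": "gpt-4.1",
        "instructions": system,  # system guidance moves to `instructions`
        "input": user,           # a plain string is accepted as input
        "store": True,           # keep reasoning/tool context across turns
    }
```

The main difference for a basic system-plus-user workflow is that the role-tagged `messages` list collapses into `instructions` and `input`, so a node migration would mostly be a matter of remapping those two fields.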

Hi @Universus, if you use the AI Agent node with the OpenAI Chat Model attached, you can set the Responses API toggle (I believe it is enabled by default).


Thank you @Wouter_Nigrini for your reply.

I was using the OpenAI node:

Since it is the Agent node that gets the updates, I’ll start using it instead :ok_hand:

Will it behave differently, being set up as an “agent”?

I believe it should work exactly the same as long as you stick to the basic usage of a system and user prompt.

Copy your system prompt over and test it out to compare the results, unless of course you need any of the additional options specific to the OpenAI node.
