Add ChatGPT to OpenAI module

The idea is:

Today OpenAI launched a new operation called Chat, and a model called gpt-3.5-turbo.

See post here: OpenAI API

Would love to have this new operation and model available inside a node.

My use case:

This would mean more complex prompts and AI automation tools can be built. Instead of having the AI answer a single prompt, we can now use the chat API endpoint to provide more context and therefore get smarter answers.
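For context, here is a minimal sketch of what such a Chat request looks like when called directly with fetch (the model name, example messages and the OPENAI_API_KEY variable are just illustrative, not what the node would expose):

```typescript
// Minimal sketch: calling the chat completions endpoint directly with fetch.
// Assumes an OPENAI_API_KEY environment variable; the messages are illustrative.
async function askWithContext(): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      // Unlike the single-prompt completions endpoint, chat takes a whole
      // conversation, so earlier turns can be sent back as context.
      messages: [
        { role: "system", content: "You are a helpful assistant inside an n8n workflow." },
        { role: "user", content: "Summarise this support ticket: ..." },
        { role: "assistant", content: "The ticket reports a failed login." },
        { role: "user", content: "Now draft a friendly reply to the customer." },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

askWithContext().then(console.log);
```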

Oh I sooooooooooooo need this! I'm beginning to hold responses within Coda, and will use it as a visible memory store! Excited!!!

Work on this has started in this PR.

It would be good for you to get an n8n plugin added to OpenAI so we could call our webhooks, etc.

ChatGPT plugins (openai.com)

Jenny AI on Twitter: “Here it is! The world’s first demos of ChatGPT Plugins in the wild :partying_face: – screenshots & videos below :thread: Starting off, once you have access you will see new drop downs at the top for Model & Plugins – right now I have access to third party plugins https://t.co/c9dvmVxAEA”

A new version, [email protected], has been released which includes GitHub PR 5596.

I’m getting empty text responses when using ChatGPT, but everything works with davinci.

What can I look for? Can you reproduce it, or is it working on your side?

Thanks :vulcan_salute:

Hey @matenauta

ChatGPT was working for me the last time I checked. What version of n8n are you running, and which model are you trying?

I tried gpt-3.5-turbo and I can’t select it now (latest version of n8n from Docker, updated today).

Hey @matenauta,

Is that for Text Complete or Chat Complete? The gpt-3.5-turbo model should only be available for Chat Complete, but the node will now show the options that your API key has access to.

You can find more information on which models are available in the OpenAI docs here: OpenAI API
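If you want to check outside of n8n which models your key can actually see, here is a quick sketch (assuming an OPENAI_API_KEY environment variable; the filter is only illustrative):

```typescript
// Minimal sketch: list which models an API key can actually see, since the
// node now shows only the options the key has access to.
async function listVisibleModels(): Promise<void> {
  const res = await fetch("https://api.openai.com/v1/models", {
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
  });
  const { data } = await res.json();
  const ids: string[] = data.map((m: { id: string }) => m.id);
  // gpt-3.5-turbo only shows up here if the key has access to it,
  // and it is only usable via the chat completions endpoint.
  console.log(ids.filter((id) => id.startsWith("gpt-")).sort());
}

listVisibleModels();
```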

I’m coming back to add this because I figured it out late at night.

It seems that we need to leave the node screen after switching between Chat and Text and open it again to see the changes reflected (so we can use ChatGPT on Chat this way :sunglasses:).

Thanks!

Hey @matenauta,

It sounds like what you might have been after is the refresh option.

[Screenshot showing the refresh option]

Hello,
Is it possible to change the API endpoint? There are many “proxies” for OpenAI, and it would be nice to use them.
Thank you.

Hey @theRAGEhero,

At the moment there is no option to change the API endpoint… Can you share more information on why an OpenAI “proxy” may be useful?

Hello @Jon,
Changing the API endpoint would allow people to use alternative services, and not only OpenAI, with the same node.

Hey @theRAGEhero,

But what is the benefit of using alternative services that implement the same API?

I suspect that alternative services require different authentication and return results in different formats, so the right way would be to create a separate node for each provider. If you want to use different ones, it’s always possible to drop several nodes on the canvas and add some routing via a Switch node to pick the needed service.
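As a rough illustration of that point (provider details are simplified, only for comparison, and not taken from the n8n nodes themselves):

```typescript
// Rough comparison of why "similar" providers still end up needing separate
// nodes: the authentication headers and response shapes differ.
// Details are simplified and may lag behind the live APIs.
const openaiStyle = {
  url: "https://api.openai.com/v1/chat/completions",
  headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
  // Reply text lives at choices[0].message.content
};

const anthropicStyle = {
  url: "https://api.anthropic.com/v1/messages",
  headers: {
    "x-api-key": process.env.ANTHROPIC_API_KEY ?? "",
    "anthropic-version": "2023-06-01",
  },
  // Reply text lives at content[0].text instead
};

console.log(openaiStyle.url, anthropicStyle.url);
```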

I did take a quick look, and the only things I could find were proxies that let you bypass things like limits or the need to pay.

Yes, exactly.

So if someone wants to use a proxy (like this one), it’s not possible right now.

Hello,
there are many LLMs that can be used with the OpenAI API, but they are hosted somewhere else (or they use a proxy).

E.g. https://openrouter.ai/

Letting people choose their own AI endpoint would allow everyone to use the LLM they prefer (now that OpenAI doesn’t offer the premium, this is even more important).
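As a sketch of that “just change the base URL” idea, assuming the target service really does mirror the OpenAI chat completions API (the base URLs, environment variable names and the OpenRouter model id below are illustrative):

```typescript
// Sketch: the same request logic pointed at different OpenAI-compatible base URLs.
// Only works when the target service really mirrors the chat completions API.
async function chat(baseUrl: string, apiKey: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

// Same function, different base URL, key and model name per provider.
chat("https://api.openai.com/v1", process.env.OPENAI_API_KEY ?? "", "gpt-3.5-turbo", "Hello").then(console.log);
chat("https://openrouter.ai/api/v1", process.env.OPENROUTER_API_KEY ?? "", "openai/gpt-3.5-turbo", "Hello").then(console.log);
```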

I’ve seen that there is a community node, but it wasn’t working when I tried it (n8n-nodes-cheapai - npm).

Hey @theRAGEhero,

The problem is that some of the proxies don’t implement the API in the same way, which causes issues. It would be better to make a node for that service so that it works without needing any changes specific to it.
