Help starting out with OpenAI Functions Agent

Hi there!

I am trying to use the OpenAI Functions Agent. There is very little documentation about how to actually implement it.

Tooltip:
Utilizes OpenAI’s Function Calling feature to select the appropriate tool and arguments for execution

Documentation:

OpenAI Functions Agent parameters

The OpenAI Functions Agent node allows you to use an OpenAI functions model. These are models that detect when a function should be called and respond with the inputs that should be passed to the function.

You can use this agent with the On new manual Chat Message node. Attach a memory sub-node so that users can have an ongoing conversation with multiple queries. Memory doesn’t persist between sessions.

You must use the OpenAI Chat Model with this agent.

This is really the extent of the documentation that I could find. I am basically at step 1 of this, which is just trying to figure out what the input and output of the Agent are. I have tried adding the “on new manual chat message” node as a trigger, connecting that with the “openai functions agent”, assigning it a model and memory, and then asking various questions in the chat input. For example: “What is the weather in California today?” Thus far, the output has been:

Action: loadMemoryVariables

AI: I’m sorry for the confusion. As an AI, I don’t have the ability to check real-time data like weather. However, you can check the current weather for Brownsville, TX by using a trusted weather website or application, such as the National Weather Service, Weather.com, or a weather app on your smartphone.

Here is a link to the documentation.

For context, my goal/hope is that I can use the “functions agent” to allow my n8n workflow to interact with OpenAI plugins in some way.

I would appreciate any guidance at all on how to get started with this function. Thank you in advance!

Hey @wintermute,

Welcome to the community :cake:

It looks like you are pretty much there. The agent will process your prompt and use the model, memory, and optional tools to work out how to answer for you. In this case I used the SerpAPI tool and followed the example given in LangChain’s documentation, roughly like the sketch below.
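For reference, this is roughly what that setup looks like in plain LangChain JS (a minimal sketch based on their docs, not something you paste into n8n; the agent node wires the model, memory and tools together for you):

```js
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { SerpAPI } from "langchain/tools";

// The OpenAI Functions agent needs a chat model that supports function calling.
const model = new ChatOpenAI({ modelName: "gpt-3.5-turbo-0613", temperature: 0 });

// SerpAPI is the web-search tool used in the LangChain example.
const tools = [new SerpAPI(process.env.SERPAPI_API_KEY)];

const executor = await initializeAgentExecutorWithOptions(tools, model, {
  agentType: "openai-functions",
  verbose: true,
});

const result = await executor.call({
  input: "What is the weather in California today?",
});
console.log(result.output);
```

In n8n you get the same shape by attaching the OpenAI Chat Model, a memory sub-node and the SerpAPI tool to the agent node.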

I don’t know if LangChain has the ability to use existing OpenAI tools/plugins, but with this method we are using the function-calling option, which you can find more information about below. Hopefully this gets you started on your journey.

OpenAI Functions: OpenAI Platform
LangChain Functions: OpenAI functions | 🦜️🔗 Langchain
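To make the function-calling part a bit more concrete, this is roughly the request/response shape the agent builds for you under the hood (the function name and schema here are made up for illustration):

```js
// What the agent (roughly) sends to the Chat Completions endpoint: the user's
// message plus JSON-schema descriptions of the tools it is allowed to call.
const requestBody = {
  model: "gpt-3.5-turbo-0613",
  messages: [{ role: "user", content: "What is the weather in California today?" }],
  functions: [
    {
      name: "search", // hypothetical tool name
      description: "Search the web for current information",
      parameters: {
        type: "object",
        properties: {
          query: { type: "string", description: "The search query" },
        },
        required: ["query"],
      },
    },
  ],
};

// If the model decides a tool is needed, the assistant message comes back with a
// function_call instead of plain text, and the agent then runs the matching tool:
const exampleAssistantMessage = {
  role: "assistant",
  content: null,
  function_call: {
    name: "search",
    arguments: '{"query":"weather in California today"}',
  },
};
```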


@Jon , I really appreciate the quick response. I think this will be a very powerful combo for your community once we can figure out how to string it all together.

Your example with SerpAPI makes sense, but without seeing what is under the hood in that tool, I don’t really know where I would start in n8n to create similar functionality. My guess is that I would replace SerpAPI in your example with the Code Tool and then add some custom code, but again I don’t currently have a framework for understanding what code to put there.

Regarding your point about not knowing if LangChain has the ability to use existing OpenAI tools, I followed the links that you provided (thanks!) and found this reference in the LangChain documentation, which suggests to me that they have full support for arbitrary plugins (as long as they don’t require authentication).

ChatGPT Plugins

This example shows how to use ChatGPT Plugins within LangChain abstractions.

Note 1: This currently only works for plugins with no auth.

Note 2: There are almost certainly other ways to do this, this is just a first pass. If you have better ideas, please open a PR!

They do provide a couple of examples at the link above. If you could give me a tiny bit of an additional push in the right direction on how to adapt those instructions to the Tool part of the n8n Functions Agent, I think I should be able to make a fair amount of headway myself, and I’m happy to post back further instructions here for other users.

Thanks a ton for your help! Really excited to get this working!

Hey @wintermute,

That is a good find, although I am not sure how you managed to jump from the JS docs to the Python docs.

We use the JS implementation of LangChain, which also has a ChatGPT Plugins tool: ChatGPT Plugins | 🦜️🔗 Langchain. It doesn’t look like we have implemented that yet, so it won’t be possible to use the existing ChatGPT plugins; instead you would need to use your own functions through either the code option or by calling other workflows.
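For completeness, the LangChain JS example for that tool looks roughly like this (a sketch of their documented usage, which the n8n nodes don’t expose yet):

```js
import { ChatOpenAI } from "langchain/chat_models/openai";
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { AIPluginTool, RequestsGetTool, RequestsPostTool } from "langchain/tools";

// AIPluginTool reads a plugin's ai-plugin.json manifest; the Requests tools let the
// agent actually call the endpoints the plugin describes (no-auth plugins only).
const tools = [
  new RequestsGetTool(),
  new RequestsPostTool(),
  await AIPluginTool.fromPluginUrl(
    "https://www.klarna.com/.well-known/ai-plugin.json"
  ),
];

const executor = await initializeAgentExecutorWithOptions(
  tools,
  new ChatOpenAI({ temperature: 0 }),
  { agentType: "chat-zero-shot-react-description", verbose: true }
);

const result = await executor.call({
  input: "What t-shirts are available on Klarna?",
});
console.log(result.output);
```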

We do have an example of how to use the Code Tool here: [AI/LangChain] Conversational Agent with custom code tool | n8n workflow template, along with a few other examples which can be found in the Templates section of your n8n instance in the “Advanced AI” collection.
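As a rough idea of what goes inside the Code Tool (based on that template — as far as I can tell the agent’s input arrives as `query` and the string you return is passed back to the agent; the weather data here is obviously made up):

```js
// A stand-in for whatever your own "plugin-like" logic would be.
const city = String(query).trim().toLowerCase();

// Hypothetical hard-coded data in place of a real weather API call.
const fakeWeather = {
  "california": "Sunny, 24°C",
  "brownsville, tx": "Humid, 31°C",
};

if (fakeWeather[city]) {
  return fakeWeather[city];
}
return `No weather data available for "${query}"`;
```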

I will get an internal request created to look into adding support for existing GPT tools following the guidelines in the LangChain docs.

@Jon Thanks a lot. Not sure how I ended up in the Python docs either :smiley:

I took a shot at just adding the example code from the JS link to a Code Tool to see what would happen, and got this error.

I took a look at the example you posted and the templates you mentioned, but did not see anything that gave me insight into how to deal with external dependencies. Any additional guidance would be much appreciated.

Thank you again for your time!

Hey @wintermute,

The Code Tool can be used as in the example I linked above. You probably won’t be able to use it to run GPT plugins, as the code you are trying to load there is already handled by the agent.

Currently you would need to implement your own version of what the plugins are doing, either in code or with other workflows.

@Jon Thanks for the reply. Do you know of any examples/templates/instructions that I could use for guidance on how I can see “what the plugins are doing” currently, so that I can then replicate that in other workflows? Thank you!

Hey @wintermute,

I think that depends on whether the plugins have their code available or not. I suspect a lot of them might be closed source, so there would need to be some guesswork.
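For what it’s worth, ChatGPT plugins generally have to publish a manifest at `/.well-known/ai-plugin.json` that points to an OpenAPI spec, so you can usually at least see which endpoints a plugin calls even without its source. A quick sketch (needs Node 18+ for built-in fetch; the Klarna URL is the one from the LangChain example above):

```js
const manifestUrl = "https://www.klarna.com/.well-known/ai-plugin.json";

const manifest = await (await fetch(manifestUrl)).json();

console.log(manifest.description_for_model); // what the plugin tells the model it can do
console.log(manifest.api.url);               // OpenAPI spec listing the plugin's endpoints
```

The implementation behind those endpoints is still the plugin author’s, so rebuilding the behaviour in a workflow would still involve some guesswork.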

@Jon Got it. I misunderstood and thought you were referring to wrapping the LangChain-provided JS into a module for n8n. So, other than rebuilding ChatGPT plugins from scratch, is there any path that I can pursue to use them in n8n, or is this a dead end?

Hey @wintermute,

At the moment, as we don’t support them, I can’t think of another way to do it short of making a custom node to handle it, which may be more effort than making a workflow to mimic the plugin.

This may change in the future though if we add the options needed for plugins.
