How do I parse data from the LLM conversation into my GraphQL query?

Hi! How do I, for example, take a product name the user mentions in the LLM conversation and feed it into my GraphQL query to get more details on that product? How do I define the variables that need to be filled in the query?

“I am looking for more details on product X”
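For reference, GraphQL itself supports this pattern through named variables declared in the operation signature; a minimal sketch, assuming (hypothetically) that your schema exposes a `product(name: String!)` field:

```graphql
# $product_name is declared once in the operation signature and
# supplied separately in the request's "variables" object.
query ProductDetails($product_name: String!) {
  product(name: $product_name) {
    id
    name
    description
  }
}
```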

Hi there, I think you can use something like this:

You would just need to write a good prompt for the AI agent explaining what it does, what it should look for, etc.

And in the GraphQL Tool, for the query, you can let the AI Agent decide on its own what to fill in.

Hey @fahmiiireza, thanks for thinking along. I have that set up, but I don't yet understand how to ensure that e.g. product_name is a variable in the GraphQL query, and how to feed that field the data from the user conversation. The prompt part and calling the GraphQL API are all fine. What steps do I follow to get the contextual data into the right field?
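To make the wiring concrete: whatever value the agent extracts from the conversation ultimately lands in the `variables` object of the GraphQL request body, alongside the query text. A minimal sketch in Python (the query, field names, and `ProductDetails` operation here are assumptions for illustration, not your actual schema):

```python
import json

def build_graphql_payload(product_name: str) -> str:
    """Build the JSON body for a GraphQL HTTP request, with the
    conversation-sourced value passed as a query variable."""
    query = """
    query ProductDetails($product_name: String!) {
      product(name: $product_name) {
        name
        description
      }
    }
    """
    return json.dumps({
        "query": query,
        # The extracted value goes here; it is never string-interpolated
        # into the query itself, which avoids injection/escaping issues.
        "variables": {"product_name": product_name},
    })

# e.g. the value the agent pulled out of
# "I am looking for more details on product X"
payload = build_graphql_payload("X")
```

The key point is the separation: the query declares `$product_name` once, and the dynamic value travels in `variables`, so the agent only has to produce a single string.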

First of all, since you have two GraphQL tools, you need to explain them in the AI Agent's system prompt: what the tools do, when to call them, and, when calling them, which fields need to be supplied.

And then, inside each of those tools, you can add a very clear description of what each field needs to be filled with; that's the description I mean in the image below.
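Those per-field descriptions effectively form a parameter schema that the agent reads when deciding which piece of the conversation belongs in which field. A hedged sketch of what such a schema might look like, in JSON-Schema style (the exact shape your tool framework uses may differ):

```python
# Hypothetical tool parameter schema: the agent relies on each
# "description" to map conversation data onto the right field.
tool_schema = {
    "type": "object",
    "properties": {
        "product_name": {
            "type": "string",
            "description": (
                "The exact product name the user mentioned in the "
                "conversation, e.g. 'X' from "
                "'I am looking for more details on product X'."
            ),
        },
    },
    "required": ["product_name"],
}
```

The more concrete the description (including an example phrasing), the more reliably the agent fills the field with the right value.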

Have you tried it? If you haven't, try it and let me know! :raised_hands:

Interesting. I gave it descriptions, and then the test pop-up gives me an error back. Is there any documentation on the formatting anywhere? :slight_smile:

Solved it! It's just that the execute step doesn't work, but I was able to get my query executed with dynamic data.
