LangChain – how to pass parameters from an AI Agent to a workflow tool?

Hi everyone. I have a question regarding the workflow tool in the LangChain Chat Agent. I’m building a workflow tool that loads HTML pages for further analysis. I ran into a few stumbling blocks and found several tricks for optimizing the page before sending it to the LLM. What I don’t understand right now is how to pass parameters into the workflow tool. Please take a look at two examples.

  1. First, I ask a question like “What are the latest blog posts on blog.n8n.io?”

In the incoming node of the workflow tool I see something like:

[{"query": 
"https://blog.n8n.io/"}]

In this case the Chat Agent correctly passes the query with the URL. Not as a proper url parameter, but fine. How can I make it pass JSON with specified keys?
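
(For illustration only, a minimal sketch of a Code node placed right after the sub-workflow’s trigger to normalize this plain-string input. The query field matches the output shown above; the url key is just an assumed name for the downstream nodes.)

```javascript
// Code node ("Run Once for All Items") – assumes the workflow tool delivers
// the agent's input as a plain string in the `query` field.
const query = ($input.first().json.query || '').trim();

// Expose a predictable key for the downstream HTTP Request / analysis nodes,
// instead of relying on how the agent happened to format its input.
return [{ json: { url: query } }];
```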

  2. Second, when I ask it to summarize an article, the Agent sends an incorrect URL, but that’s okay. This time it sends it as JSON with a “url” key, even though I didn’t specifically define one.
```json
{
  "action": "HTTP_Request_Tool",
  "action_input": {
    "url": "https://blog.n8n.io/11-telegram-bots-to-transform-your-workflows-424d6eb6539c"
  }
}
```

For some reason this parameter call never reaches the workflow tool (maybe because it starts with the ```json marker?). The first node receives just an empty value:

[{}]
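
(Also for illustration: if any raw text does reach the sub-workflow, a Code node could strip the markdown fences before parsing. This is only a sketch that assumes the raw agent output arrives in the query field; it cannot help if the tool really receives an empty item.)

```javascript
// Code node sketch – tolerate agent output wrapped in ```json ... ``` fences.
let raw = ($input.first().json.query || '').trim();

// Remove a leading ```json (or bare ```) fence and a trailing ``` fence.
raw = raw.replace(/^```(?:json)?\s*/i, '').replace(/\s*```$/, '');

let parsed;
try {
  parsed = JSON.parse(raw);   // e.g. { "action": ..., "action_input": { "url": ... } }
} catch (err) {
  parsed = { url: raw };      // not JSON – treat the whole string as a URL
}

return [{ json: parsed }];
```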

I’m pretty sure I’m missing something here, but I can’t figure out where to look or which examples to check.

Once again, these tool parameters are generated by GPT; they’re not pre-defined tool parameters.

  • n8n version: 1.18
  • Database (default: SQLite): Postgres
  • n8n EXECUTIONS_PROCESS setting (default: own, main): main
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
  • Operating system: Ubuntu

Hi @Ed_P, I am sorry you’re having trouble. I am not quite sure I can follow the description.

Can you perhaps share an example workflow where you end up with your empty dataset [{}]? Thank you!

Hi there,
I have a similar problem, so I’ll share my own workflow as an example.
In the logs I can see that the model puts the user’s email as the action_input when I send a test chat message such as “please arrange a meeting, my email is [email protected]”, but then in the tool the input is empty.
Here is the example workflow:

Thanks for your comment, Clement.

As far as I can see, you pass an empty value {{ }} via the email parameter.
Here’s an example of a Telegram bot with LangChain that utilises the same approach:

In the workflow values you need to pass a valid value (a hard-coded string or an expression):
[screenshot]

How I overcame the problem in my example: I ask GPT to pass a query string instead of JSON; that way I can parse the query string and extract the parameters I need. Please take a look at the final example: AI agent that can scrape webpages | n8n workflow template
[screenshot]
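
A rough sketch of that parsing step (assuming the agent’s query string lands in the query field as in the earlier examples; the url and action parameter names are placeholders for whatever keys you tell GPT to include):

```javascript
// Code node sketch – the agent is asked to send its tool input as a query
// string such as "url=https://blog.n8n.io/&action=summarize".
const raw = ($input.first().json.query || '').trim();

const params = {};
for (const pair of raw.split('&')) {
  const [key, ...rest] = pair.split('=');
  if (key) params[key.trim()] = decodeURIComponent(rest.join('='));
}

// Hand the extracted parameters to the next nodes under named keys.
return [{ json: { url: params.url || '', action: params.action || '' } }];
```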

Hi, thanks a lot for your answer and the example; it works perfectly for passing multiple parameters.
My only challenge now is that if the model doesn’t have the information it needs to send to the tool in the query, it makes it up instead of asking the human, as I asked it to do. If you have any suggestions on how to deal with that, I’d be very grateful.

I’d try this:

  1. Add a system prompt instructing the model to ask the user when information is missing, AND / OR
  2. Ask it to fill in a default value when some information is missing.

Once the tool receives default values, it can fall back with an error, e.g. “missing this and that, please ask the user to provide XYZ information”. The AI agent will receive the error message from the tool and hopefully ask the user about it 🙂
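
As a sketch of that fallback (the email field, the “unknown” default and the response output key are only assumptions for the meeting-booking example, not anything n8n prescribes), the final node of the sub-workflow could look roughly like this:

```javascript
// Code node sketch – turn a missing/default value into an error message the
// agent can act on.
const email = ($input.first().json.email || '').trim();

if (!email || email === 'unknown') {
  // The tool returns this text to the AI agent, which can then ask the user.
  return [{ json: { response: 'Error: missing email address. Please ask the user for their email before booking the meeting.' } }];
}

// Otherwise continue with the real work (here just a confirmation message).
return [{ json: { response: `Meeting request noted for ${email}.` } }];
```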

Also, reducing the model temperature might help.
