Hi @ocus
Thanks for posting here and welcome to the community!
So you basically have two tasks that your chatbot needs to do. You can use the AI Agent only for the first task of “getting the required information” from the user, and the actual “parsing” can then be done by an LLM Chain afterwards.
Here’s what this would look like:
A few notes:
- The AI Agent gets simple instructions, so all it has to do is converse with the user until the required information has been obtained. (You don’t have to mention `{{ $json.chatInput }}` in the system message, as this will come in from the prompt parameter automatically.)
- The Chat Memory Manager helps the agent keep track of the “iterations” with the human.
- This is quite a complex task for an AI, so you will probably see differences between models. Ollama models may not perform as strongly as an OpenAI model, for example.
- You could probably connect an email tool node so the agent sends the email directly, but that would skip the proper parsing step. For now it’s better to keep the workflow linear and do the emailing at the end, after the LLM Chain.
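To make the second step concrete, here is a rough sketch (outside n8n) of what the LLM Chain’s “parsing” stage boils down to: hand the finished conversation to the model with a strict-JSON instruction, then parse its reply. The field names (`name`, `email`, `subject`, `body`) and the prompt wording are just illustrative assumptions, not something your workflow has to use:

```python
import json

# Hypothetical parsing prompt for the LLM Chain step.
# The field names below are example assumptions -- adjust to whatever
# information your agent is instructed to collect.
PARSE_PROMPT = """Extract the following fields from the conversation below \
and reply with JSON only: name, email, subject, body.

Conversation:
{transcript}
"""

def build_parse_prompt(transcript: str) -> str:
    """Fill the full chat transcript into the parsing prompt."""
    return PARSE_PROMPT.format(transcript=transcript)

def parse_llm_reply(reply: str) -> dict:
    """Turn the model's JSON reply into a dict.

    Models sometimes wrap JSON in a ```json fence, so strip that
    defensively before parsing.
    """
    cleaned = reply.strip()
    cleaned = cleaned.removeprefix("```json").removesuffix("```")
    return json.loads(cleaned.strip("` \n"))
```

The JSON output of that step is then easy to map onto the fields of an email node at the end of the workflow.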