Hi everyone,
I’m running into a problem on my local n8n installation (version 1.81.4) when building an OpenAI agent with the “Message a Model” node, the only node I’ve found that returns output in JSON format. When I pick an LLM from the OpenAI model list, the node connects to the tools correctly. However, as soon as I supply the model via a dynamic expression, the agent node loses its connection to the tools and I can’t use any of them. I also tried the standard Tools Agent node, but it doesn’t offer an “output to JSON” option.
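To illustrate what I mean by a dynamic expression: instead of selecting a model from the dropdown, I set the Model field to an expression along these lines (the field name `model` here is just an example from my own input data):

```
{{ $json.model }}
```

With a hard-coded model from the list the tools stay connected, but as soon as the field contains an expression like this, the tool connections drop.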
Any suggestions or workarounds would be greatly appreciated. Thanks for your help!
Thank you for your suggestion. I understand that the Tools Agent node can produce structured output when the “Require Specific Output Format” option is enabled and an appropriate output parser is connected.
However, I need free-form output, similar to what the “Message a Model” node provides, because the structure of the LLM’s responses varies from run to run. The fixed schema enforced by the Tools Agent doesn’t accommodate that variability in my use case.
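For example, the Structured Output Parser expects a fixed JSON Schema roughly along these lines (illustrative only; the `answer` and `confidence` fields are made up):

```json
{
  "type": "object",
  "properties": {
    "answer": { "type": "string" },
    "confidence": { "type": "number" }
  }
}
```

Since my responses don’t share a fixed set of fields, any schema like this would be either too strict or too generic to be useful.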
Additionally, I’d rather not add an extra LLM-based output parser just to reshape the response, since that increases both workflow execution time and computational cost.