Error: Could Not Parse LLM Output in AI Agent with SQL Integration

Describe the problem/error/question

I am currently using the AI SQL agent in n8n, where I connect to a PostgreSQL database and use OpenAI. When sending requests for data, I receive an output in red that begins with “Could not parse LLM output,” followed by the desired answer. Although the answer itself appears to be correct and useful, the execution is ultimately marked as a failure, preventing me from retrieving the output as expected.

Update - I have tried:

  • Changed the OpenAI model output format from Text to JSON. Didn’t work.
  • Added a Function node after the agent node to parse the output, but it didn’t receive any value.

What is the error message (if any)?

The error message I receive is: “Could not parse LLM output” — although the subsequent output contains the correct response.

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 1.58.2
  • Database (default: SQLite): Postgres
  • n8n EXECUTIONS_PROCESS setting (default: own, main): webhook
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Self-hosted
  • Operating system: Linux

After experimenting a bit more, I’ve found that instead of using the AI SQL agent, I can simply pull all the fields directly from the database and send them to a regular OpenAI node. This workflow has drastically improved my response times, bringing them down to just around two seconds!

In contrast, the SQL agent was taking anywhere from 7 to 20 seconds, and I was still facing that annoying output error. The new approach not only resolves the errors but also feels much more efficient overall. I highly recommend trying it out if you’re experiencing similar issues!
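The faster approach described above can be sketched as a small prompt builder sitting between the Postgres node and a regular OpenAI node: fetch the rows yourself, serialize them, and pass them to the model without any agent tool-calling overhead. The function name, field names, and question text below are illustrative assumptions, not part of the original workflow:

```javascript
// Build a single prompt from database rows so a plain OpenAI node can
// answer questions directly, without the SQL agent's back-and-forth.
function buildPrompt(question, rows) {
  // Serialize each row as one JSON line so the model sees the raw data.
  const data = rows.map(r => JSON.stringify(r)).join('\n');
  return `Answer the question using only this data:\n${data}\n\nQuestion: ${question}`;
}

// In an n8n Function node this might look like:
// return [{ json: {
//   prompt: buildPrompt('Which user signed up last?', items.map(i => i.json)),
// } }];
```

Keeping the data payload small (only the fields the question needs) is what makes this so much faster than the agent, which otherwise spends several round-trips deciding which query to run.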

