AI agent not accepting data when using Google Gemini chat model

Describe the problem/error/question

Running the LLM chain with a Google Gemini model now returns a result that says:

[
  {
    "text": 
    "I need the content of the two JSON objects to provide a response. Please provide the content of the "Proposal Feedback" and the "External Source Document.""
  }
]

I am passing the JSON objects from previous nodes in the user prompt, and I am also passing instructions to the LLM in a system prompt.
This is happening primarily with the Extract Insights LLM node.

What is the error message (if any)?

There is no error message

Please share your workflow

Share the output returned by the last node

N/A

Information on your n8n setup

  • n8n version: 1.90.2
  • Database (default: SQLite): SQLite
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own
  • Running n8n via (Docker, npm, n8n cloud, desktop app): npm
  • Operating system: macOS Sequoia

Hello @

Pass the context in the expected format, as the error message is requesting:

{
  "Proposal Feedback": "... conteúdo ...",
  "External Source Document": "... outro conteúdo ..."
}

I hope this helps in some way.

Okay, so I am passing it that way in the prompt, but I still get the same response.

After playing with it and researching, I see the data being passed to the LLM shows as [object Object] rather than the actual data within those objects. Maybe I am doing it wrong. Would it be better to send the data as a string rather than an object?

Edit
I decided to use JSON.stringify while passing the data and it worked.

1 Like

Hi @ahmedfaaid

Excellent insight: JSON.stringify serializes the object, so the model receives the actual JSON text instead of "[object Object]".
You did great!
Congrats!

Please mark the post as solved (blue box with check mark) so that this ongoing discussion does not distract others who want to find the answer to the original question. Thanks

2 Likes