I am getting a circular error, the AI model is not recognizing my message

Describe the problem/error/question

Hello,

I am getting an error on the LLM nodes in my AI Agent with whichever model I use, Google’s Gemini or OpenAI’s GPT-4.1.

I am sending it a message through Slack, partly hardcoded, and I am sure that it gets it, but it still says that it received a null. This only happens if I use a Simple Memory module.

It always looks like this

What is the error message (if any)?

Cannot read properties of undefined (reading ‘content’)

Please share your workflow

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

The message Cannot read properties of undefined (reading 'content') typically happens when the memory node returns a null or malformed content object that the LLM expects.

1. Check the input format

Make sure that the message being passed to the AI Agent (from Slack or any other source) is a simple string or has this structure:

{
  "content": "your actual message here"
}

If the message is nested or missing the content field, the model can’t find what to read, and you’ll get the undefined error.
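
If it helps, here is a minimal sketch of a Code node you could drop between the Slack trigger and the AI Agent to guarantee that shape. The field names (text, message.text) are assumptions, so adjust them to whatever your Slack trigger actually outputs:

// Code node ("Run Once for All Items") that normalizes the incoming Slack item.
// The field names below are assumptions; adjust them to your trigger's output.
const incoming = $input.first().json;
const text = incoming.text ?? incoming.message?.text ?? '';

if (!text) {
  // Fail loudly here instead of passing undefined on to the model
  throw new Error('No message text found on the incoming Slack item');
}

return [{ json: { content: text } }];

Point the agent’s prompt at the content field this node produces; if the error disappears, the original payload was not in the shape the model expected.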

2. Try bypassing memory temporarily

As a test, disconnect the Simple Memory node from the AI Agent and try running the Agent with a basic message.
If it works without memory, the issue might be in how the memory is storing or recalling the data.

Let me know! Thanks!

Hi,

I have now disconnected the memory tool and put in a fixed message, and I still get the error.

flow:

Error:

Hello,
Could you share the user and system prompts of the AI Agent?

Hi, you can see it in the flow, just hardcoded text, nothing more.

Hello @Josip_Sare,

Forgive me, I am probably missing a step, but what is the trigger for this flow?
I ask because, in the prompt, it seems that a variable giving context to the agent is missing.

A Slack message is the trigger, but I even tried just hardcoding the prompts and it still didn’t work.

I can even see that the AI Agent does have some output in n8n,

but it still throws this error

This happens only when the Memory module is connected, but it is essential for this flow, as it needs to be conversational.

Also, sometimes it works great and sometimes it crashes.

Mmm… very interesting.

When memory returns unexpected or empty structures, some LLMs (especially Gemini) throw errors like this.

Let’s try a test, if you don’t mind: reconnect only one model and only one memory.
Please remember to set a session ID for the Simple Memory tool, as in the sketch below.
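
To make that concrete, here is a minimal sketch (the node and field names are assumptions, not your exact setup): a small Code node before the agent that derives a stable ID per conversation, which the Simple Memory node can then use as its session key.

// Code node that derives a stable session ID per Slack conversation.
// channel / user are assumed field names from the Slack trigger; adjust as needed.
const incoming = $input.first().json;
const sessionId = incoming.channel ?? incoming.user ?? 'default-session';

return [{ json: { ...incoming, sessionId } }];

Then, in the Simple Memory node, set the Session ID to an expression that reads this field (something like {{ $json.sessionId }}), so each Slack conversation keeps its own chat history.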

From your screenshot you can see which inputs and outputs exist, so it would help to know whether you have more than one memory in the flow.
The outputs show that at least one memory returns a valid chatHistory, but there is probably more than one memory, and one of them is empty.

Let me know, we’ll solve it!

Can you try changing the model to Gemini 2.0 Flash and see if the workflow works?