Simple memory node loses context between messages

The AI agent seems to lose its memory suddenly from time to time. If I then ping it several more times, asking it to think harder, it will sometimes suddenly remember what was discussed before and continue the conversation.

For example I have a prompt with the following procedures:

  1. Ask “Please enter the entry to look for.” The format is always in xx/yy. Examples are 1/2, 3/4.
  2. Check if the input is in the format xx/yy. If not, say “Format is incorrect.” and go back to step 1. Else, store 1 = xx and 2 = yy, say “Acknowledged 1/2 is received.”, and proceed to step 3.
  3. Use the “file lookup” tool to see if 1/2 is in the file. If not, say “I cannot find the information.” and go back to step 1. Else return the information from the file, which includes some further mandatory information the user needs to provide.
  4. Validate user input. If all mandatory information collected, then proceed to another tool to create the entry.
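
For reference, the step-2 format check boils down to a simple regex test. A minimal JavaScript sketch (the function name and return shape are illustrative, not part of the workflow):

```javascript
// Step-2 validation sketch: accept only "xx/yy" where both parts are numbers.
function parseEntry(input) {
  const match = /^(\d+)\/(\d+)$/.exec(input.trim());
  if (!match) {
    // Mirrors the prompt's "Format is incorrect." branch (back to step 1).
    return { ok: false };
  }
  // Store xx and yy for the step-3 file lookup.
  return { ok: true, xx: match[1], yy: match[2] };
}

console.log(parseEntry("123"));  // { ok: false }
console.log(parseEntry("1/2"));  // { ok: true, xx: "1", yy: "2" }
```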

The problem is that the first time I built the workflow, it worked perfectly. But afterwards, basically 9 out of 10 times, the Simple Memory node suddenly comes up short and forgets everything discussed.

For example:
Human: hi
AI: Please enter the entry to look for.
Human: 123
AI: Format is incorrect. Please enter the entry to look for.
Human: 1/2
AI: Acknowledged 1 is xx and 2 is yy. We found the following mandatory information. Please fill in:
Address:
Contact Number:
Birthday:
Human: Address is 135 road, London. Contact Number: 0112345. Birthday is Jan 1, 1970.
AI: Format is incorrect. Please enter the entry to look for.

← Here is the problem: the AI suddenly loses all prior memory and goes back to step 1. I can see in the Simple Memory node that the chat history is gone. But sometimes, if I keep inputting things like “you think again?” or “think harder”, the past conversations in the session pop up again and the agent proceeds normally. Exactly when it loses memory varies every time, but it always loses memory at some point after the first time I test the flow.

From what I’ve read, there could be a caching issue that is particularly serious with the memory node: n8n will skip the memory node if it detects that no change has been made. I am not sure how to make sure the memory node MUST be called every time, so that memory won’t be lost within the same session.

The Simple Memory node’s context window length is already set to 6000, which I think is high enough.

Thanks!

this is a known caching issue with the simple memory node. n8n skips execution if it detects no incoming changes — try forcing it to run every cycle by adding a timestamp or counter to the memory output. also make sure your workflow triggers the memory node regardless of upstream conditions, not just when data changes. if that doesn’t work, might be worth using a persistent storage layer (postgres or http endpoint) for critical context — much more reliable for stateful ai conversations.
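
to see why the counter idea works, here’s a minimal sketch you can run outside n8n (in a real workflow the state would need to persist somewhere, e.g. in $getWorkflowStaticData('global') inside a Code node — stampItems is just an illustrative name):

```javascript
// Simulates the counter trick: every run stamps the items with a new
// sequence number, so the data entering the memory node is never
// byte-for-byte identical and there is nothing for n8n to cache-skip.
// (In n8n the state would persist via $getWorkflowStaticData('global').)
const state = { seq: 0 };

function stampItems(items) {
  state.seq += 1;
  return items.map(item => ({ ...item, seq: state.seq }));
}

console.log(stampItems([{ chatInput: "hi" }]));   // [{ chatInput: "hi", seq: 1 }]
console.log(stampItems([{ chatInput: "1/2" }]));  // [{ chatInput: "1/2", seq: 2 }]
```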

Hello Benjamin, thanks a lot for your input. Sorry, I am new to n8n - may I know how we add some form of counter to the memory output…? Do I loop around the AI agent, or do I write something in the AI agent’s prompt asking it to increase a counter by 1 every time the chat is triggered?

Thanks.

easiest fix: add an Edit Fields (Set) node right before your memory node, and create a field like ts with the expression {{ Date.now() }}. this injects a unique timestamp on every run, so n8n always sees new incoming data and can’t cache-skip the memory node. no looping needed — just that one small extra node in the chain before memory. the prompt-based counter idea would also work in theory but it’s less reliable since the AI itself might forget to increment it.
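
if you’d rather use a Code node than Edit Fields, the equivalent is roughly this (a sketch — addTimestamp is just an illustrative name):

```javascript
// Equivalent of the Edit Fields fix: attach a fresh timestamp so every
// execution carries a field value n8n has never seen before.
function addTimestamp(item) {
  return { ...item, ts: Date.now() };
}

const stamped = addTimestamp({ chatInput: "1/2" });
console.log(typeof stamped.ts);  // "number"
```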

Hello, I tried to add the timestamp, retrieving it from the system variable in the previous node.

The issue still persists though… Do I have to tell the AI Agent to retrieve the “Timestamp” variable at every chat input received, or is there any other setup I am missing…? Thanks!

no, the AI agent doesn’t need to reference the timestamp in its prompt at all — it’s purely a mechanism to force n8n to see “new” data each run so it won’t skip the memory node.

the important thing is that the Set node (with ts: {{ Date.now() }}) is connected directly in your main execution path right before the AI Agent node. if it’s in a branch that doesn’t feed into the agent, it won’t help.

quick way to check: look at your execution log and see if the Simple Memory sub-node shows as “executed” or “cached/skipped”. if it still shows as skipped, the timestamp isn’t reaching the right point in the flow.

if you’re still stuck after that, switching to Postgres Chat Memory is the most reliable fix — Simple Memory has known edge cases with n8n’s caching that can be hard to fully work around.

Hi @Jenniclf2, use Supabase for persistent memory and reliable retrieval of previous data.