Window Buffer Memory for AI Agent lost its chat history
I started a chat with my AI Agent, which is connected to the Claude 3.5 Sonnet LLM (anthropic/claude-3.5-sonnet).
I had a conversation back and forth, then took a break. When I resumed my session after 3 hours, Claude replied that it had no idea what we were talking about. It had lost the context.
I've never experienced this before, and expected the LLM to remember me.
Question: is this a bug in my workflow, or a "feature" of Claude discarding sessions after a certain period of time? Is there anything I can do to make the LLM remember the conversation despite the time gap?
I checked the execution log of the n8n workflow, and I can see that the Window Buffer Memory node has an empty chat history after my 3 hour break, which you can see in the second image below.
The chats (before and after the break) use the same sessionId as input to the AI Agent node.
Thanks for the suggestion. I've replaced the Window Buffer Memory with Postgres. I'll post again if the error persists, but it was probably either a one-time error (all software has bugs), or the context was purged because the Window Buffer Memory is a temporary in-memory buffer, as you mention.
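To illustrate the difference for anyone hitting the same issue: a window buffer keeps the chat history in the n8n process's own memory, so a restart or purge wipes it even though the sessionId is unchanged, while a database-backed memory like Postgres survives. The sketch below is not n8n's actual implementation (the function and table names are made up for illustration, and SQLite stands in for Postgres); it only demonstrates the general principle.

```python
# Sketch (hypothetical names, not n8n's real code): why an in-process
# window buffer loses history while database-backed memory survives.
import sqlite3

# --- Window buffer: history lives only in this process's dict ---
window_buffer = {}  # sessionId -> list of recent messages

def buffer_add(session_id, message, k=5):
    history = window_buffer.setdefault(session_id, [])
    history.append(message)
    del history[:-k]  # keep only the last k messages (the "window")

# --- Database-backed memory (SQLite as a stand-in for Postgres) ---
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE chat_memory (session_id TEXT, message TEXT)")

def db_add(session_id, message):
    db.execute("INSERT INTO chat_memory VALUES (?, ?)",
               (session_id, message))

def db_history(session_id):
    rows = db.execute("SELECT message FROM chat_memory "
                      "WHERE session_id = ?", (session_id,))
    return [r[0] for r in rows]

# Store one message in both memories under the same sessionId.
buffer_add("s1", "hello")
db_add("s1", "hello")

# Simulate the n8n process restarting (or purging its buffer):
# the in-process dict is gone, the database rows are not.
window_buffer = {}
print(window_buffer.get("s1", []))  # context lost
print(db_history("s1"))             # context preserved
```

Running it shows the buffer returning an empty history after the simulated restart while the database still returns the stored message, which matches the empty chat history seen in the execution log above.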