Resetting conversation context in Telegram bot (n8n + OpenAI) — best approach?

Hey everyone, could you please advise on the best way to reset conversation context in a Telegram bot built with n8n?
We’re using an AI Agent with ChatGPT (OpenAI) + Memory (Window Buffer). The bot maintains a dialogue, but:
• after some time it starts to “drift”
• forgets the logic
• or, conversely, keeps old context when it’s no longer needed
We’d like to implement a proper reset, for example via a “reset” command, so that:
• the entire conversation history is cleared
• a new interaction starts from scratch
Questions:
1. What’s the correct way to implement a reset? Changing the sessionId, or is there a better approach?
2. Is anyone using Window Buffer with ChatGPT in production, or is it better to switch to persistent memory right away?
3. Are there any best practices specifically for Telegram + n8n + OpenAI?
Would really appreciate any advice 🙏


Welcome to the n8n community @alexeyshb
I’d handle reset by changing the memory session ID when the user sends /reset. That starts a fresh conversation cleanly. For production, Window Buffer is fine for short context, but if you need more reliable memory, I’d move to a persistent backend like Postgres or Redis. For Telegram, I’d keep one session per chat, keep the memory window small, and rotate the session on reset.
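The session-rotation idea above can be sketched in plain JavaScript. In n8n this logic would typically sit in a Code node between the Telegram Trigger and the AI Agent (with workflow static data holding the counter); here it is written as a self-contained function so the idea is clear — `resolveSessionId` and the `state` object are illustrative names, not n8n built-ins:

```javascript
// Derive the memory sessionId from the Telegram chat id plus a per-chat
// reset counter. Sending /reset bumps the counter, so every later message
// maps to a brand-new sessionId and the old Window Buffer is never read again.
function resolveSessionId(state, chatId, text) {
  state[chatId] = state[chatId] || 0;
  if (text.trim() === '/reset') {
    state[chatId] += 1; // rotate: old session's history is abandoned
  }
  return `${chatId}-${state[chatId]}`;
}

const state = {};
console.log(resolveSessionId(state, 42, 'hello'));    // "42-0"
console.log(resolveSessionId(state, 42, '/reset'));   // "42-1"
console.log(resolveSessionId(state, 42, 'hi again')); // "42-1"
```

The old history is never deleted this way, only orphaned — which is usually fine for Window Buffer, but with a persistent backend you may also want a cleanup job for abandoned sessions.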


Session rotation is the cleanest approach — I’d create a new sessionId when /reset is triggered. For production Telegram bots, I’d pair Window Buffer with Redis-backed memory to handle context aging automatically. The drift you’re seeing is usually context bloat — keeping your message window to maybe 10-20 exchanges keeps the model grounded.

Hi @alexeyshb, welcome!
The best approach I’ve used by far is to use Supabase as a long-term memory store: you can clear it whenever you want, and it’s more reliable than the built-in options.

Even with a simple conversational AI, let alone a complex one, maintaining conversation history is hard, so I agree that persistent storage (Supabase in my case) is the more reliable and customizable choice. (Make sure to generate a good schema with any capable model so your data table is ready.)
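With Supabase-backed memory, a reset is just deleting that chat's rows. A hedged sketch using the supabase-js query builder — the table name `chat_memory` and column `chat_id` are assumptions, so match them to your own schema (in n8n you could do the same with a Postgres or HTTP Request node):

```javascript
// Clear one chat's stored history. `client` is a supabase-js client
// (or anything exposing the same from().delete().eq() chain).
async function clearChatMemory(client, chatId) {
  const { error } = await client
    .from('chat_memory')   // assumed table name
    .delete()
    .eq('chat_id', chatId); // assumed column name
  if (error) throw new Error(`reset failed: ${error.message}`);
}
```

Unlike session rotation, this actually frees storage, and you can extend the same pattern to age out old rows on a schedule.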