Window Buffer Memory

Hello,
I created an OpenAI assistant and uploaded documents to it on the OpenAI website so that it would answer questions based on them. In n8n, I added the OpenAI Assistant node and selected the assistant I had previously created on the OpenAI website. I also connected a Window Buffer Memory node to it.

Question: why does the OpenAI assistant answer the first question using the documents uploaded to it, but answer subsequent questions from the OpenAI model itself rather than from the documents? If Window Buffer Memory is disabled, it works well and answers from the uploaded documents.


It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @walker,

Good question! It could be that once the initial message is stored, the assistant assumes it has enough context and doesn’t re-query the documents for follow-up questions. It likely checks all the messages passed in to decide whether new context is needed, and if not, it skips re-accessing the files.
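Roughly speaking, a window buffer memory just keeps the last few messages and passes them along with each new question, so the assistant sees its own earlier answers as context. Here is a minimal illustrative sketch of that idea (not n8n's actual implementation, just the general pattern):

```typescript
// Illustrative sketch only, not n8n's actual Window Buffer Memory code.
// The memory keeps the last `windowSize` messages and prepends them to
// every new request, so the assistant sees prior turns as context.
type ChatMessage = { role: "user" | "assistant"; content: string };

class WindowBufferMemory {
  private messages: ChatMessage[] = [];

  constructor(private windowSize: number = 10) {}

  // Store a new message and drop anything older than the window.
  add(message: ChatMessage): void {
    this.messages.push(message);
    if (this.messages.length > this.windowSize) {
      this.messages = this.messages.slice(-this.windowSize);
    }
  }

  // What gets sent along with the next question: prior turns are
  // included, so the model may answer from them instead of
  // re-querying the uploaded files.
  load(): ChatMessage[] {
    return [...this.messages];
  }
}
```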

As far as I know, you have to specify in the AI prompt that it must query the documents every time (or however often you want); otherwise it only queries them once and then doesn't do it again.
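For example, you could strengthen the assistant's instructions so it is told to consult the uploaded documents on every turn. You can edit the instructions directly on the OpenAI website, or update them programmatically. The snippet below is only a sketch using the OpenAI Node SDK's beta Assistants API (the wording of the instructions and the assistant ID are placeholders):

```typescript
// Sketch only: updating the assistant's instructions so it always
// searches the uploaded documents, even for follow-up questions.
// Assumes the OpenAI Node SDK (v4) beta Assistants API; editing the
// instructions on the OpenAI website achieves the same thing.
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function enforceDocumentLookup(assistantId: string) {
  await openai.beta.assistants.update(assistantId, {
    instructions:
      "For every user question, including follow-up questions, search the " +
      "uploaded documents before answering. Do not answer from general " +
      "knowledge or from earlier messages in the conversation alone.",
  });
}

// Hypothetical assistant ID for illustration.
enforceDocumentLookup("asst_XXXXXXXX").catch(console.error);
```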