Memory functionality missing from OpenAI Assistant node

What is the purpose of sending the Telegram message directly to Merge 1? Wouldn't that be processed by the route message task?

@Jon the docs for the OpenAI node state: "Using memory with OpenAI assistants

For the Message Assistant operation, you can connect a memory sub-node to preserve and retrieve chat history. The assistant uses this to maintain context across multiple messages. The connected memory sub-node is the source of truth for the assistant’s memory.

To do this, n8n uses OpenAI's threads. n8n creates a new thread each time the node executes, and pre-populates it with messages from the memory sub-node. After the run finishes, n8n updates the memory sub-node with the new messages, and deletes the thread from OpenAI."
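For illustration, here's a rough sketch of the lifecycle those docs describe: new thread, pre-populate from memory, run, sync memory back, delete the thread. The client class is a made-up stand-in, not the real OpenAI SDK, and all the names in it are hypothetical:

```python
# Illustrative stand-in for an Assistants-style threads API.
# Not the real OpenAI SDK; names and behaviour are hypothetical.
class FakeThreadsClient:
    def __init__(self):
        self._threads = {}
        self._next_id = 0

    def create_thread(self):
        self._next_id += 1
        thread_id = f"thread_{self._next_id}"
        self._threads[thread_id] = []
        return thread_id

    def add_message(self, thread_id, role, content):
        self._threads[thread_id].append({"role": role, "content": content})

    def run(self, thread_id):
        # Pretend the assistant replies to the last user message.
        last = self._threads[thread_id][-1]["content"]
        reply = {"role": "assistant", "content": f"echo: {last}"}
        self._threads[thread_id].append(reply)
        return reply

    def delete_thread(self, thread_id):
        del self._threads[thread_id]


def message_assistant(client, memory, user_message):
    """Per-execution flow as the docs describe it: create a thread,
    pre-populate it from the memory sub-node, run, update memory
    with the new turn, then delete the thread."""
    thread_id = client.create_thread()
    for msg in memory:  # pre-populate from the memory sub-node
        client.add_message(thread_id, msg["role"], msg["content"])
    client.add_message(thread_id, "user", user_message)
    reply = client.run(thread_id)
    # Memory stays the source of truth; the thread is discarded.
    memory.append({"role": "user", "content": user_message})
    memory.append(reply)
    client.delete_thread(thread_id)
    return reply["content"]


memory = []
client = FakeThreadsClient()
print(message_assistant(client, memory, "hello"))  # echo: hello
print(len(memory))  # 2: the user turn plus the assistant reply
```

Note that the whole conversation gets re-uploaded on every execution, which is exactly the token-usage complaint raised further down this thread.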

However, there is no sub-connector on the OpenAI node to connect memory to the root, only a connector for tools?

Thanks

Also agree on being able to use an old thread, as we wouldn't need to send all the previous chats each time, which would reduce token usage. It also comes in handy because you can move the prompt into OpenAI Assistants and just send the message for it to run :smiley: It would be an awesome feature if we could select the Thread ID on a run.

This should not be treated as a feature request. This is a defect… Basically, we need to resend the entire history because the node creates a new thread every time.

What we need is basically an option (parameter) to pass the thread ID (if no thread ID is passed, create a new one and return the thread ID).

If this were implemented, we would not need to use memory in the flow at all.
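The proposed behaviour could be sketched like this: reuse the caller's thread when a thread ID is supplied, otherwise create one, and always hand the ID back so the caller can persist it. The client class below is a hypothetical stand-in, not the real OpenAI SDK or the n8n node:

```python
# Hypothetical stand-in client; not the real OpenAI SDK.
class StubClient:
    def __init__(self):
        self.threads = {}
        self._n = 0

    def create_thread(self):
        self._n += 1
        tid = f"thread_{self._n}"
        self.threads[tid] = []
        return tid

    def add_message(self, tid, role, content):
        self.threads[tid].append({"role": role, "content": content})


def run_assistant(client, message, thread_id=None):
    """Proposed option sketched above: reuse an existing thread when
    thread_id is given, otherwise create one and return its id."""
    if thread_id is None or thread_id not in client.threads:
        thread_id = client.create_thread()
    client.add_message(thread_id, "user", message)
    # ... the assistant run on thread_id would happen here ...
    return thread_id  # returned so the caller can reuse it next run


client = StubClient()
tid = run_assistant(client, "first message")              # creates a thread
tid2 = run_assistant(client, "follow-up", thread_id=tid)  # reuses it
print(tid == tid2)               # True: history stays server-side
print(len(client.threads[tid]))  # 2 messages accumulated on one thread
```

With this shape, no history is resent per run; the thread on OpenAI's side carries the context, so no memory sub-node is needed in the flow.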


This implementation would help a lot! We just need a field that preserves the thread_id, as is done in Make.


Please add this