Feature Request: Thread Scoping for User Conversation Isolation in the OpenAI Assistant Agent (LangChain)
Description
I propose adding a feature that lets users control which thread the OpenAI Assistant Agent (LangChain) uses. Currently only a single thread appears to be active, and there is no way to switch threads or start separate sessions. This capability is essential for isolating and managing individual user conversations effectively.
Use Case
Thread Scoping for User Conversation Isolation
Users should be able to give each conversation or user session its own thread, creating a clear separation between them. This would ensure that interactions with one user do not impact or mix with interactions from another user, enhancing privacy and conversation organization.
Benefits
User Privacy: A dedicated thread per user guarantees that each user’s conversation remains private and isolated from others.
Efficient Management: Users and developers can more efficiently manage and track individual conversations when each one has its own thread.
Clarity and Structure: Thread separation provides a clear and structured way to organize and archive interactions with different users.
Implementation
The feature could be implemented as a node-level control that lets users initiate and manage separate threads or sessions by ID. This control would ensure that interactions within one thread do not influence or overlap with interactions in another; a rough sketch of the idea follows.
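A minimal sketch of what per-session isolation could look like behind such a control, assuming the official `openai` Node SDK. The names (`sessionId`, `getThreadForSession`) are illustrative, not part of any existing node:

```ts
import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// One OpenAI thread per user session, so conversation history never mixes
const threadBySession = new Map<string, string>(); // sessionId -> threadId

async function getThreadForSession(sessionId: string): Promise<string> {
  const existing = threadBySession.get(sessionId);
  if (existing) return existing;
  const thread = await openai.beta.threads.create();
  threadBySession.set(sessionId, thread.id);
  return thread.id;
}
```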
Conclusion
Adding the ability to scope threads for user conversation isolation is a vital enhancement for the OpenAI Assistant Agent (LangChain). This feature aligns with the need for privacy, organization, and efficient management of multiple user interactions.
With the last update we are able to set the assistant ID, but thread ID support was left out.
I’ve checked the code a bit and it should not be that hard to add this to the flow; the hard part, if someone patches the code for themselves, is keeping that change in sync with subsequent n8n updates.
I have created my own flow to use the assistant correctly with a thread ID, but things get complicated because instead of one node I have to use many others.
We’ve been waiting for this for many months; without it the assistant node is pretty much useless.
I think this implementation wouldn’t be difficult, because even a simple text field for entering the thread_id would already solve the problem. Storing and retrieving the thread_id outside the Assistant node is not difficult; we can solve that part externally (see the sketch below), but having the thread_id input is essential.
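For example, in an n8n Code node the mapping from a chat/user identifier to its thread_id could live in workflow static data. This is only a sketch: the `sessionId` field is an assumption about what your trigger provides, and static data only persists on active (non-manual) executions:

```ts
// n8n Code node ("Run Once for All Items"), JavaScript-compatible sketch
const staticData = $getWorkflowStaticData('global');
staticData.threads = staticData.threads || {};

// Assumed: the incoming item carries a per-user/chat identifier
const sessionId = $input.first().json.sessionId;
const threadId = staticData.threads[sessionId] || null;

// A later node creates a thread when threadId is null and should write the
// new id back into staticData.threads[sessionId]
return [{ json: { sessionId, threadId } }];
```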
Any news on this topic? I would really love an easy way to manage my assistants’ thread_id so I can run multiple chats through the same workflow, and the HTTP Request nodes aren’t as efficient at managing it as I would like. Does someone know a nice way to manage them?
I’m not experienced with Node, but I’ve been through the code and it seems like a fairly easy fix: we just need to modify the options in the node to show a Thread ID field, then grab that string and send it with the request instead of the one the code creates on its own (a rough sketch of the change follows). If someone knows how to work with Node properly, I can explain the changes I think are needed.
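For reference, a hedged sketch of the kind of change meant here. The property and variable names are illustrative and not taken from the node’s actual source:

```ts
import type { INodeProperties } from 'n8n-workflow';

// Extra option exposed in the node UI
const threadIdProperty: INodeProperties = {
  displayName: 'Thread ID',
  name: 'threadId',
  type: 'string',
  default: '',
  description: 'Existing OpenAI thread to continue; leave empty to create a new thread',
};

// ...and inside execute(), reuse the supplied thread instead of always creating one:
// const threadId = this.getNodeParameter('threadId', itemIndex, '') as string;
// const thread = threadId ? { id: threadId } : await openai.beta.threads.create();
```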
Looking at how the OpenAI API works, we have to manually manage threads, create messages on the thread, create and poll a run, and finally fetch the run output. If you use functions with your assistant, you also have to handle tool_calls yourself while polling the OpenAI API.
It’s doable, but not that easy, and the OpenAI SDK is needed. So if you use the cloud version you’ll have to implement it by hand using HTTP requests.
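A minimal sketch of that manual flow, assuming the official `openai` v4 Node SDK and an existing assistant; tool_calls handling and error handling are omitted:

```ts
import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function askAssistant(assistantId: string, threadId: string | null, text: string) {
  // 1. Reuse the caller's thread, or create a new one
  const thread = threadId
    ? await openai.beta.threads.retrieve(threadId)
    : await openai.beta.threads.create();

  // 2. Add the user message to the thread
  await openai.beta.threads.messages.create(thread.id, { role: 'user', content: text });

  // 3. Create a run and poll until it leaves the queued/in_progress states
  let run = await openai.beta.threads.runs.create(thread.id, { assistant_id: assistantId });
  while (run.status === 'queued' || run.status === 'in_progress') {
    await new Promise((resolve) => setTimeout(resolve, 1000));
    run = await openai.beta.threads.runs.retrieve(thread.id, run.id);
  }

  // If the assistant defines functions, run.status can be 'requires_action' here,
  // and the tool_calls in run.required_action must be answered via
  // openai.beta.threads.runs.submitToolOutputs(...) before polling again.

  // 4. Read the latest message from the thread (newest first by default)
  const messages = await openai.beta.threads.messages.list(thread.id, { limit: 1 });
  return { threadId: thread.id, reply: messages.data[0] };
}
```

On the cloud version, the same steps map onto HTTP Request nodes against https://api.openai.com/v1/threads, .../threads/{thread_id}/messages and .../threads/{thread_id}/runs, each sent with the `OpenAI-Beta: assistants=v2` header (v2 at the time of writing).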
I’ve modified the source code of the node and now I can use a Thread ID on the OpenAI message assistant node. It works perfectly for me; I’ll try to create a pull request to the main branch to make it available for everyone. I’ll keep you updated!