Enhanced OpenAI Assistant node [resolved]

Feature Request: Improvements to OpenAI Assistant Node in n8n

Overview:
This feature request proposes an enhancement to the OpenAI Assistant node in the n8n platform. The key improvement is the ability to maintain conversation continuity by attaching incoming messages to specific threads.

Current Node Interface:
The current interface includes an assistantId parameter, but it lacks the functionality to manage conversation threads effectively.

Requested Enhancements:

  1. Thread Management: Integrate the capability to attach incoming messages to particular threads. This feature would enable the node to maintain the flow of conversations over time, creating a more coherent interaction experience.
  2. API Reference: The implementation should follow the guidelines in the OpenAI API documentation, particularly the threads section (see the sketch after this list).
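For reference, here is a minimal sketch of the threads flow described above, using the official `openai` Node.js SDK (v4-style calls). The assistant ID, the helper name, and the polling interval are placeholders, not anything taken from the node itself; error handling and streaming are omitted.

```typescript
// Sketch of the Assistants "threads" flow the node would need to support:
// create a thread once, then append every later message to it by thread ID.
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function sendMessage(userText: string, existingThreadId?: string) {
  // Reuse the caller-supplied thread to keep conversation context,
  // or create a fresh thread on the first turn.
  const threadId =
    existingThreadId ?? (await openai.beta.threads.create()).id;

  // Append the user's message to that specific thread.
  await openai.beta.threads.messages.create(threadId, {
    role: "user",
    content: userText,
  });

  // Run the assistant against the whole thread and wait for completion.
  let run = await openai.beta.threads.runs.create(threadId, {
    assistant_id: "asst_xxx", // placeholder assistant ID
  });
  while (run.status === "queued" || run.status === "in_progress") {
    await new Promise((resolve) => setTimeout(resolve, 1000));
    run = await openai.beta.threads.runs.retrieve(threadId, run.id);
  }

  // The newest message in the thread is the assistant's reply.
  const messages = await openai.beta.threads.messages.list(threadId);
  const reply = messages.data[0]?.content[0];
  const text = reply?.type === "text" ? reply.text.value : "";

  // Return the thread ID so the caller can persist it for the next turn.
  return { threadId, text };
}
```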

Intended Benefits:
With these enhancements in place, the OpenAI Assistant node in n8n would be significantly better at managing complex, ongoing conversations.

My two cents on this matter: in Zapier, to let the model continue a conversation, the user's email is used as a ‘Memory Key’, together with a field for the ‘user message’ that the model responds to. This allows the model to follow up on the previous conversation.

Definitely needed if you want to use an Assistant in a chat and be able to have a conversation about documents uploaded to the assistant.

A new version of n8n has been released which includes GitHub PR 9406.

1 Like

@jan Even though the n8n OpenAI Message node (using an OpenAI assistant) does output the OpenAI thread ID, that thread ID is not passed into subsequent OpenAI messages, so a new OpenAI thread is created for every message sent.

Change request:
The proper usage of the OpenAI thread ID is to pass the existing thread ID as a parameter in the API call that adds a new message to an existing thread. This leverages the OpenAI thread to persist the conversation.

n8n's Window Buffer Memory does not seem to store the thread ID (to my knowledge).

If there is another way to properly leverage the OpenAI thread API with n8n, please let me know.
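One possible workaround until the node supports this directly, assuming a Code node placed in front of the OpenAI (or HTTP Request) node: keep the thread ID in workflow static data, which n8n's built-in `$getWorkflowStaticData` persists between production executions (not manual test runs). The `threadId` field name on the incoming item is an assumption.

```typescript
// n8n Code node sketch: persist the OpenAI thread ID between executions so a
// downstream node can call POST /v1/threads/{thread_id}/messages instead of
// starting a new thread every time.
const staticData = $getWorkflowStaticData('global');

// Prefer a thread ID supplied by the incoming item (e.g. from a webhook),
// otherwise fall back to the one stored from a previous run.
const incoming = $input.first().json;
const threadId = incoming.threadId ?? staticData.openAiThreadId ?? null;

// Remember it for the next execution once it is known.
if (threadId) {
  staticData.openAiThreadId = threadId;
}

// Pass it on; a downstream node can branch on whether threadId is set
// (create a new thread) or reuse the existing one (append a message).
return [{ json: { ...incoming, threadId } }];
```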

I can't switch from Make to n8n because the “thread_id” parameter is missing from the “Message Assistant” module.

Many of my clients are asking for an n8n solution with OpenAI Assistants, but I can't deliver one because of this problem.

All you need is to add a threadId parameter, which can be sent from a Webhook or stored in any database, as I do in Make.com.
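To illustrate the lookup pattern being described (the same idea as Zapier's ‘Memory Key’ mentioned earlier): map a user identifier such as an email to an OpenAI thread ID, so every message from that user lands in the same thread. The Map below stands in for whatever database the workflow already uses, and `getOrCreateThread` is a hypothetical helper name.

```typescript
// Sketch: key OpenAI thread IDs by user email so conversations persist per user.
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const threadsByUser = new Map<string, string>(); // email -> thread_id

async function getOrCreateThread(email: string): Promise<string> {
  const existing = threadsByUser.get(email);
  if (existing) return existing;

  // First message from this user: open a new thread and remember it.
  const thread = await openai.beta.threads.create();
  threadsByUser.set(email, thread.id);
  return thread.id;
}
```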

3 Likes

we need this feature!!

2 Likes

Yes, please add this. It will make a huge difference. @jan

2 Likes

@maxT and @oleg - curious if you have this on your feature roadmap?

A very important and necessary feature. We’re waiting!

Hey everyone, sorry it took so long. It’s coming soon: the PR is about to be reviewed, so it should be in the next minor release (1.63). In OpenAI’s assistant message operation, you’ll have the option to switch between “Use memory connector” and “Use thread ID.”

4 Likes

Looks like this will be released in v1.63.0, which is currently in pre-release. Thanks @oleg

1 Like

This is what it looks like