Workflow optimization

Hi Everyone,
I am looking for ideas to optimize a small workflow; I am using the AI branch of N8N.
My workflow looks like this:

What I am looking for is to replace the Window Buffer Memory with some kind of link to my Supabase database. The workflow generates text, but I would like it to NOT generate content similar to what is already in my database. In other words, I want that "memory" piece to rely on my database rather than a short-term memory solution.

Any suggestions or ideas on how I can achieve this?

Cheers,

Rad

Hi @Rad, are you possibly looking for a vector store approach here? This would be the “long term” memory you’re looking for, though it’s typically used to retrieve existing data rather than avoid it.
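To sketch what "avoiding" rather than "retrieving" could look like on top of a vector store: if each row in Supabase stores an embedding of its text (e.g. produced by an Embeddings node), you can compare a new candidate against the stored vectors and reject near-duplicates. This is just an illustrative sketch in the style of an n8n Code node; the function names and the similarity threshold are made up, and the embedding vectors would come from your embedding model, not be hand-written like the placeholders here.

```javascript
// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Reject a candidate if it is too close to anything already stored.
// `threshold` is an assumption you would tune for your content.
function isTooSimilar(candidateEmbedding, storedEmbeddings, threshold = 0.9) {
  return storedEmbeddings.some(
    (stored) => cosineSimilarity(candidateEmbedding, stored) >= threshold
  );
}
```

You could run this check after generation and loop back to the agent with a "try again, avoid X" instruction whenever it returns true.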

would like the workflow to NOT generate similar content to what I have in my database already

Perhaps our langchain expert @oleg has an idea on how to achieve this piece exactly?

Hi @Rad, one approach I could think of is to store the agent responses in the Supabase database. Then, when running the workflow, you fetch those responses from Supabase and pass them to the agent via the system prompt. So something like this:

Here’s an example workflow.

Of course, keep in mind that this won't work for a lot of rows with lots of tokens. In that case you'd probably want to summarize the previous content before passing it to the agent.
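As a rough sketch of how the system-prompt assembly from the example workflow might look inside an n8n Code node (function name, prompt wording, and the character budget are all illustrative, not anything prescribed by n8n):

```javascript
// Build a system prompt from responses previously fetched from Supabase,
// telling the agent what NOT to reproduce.
function buildSystemPrompt(previousResponses, maxChars = 4000) {
  let list = previousResponses.map((r, i) => `${i + 1}. ${r}`).join('\n');
  // Crude guard against blowing the context window; in practice you
  // would summarize older entries with another LLM call instead.
  if (list.length > maxChars) {
    list = list.slice(0, maxChars) + '\n[older entries truncated]';
  }
  return (
    'You are a content generator. Do NOT produce text similar to any of ' +
    'the following previously generated pieces:\n' + list
  );
}
```

The truncation branch is where the summarization step Oleg mentions would slot in once the list grows too large.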


Interesting, thank you @oleg and @MutedJam. I will give this a try and adjust it for my needs; this already looks very helpful. Many thanks again!


I solved this issue by storing the text in Google Docs, then using it in the prompt.