Suppose I have three separate flows, one with Llama, one with Mistral, and one with DeepSeek. I'd like to have them ‘talk’ to each other, and for that it would be great if each of them knew what the others had responded.
The way I’m imagining this is that they are all connected to the same memory.
It doesn’t have to be fully automated; I presume I’d need to copy-paste the reply from one agent into the input of another and trigger the response. Without shared memory, the question I ask would be completely new to each agent every time; instead, I’d like it to know what the other agent replied and respond to that.
Is this possible?
Cheers.
PS: I’m using n8n installed locally with npm, Ollama with Llama3, Mistral, and DeepSeek, and I have Airtable.
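
PPS: To make it concrete, here’s a rough sketch of the kind of shared memory I mean, written as a small standalone Python script against Ollama’s local REST API rather than as an n8n workflow. The JSON file is just a stand-in for Airtable, the model names are whatever I happen to have pulled locally, and the whole thing is only meant to illustrate the idea, not to be a finished implementation:

```python
# Sketch of the "shared memory" idea: every model reads the same transcript,
# appends its reply, and the next model sees everything said so far.
# Assumes Ollama is running locally on its default port (11434) and that the
# named models (e.g. llama3, mistral) are already pulled. The JSON file below
# is just a placeholder for Airtable.
import json
from pathlib import Path

import requests

MEMORY = Path("shared_memory.json")          # stand-in for an Airtable table
OLLAMA_CHAT = "http://localhost:11434/api/chat"


def load_memory() -> list[dict]:
    """Read the shared transcript (empty list if nothing has been said yet)."""
    return json.loads(MEMORY.read_text()) if MEMORY.exists() else []


def save_memory(messages: list[dict]) -> None:
    """Persist the shared transcript so the next agent can see it."""
    MEMORY.write_text(json.dumps(messages, indent=2))


def ask(model: str, prompt: str) -> str:
    """Send the whole shared history plus a new prompt to one model."""
    messages = load_memory() + [{"role": "user", "content": prompt}]
    resp = requests.post(
        OLLAMA_CHAT,
        json={"model": model, "messages": messages, "stream": False},
        timeout=300,
    )
    reply = resp.json()["message"]["content"]
    # Label the reply with the model name and store it, so the other
    # agents know who said what when they read the same memory.
    messages.append({"role": "assistant", "content": f"[{model}] {reply}"})
    save_memory(messages)
    return reply


if __name__ == "__main__":
    # Manual "copy-paste" step, just automated here for illustration:
    first = ask("llama3", "Propose one way to speed up npm installs.")
    second = ask("mistral", f"Another agent said:\n{first}\nDo you agree? Why?")
    print(second)
    # A third call with the DeepSeek model would work the same way.
```

In n8n terms I imagine the same pattern: each flow reads the conversation so far from Airtable, sends it to its Ollama model, and writes the reply back, so the copy-paste step only moves the trigger, not the history.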