Grouping consecutive messages

Hello! Is it possible in n8n to group consecutive messages in one workflow
before sending them to the LLM?
The problem is that clients often split one thought across several consecutive messages. The LLM
is then forced to respond to each trigger, that is, to each message, so the client receives an illogical chain of replies.
An example of such a chain:

Hello!
How are you?

Can this be avoided?

Information on your n8n setup

  • n8n version: 1.80.3
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own):
  • Running n8n via: Docker (with traefik)
  • Operating system: Ubuntu 22.04

Yes, it is possible.

This template is an example of that:

You might have to use an extra service like Redis or make custom API calls. But it is possible.
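To make the idea concrete, here is a minimal in-memory sketch of the buffering logic such a workflow would implement. In a real n8n setup the buffer would usually live in Redis (e.g. a list per chat id, flushed after a quiet period via a Wait node); the class name, the `quietMs` parameter, and the timing values below are illustrative assumptions, not part of any n8n template.

```javascript
// Sketch of message debouncing: collect consecutive messages per chat
// and only hand them to the LLM once the chat has been quiet for a while.
// In production this state would live in Redis, not in process memory.
class MessageBuffer {
  constructor(quietMs) {
    this.quietMs = quietMs;   // how long to wait for further messages (assumed value)
    this.buffers = new Map(); // chatId -> { messages, lastAt }
  }

  // Store an incoming message; nothing is sent to the LLM yet.
  push(chatId, text, now) {
    const entry = this.buffers.get(chatId) || { messages: [], lastAt: 0 };
    entry.messages.push(text);
    entry.lastAt = now;
    this.buffers.set(chatId, entry);
  }

  // Called periodically (in n8n: after a Wait node fires). If the chat
  // has been quiet long enough, return the combined text and clear it;
  // otherwise return null and keep waiting.
  flush(chatId, now) {
    const entry = this.buffers.get(chatId);
    if (!entry || now - entry.lastAt < this.quietMs) return null;
    this.buffers.delete(chatId);
    return entry.messages.join('\n');
  }
}
```

With a 5-second quiet window, "Hello!" followed one second later by "How are you?" would be flushed as a single combined message once no further message arrives, so the LLM answers the whole thought instead of each fragment.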

:point_right: If my reply answers your question please remember to mark it as the solution


I have WhatsApp and Telegram, I’ll try. Thank you very much! )))


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.