How to batch messages when users send multiple messages in chat quickly

Describe the problem/error/question

Many users' chat style is to send multiple messages in quick succession. For example:

User: Hey
User: How are you going?
User: I want to find out about your product x

Rather than a single combined message, such as:

User: Hey, how are you going? I want to find out about your product x

This sends the chat into overdrive as it tries to respond to each message in order. Can anyone suggest a way to batch messages together, so that anything received within 10 seconds of the previous message is combined and sent to the LLM as a single input?

I’ve tried multiple ways and keep failing.

What is the error message (if any)?

Please share your workflow

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hey @Sow_B,

That is an interesting thought. I don't think it is possible to do this at the moment, as each message triggers its own workflow execution.

@oleg do you have any ideas for this?

I ended up finding a way by adding incoming messages to a message buffer, so I appreciate the review. I can add the solution here if needed.
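For anyone who finds this later, here is a minimal sketch of one way to do the buffering, using two Code nodes separated by a Wait node set to 10 seconds. It assumes the Chat Trigger provides `sessionId` and `chatInput` fields, and it uses workflow static data as the buffer. Note that static data is only persisted for production executions and is not safe under heavy concurrency, so for real traffic a Redis node is a more robust store.

First Code node, placed right after the trigger:

```js
// Append the incoming message to a per-session buffer held in workflow
// static data, and stamp this execution as the most recent arrival.
const staticData = $getWorkflowStaticData('global');
const msg = $input.first().json;           // assumes { sessionId, chatInput }
const now = Date.now();

staticData.buffers = staticData.buffers || {};
const buffer = staticData.buffers[msg.sessionId] || { messages: [] };
buffer.messages.push(msg.chatInput);
buffer.lastAt = now;                       // newest arrival owns the batch
staticData.buffers[msg.sessionId] = buffer;

return [{ json: { sessionId: msg.sessionId, receivedAt: now } }];
```

Then a Wait node (Resume: After Time Interval, 10 seconds), followed by a second Code node:

```js
// If another message arrived for this session during the wait, a newer
// execution owns the batch, so this one ends with no output items.
// Otherwise drain the buffer and pass the combined text onward to the LLM.
const staticData = $getWorkflowStaticData('global');
const { sessionId, receivedAt } = $input.first().json;
const buffer = staticData.buffers?.[sessionId];

if (!buffer || buffer.lastAt !== receivedAt) {
  return []; // a newer message restarted the 10-second window
}

delete staticData.buffers[sessionId];      // reset for the next batch
return [{ json: { sessionId, chatInput: buffer.messages.join(' ') } }];
```

With this layout, only the execution that received the last message in a burst continues past the second Code node, carrying all of the buffered messages as one combined input.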
