The Question Chain in Telegram

Hey, guys.

Trying to figure out how to build a flow in which the user initiates communication with a chatbot in Telegram, the bot asks them a question (for example, what is the name of your pet) and waits for an answer in free form.

After the user answers, the chatbot should ask the next question, and so on until the user completes the entire onboarding.

Is there any easy way to wait for the chatbot's question to be answered without storing data, as mentioned here: Telegram bot message chain?

Well, this kind of bot is basically a finite-state machine.
So the key trick is to save the state somewhere between the workflow runs.

It can start with something as simple as a locally saved JSON file, with the Telegram ID of the user as the filename and your data stored inside.
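Roughly like this (a minimal sketch in plain Node/TypeScript; the file location, the state shape and the step names are just assumptions for illustration):

```typescript
import { existsSync, readFileSync, writeFileSync } from "fs";

// Hypothetical per-user state; adjust the fields to your own onboarding steps.
interface UserState {
  step: string;                     // current onboarding step, e.g. "ASK_PET_NAME"
  answers: Record<string, string>;  // answers collected so far
}

// One JSON file per user, named after the Telegram user ID (the path is an assumption).
const stateFile = (telegramId: number) => `/data/bot-state/${telegramId}.json`;

export function loadState(telegramId: number): UserState {
  const path = stateFile(telegramId);
  if (!existsSync(path)) {
    // First contact: start the onboarding from the beginning.
    return { step: "ASK_PET_NAME", answers: {} };
  }
  return JSON.parse(readFileSync(path, "utf8")) as UserState;
}

export function saveState(telegramId: number, state: UserState): void {
  writeFileSync(stateFile(telegramId), JSON.stringify(state, null, 2));
}
```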

Where do you plan to store the user answers? You can keep the current state in the same place. If you are saving answers in an SQL database, it's possible to add an extra column for the state.

Also, for this kind of bot it's crucial to have a clear roadmap of all bot states. It can be represented as a directed graph. In a database you can create 2 tables:

  1. The first table contains all bot users and their current states (one record per user).
  2. The second table contains the matrix of all states, stored in 3 columns: Step, StepText, NextStep. Whenever a user reaches a certain Step, the Telegram bot sends the StepText message and assigns NextStep as the user's status. This means that when the user replies again, they will have a different state, so you can catch it in the workflow.

Not so easy to implement, but it's quite powerful once you've done all the preparatory work.
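To make the two-table idea a bit more concrete, here is a rough sketch of the lookup logic, modelled with in-memory TypeScript structures instead of SQL (all step names and texts are invented):

```typescript
// "Second table": the state matrix with Step, StepText, NextStep (rows are invented).
interface StepRow {
  step: string;
  stepText: string;
  nextStep: string | null; // null marks the end of the onboarding
}

const stepMatrix: StepRow[] = [
  { step: "ASK_PET_NAME", stepText: "What is the name of your pet?", nextStep: "ASK_PET_AGE" },
  { step: "ASK_PET_AGE",  stepText: "How old is your pet?",          nextStep: "DONE" },
  { step: "DONE",         stepText: "Thanks, that's all I needed!",  nextStep: null },
];

// "First table": one record per user with their current state.
const currentStep = new Map<number, string>(); // telegramId -> step

// Called for every incoming message: send the text of the user's current step
// and move the user to NextStep, so the next reply arrives in a different state.
function nextQuestion(telegramId: number, incomingText: string): string {
  const step = currentStep.get(telegramId) ?? "ASK_PET_NAME";
  const row = stepMatrix.find((r) => r.step === step);
  if (!row) throw new Error(`Unknown step: ${step}`);

  // ...store incomingText as the answer belonging to the previous step here...

  if (row.nextStep) currentStep.set(telegramId, row.nextStep);
  return row.stepText; // message for the Telegram node to send
}
```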

Ohh, man…

I was afraid you'd say something like that. It's clear how to build the state machine; I was just hoping there was a simpler solution to the problem of "waiting for a response from the user". Maybe I was a little spoiled by the implementation in flowxo, where it can be done in 1 click with universal blocks for any channel.

Are you sure you don't have something similar?

I believe most bot builders have something like an FSM under the hood.

Currently, the n8n LangChain beta has a similar mechanism for the site widget with a chatbot. Meaning it can wait for user input and pass the whole conversation to the AI Agent. But AFAIK it doesn't save states, only messages. And it works only as a website widget.

I remember @jan wrote a long time ago that he didn't anticipate n8n being used for building Telegram bots, but apparently more and more users run into these use cases.

I built several bots that can save state, and the only way of doing this was an external DB with a set of predefined tables. It took me a few days to set everything up, but once you are done you can clone this DB for new bots and make the necessary adjustments.

@Ed_P that is correct.

Did you check out the Wait node with the option to wait for a webhook call? That should make it simpler, as that way you do not have to save the state externally, since all the data in the workflow is still available.
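For reference, resuming such a paused execution is just an HTTP call to the URL the Wait node exposes via $execution.resumeUrl. A minimal sketch (the URL and payload below are placeholders):

```typescript
// Assumed: the resume URL was taken from $execution.resumeUrl before the Wait node
// paused and stored somewhere the answering side can reach (placeholder value below).
const resumeUrl = "https://your-n8n.example.com/webhook-waiting/12345";

async function resumeWithAnswer(answer: string): Promise<void> {
  // POSTing to the resume URL continues the paused workflow; the request body
  // becomes the Wait node's output, so later nodes can read the user's answer.
  await fetch(resumeUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ answer }),
  });
}
```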

It would be great to settle on such a simple and convenient solution!

The only thing I haven't yet figured out is how to send $execution.resumeUrl in advance so that it does not confuse the user.

Hi @jan, the Wait node's webhook cannot be used externally and dynamically, as only a single webhook can be configured in many situations, for example when setting a webhook on Twilio or other bot providers.

The current Wait node expects us to change that webhook if we want to continue execution from a specific step, but that cannot be done on the fly, so the bot scenarios cannot work. Any ideas on this? Thanks!

Welcome to the community @Karan_Verma !

In that case you could build your own routing: always call the same URL but communicate back the execution ID, so you can build the Wait-URL dynamically (after all, its structure is simple). You can then call it from n8n with the same data.

For sure not amazing, but it would work. We could then, at a later time, build routing functionality into the core if there is a strong pull from the community.
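To illustrate the routing idea, here is a rough sketch only: the resume-URL pattern mirrors the structure of $execution.resumeUrl (verify it against your instance), and storing execution IDs per chat is an assumption:

```typescript
// Assumed: when the main workflow sends a question and pauses on the Wait node,
// it also stores its execution ID (or the full $execution.resumeUrl) keyed by chat ID.
const waitingExecutions = new Map<number, string>(); // chatId -> executionId

// Handler for the single static webhook that Telegram always posts updates to.
async function routeUpdate(update: { message: { chat: { id: number }; text: string } }) {
  const chatId = update.message.chat.id;
  const executionId = waitingExecutions.get(chatId);
  if (!executionId) return; // nothing is waiting for this user: treat as a new conversation

  // Rebuild the Wait-URL from the execution ID; this mirrors the structure of
  // $execution.resumeUrl, but check the exact pattern against your n8n instance.
  const resumeUrl = `https://your-n8n.example.com/webhook-waiting/${executionId}`;

  // Forward the user's answer so the paused workflow resumes with it.
  await fetch(resumeUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ chatId, answer: update.message.text }),
  });
}
```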