Hi n8n team,
I’m building an AI-powered Telegram automation on n8n Cloud using:
Telegram Webhook → Edit Fields → OpenAI (LLM) → HTTP Request
My goal: When I send a message to my bot, OpenAI generates a reply, and it’s sent back to Telegram.
Setup:
- Webhook Node
  - URL: https://mmussto.app.n8n.cloud/webhook/telegram-input
  - Status: Working
  - Telegram messages arrive correctly.
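  - A typical incoming update looks roughly like this (trimmed to the fields I use; real payloads carry more fields):
    {
      "message": {
        "chat": { "id": 6039014854 },
        "text": "Merhaba"
      }
    }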
- Edit Fields Node
  - Produces JSON like:
    {
      "chatId": 6039014854,
      "userMessage": "Merhaba"
    }
- LLM (OpenAI)
  - Using the Basic LLM node with my OpenAI key
  - Input: {{ $json["userMessage"] }}
  - Output: the response looks correct (no error)
- HTTP Request Node
  - Method: POST
  - URL: https://api.telegram.org/bot<MY_TOKEN>/sendMessage
  - Body (JSON):
    {
      "chat_id": {{ $json["chatId"] }},
      "text": {{ $json["llmOutput"] }}  // or a raw message for testing
    }
  - Headers: Content-Type: application/json
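When these expressions resolve, the body I expect Telegram to receive is something like the following (the text value is just a placeholder for the LLM reply):
  {
    "chat_id": 6039014854,
    "text": "Hello from the LLM"
  }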
Problems:
- I still get:
- 400: Bad Request: message text is empty
- Or 404 Not Found
- Even with hardcoded text like "text": "Hello test", it still fails
- But the same API call works outside n8n
- The OpenAI LLM node gives correct output (I checked manually)
Tried:
- New Telegram token
- Rebuilding flow twice
- Both Fixed & Expression modes
- Manual test with cURL (worked; see the example after this list)
- Setting webhook manually via BotFather (succeeded)
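For reference, the cURL call that works outside n8n is roughly this (token redacted, same chat ID as above):
  curl -X POST "https://api.telegram.org/bot<MY_TOKEN>/sendMessage" \
    -H "Content-Type: application/json" \
    -d '{"chat_id": 6039014854, "text": "Hello test"}'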
What I Need Help With:
- Why doesn't the HTTP Request node deliver the message to Telegram?
- Is there an issue with how the LLM node's output is mapped into the HTTP Request node?
- Is there something specific I must do in n8n Cloud to pass the correct body?
Thanks for your time.
Mustafa
Information on your n8n setup
- n8n version:
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app): n8n Cloud
- Operating system: