Help needed with HTTP Request node

Hi all,

I am setting up a workflow where users upload a file via Telegram chat. The file is then passed to a Set node >> HTTP Request node >> Function (Code) node >> OpenAI >> and finally back to the user as a Telegram message.

On the HTTP Request node I need to enable the options fields download = true and binary = data, however I don’t see those options in the HTTP Request node. I need some help to fix this.

Please share your workflow

(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

Share the output returned by the last node

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hey @harsh_mehta hope all is well! Welcome to the community.

When you say that you “need to enable options field for download = true and binary = data”, where is this information coming from? (Let me guess, ChatGPT?)
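
For what it’s worth, there is no `download` / `binary` pair of options on the current HTTP Request node (typeVersion 4.x); the closest equivalent appears to be Options → Response → Response Format set to “File”, which stores the downloaded body in the binary property (`data` by default). In exported workflow JSON that looks roughly like the sketch below (placeholder URL, only the relevant fields shown):

```json
{
  "parameters": {
    "url": "https://example.com/some-file.txt",
    "options": {
      "response": {
        "response": {
          "responseFormat": "file"
        }
      }
    }
  },
  "type": "n8n-nodes-base.httpRequest",
  "typeVersion": 4.2
}
```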

From the flow you explained (up to the point where things stop working for you), you have a Telegram Trigger that catches messages containing attachments; those messages are passed into the Code node (why?), and then, after whatever happens in the Code node, the flow goes into an HTTP Request node (why?).

Please explain and embed your workflow so we can try to help you.
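
Also worth noting: the Telegram Trigger can fetch the attachment by itself. The exported workflows below show `additionalFields.download` set to `false` on the trigger; if I am reading the node options right, flipping that to `true` should attach the uploaded document as binary data directly on the trigger output, which may make the manual download chain unnecessary. A sketch, only the relevant parameters shown:

```json
{
  "parameters": {
    "updates": [
      "message"
    ],
    "additionalFields": {
      "download": true
    }
  },
  "type": "n8n-nodes-base.telegramTrigger",
  "typeVersion": 1.2
}
```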

Hi Jabbson,

Here is what I am trying to do. I am creating a workflow that accepts a user query from the Telegram Trigger. I first need to download the attached file and convert it into a format where both the file and the query can be passed to OpenAI, which checks for the presence of the user’s query in the file and responds back to the Telegram app. The problem is at the step after the file is passed from the Telegram trigger to the download step: the output of the download is garbled, it does not show the original text, and hence OpenAI is not able to interpret it. I tried multiple encoding methods but the output stays garbled at that step. Right now I am testing all of this with a simple text file. I have done a lot of troubleshooting.
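
One thing worth trying in the Code node (a sketch, not the exact node from the workflows below; it assumes the upstream node stored the downloaded file in the binary property named `data` and that the Code node runs in “Run Once for All Items” mode): read the binary through n8n’s helper instead of poking at `items[0].binary` directly, which avoids the base64/Buffer guessing:

```javascript
// Sketch: decode the downloaded file as UTF-8 text inside an n8n Code node.
// Assumes the previous node stored the file under the binary property "data".
const buffer = await this.helpers.getBinaryDataBuffer(0, 'data');
const fileText = buffer.toString('utf-8');

return [{
  json: {
    fileText,
    preview: fileText.slice(0, 100), // first 100 characters, as a quick sanity check
  },
}];
```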

Workflow 4
```json
{
  "name": "My workflow",
  "nodes": [
    {
      "parameters": {
        "updates": [
          "message"
        ],
        "additionalFields": {
          "download": false
        }
      },
      "type": "n8n-nodes-base.telegramTrigger",
      "typeVersion": 1.2,
      "position": [
        0,
        0
      ],
      "id": "f68ef9b6-9879-4c03-87ee-4aa1163f6a3c",
      "name": "Telegram Trigger",
      "webhookId": "508d9a6c-8401-476a-a8b3-419b977aa888",
      "credentials": {
        "telegramApi": {
          "id": "vORWe2WtU8tfdR4n",
          "name": "Telegram account"
        }
      }
    },
    {
      "parameters": {
        "chatId": "=5570155356",
        "text": "{{$node[\"OpenAI1\"].json[\"output\"] || \"⚠️ OpenAI did not return a response. Please check if the file is valid text.\"}}",
        "additionalFields": {}
      },
      "type": "n8n-nodes-base.telegram",
      "typeVersion": 1.2,
      "position": [
        1260,
        380
      ],
      "id": "446f1ac2-f7bd-4e00-9784-350c90166563",
      "name": "Telegram1",
      "webhookId": "f157399c-3e10-41a3-944c-20538e3fccd2",
      "credentials": {
        "telegramApi": {
          "id": "Rtla4b5oJuuIqjIX",
          "name": "Telegram account 2"
        }
      }
    },
    {
      "parameters": {
        "jsCode": "// Safely extract from either raw Buffer or base64\nlet fileText = \"\";\n\nconst binary = items[0].binary.data;\n\nif (binary.data && typeof binary.data === \"string\") {\n  // Case: base64-encoded string\n  fileText = Buffer.from(binary.data, 'base64').toString('utf-8');\n} else {\n  // Case: raw binary Buffer\n  fileText = Buffer.from(binary).toString('utf-8');\n}\n\n// Output to console and result\nconsole.log(\"Decoded fileText:\", fileText);\n\nreturn [{\n  json: {\n    fileText,\n    preview: fileText.slice(0, 100) // first 100 chars\n  }\n}];"
      },
      "type": "n8n-nodes-base.code",
      "typeVersion": 2,
      "position": [
        460,
        360
      ],
      "id": "984b9c47-10b9-4001-98a9-32def4203b21",
      "name": "Function Code"
    },
    {
      "parameters": {
        "assignments": {
          "assignments": [
            {
              "id": "4e79c0e2-41f5-4fa6-ba01-a37117e9369b",
              "name": "file_id",
              "value": "={{$json[\"message\"][\"document\"][\"file_id\"]}}",
              "type": "string"
            },
            {
              "id": "e40bcba2-e794-4861-8a8b-4359ae3235b3",
              "name": "file_name",
              "value": "={{$json[\"message\"][\"document\"][\"file_name\"]}}",
              "type": "string"
            }
          ]
        },
        "options": {}
      },
      "type": "n8n-nodes-base.set",
      "typeVersion": 3.4,
      "position": [
        220,
        0
      ],
      "id": "8e92db34-0382-4e9d-bab7-8364878bf561",
      "name": "Set Node"
    },
    {
      "parameters": {
        "url": "={{`https://api.telegram.org/file/bot8160868043:AAG5KT-Td1LcXwAYOX9YUgUBLJRhVLf_2Ik/${$json[\"result\"][\"file_path\"]}`}}",
        "options": {
          "response": {
            "responseFormat": "file"
          },
          "download": true,
          "name": "data"
        }
      },
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [
        660,
        0
      ],
      "id": "7e8daf8c-3f86-44e0-8144-bd88f9958b63",
      "name": "HTTP Request"
    },
    {
      "parameters": {
        "resource": "file",
        "fileId": "={{$node[\"Telegram Trigger\"].json[\"message\"][\"document\"][\"file_id\"]}}"
      },
      "type": "n8n-nodes-base.telegram",
      "typeVersion": 1.2,
      "position": [
        440,
        0
      ],
      "id": "83f4f2f5-dfdc-4fe8-bb52-ed625ea1f53e",
      "name": "Get a file",
      "webhookId": "ca0f2e3c-da92-4119-83e0-1bb665a8696a",
      "credentials": {
        "telegramApi": {
          "id": "Rtla4b5oJuuIqjIX",
          "name": "Telegram account 2"
        }
      }
    },
    {
      "parameters": {
        "modelId": {
          "__rl": true,
          "value": "gpt-3.5-turbo",
          "mode": "list",
          "cachedResultName": "GPT-3.5-TURBO"
        },
        "messages": {
          "values": [
            {
              "content": "={{`Check whether the word \"vxlan\" is present in the following file:\\n\\n${$json[\"fileText\"]}`}}"
            }
          ]
        },
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.openAi",
      "typeVersion": 1.8,
      "position": [
        660,
        360
      ],
      "id": "4d6ec14b-07e1-458f-860d-60a29db138b9",
      "name": "Message a model",
      "credentials": {
        "openAiApi": {
          "id": "ArdkzaC7xaIBxFjD",
          "name": "OpenAi account"
        }
      }
    }
  ],
  "pinData": {},
  "connections": {
    "Telegram Trigger": {
      "main": [
        [
          {
            "node": "Set Node",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function Code": {
      "main": [
        [
          {
            "node": "Message a model",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Set Node": {
      "main": [
        [
          {
            "node": "Get a file",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "HTTP Request": {
      "main": [
        [
          {
            "node": "Function Code",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get a file": {
      "main": [
        [
          {
            "node": "HTTP Request",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Message a model": {
      "main": [
        [
          {
            "node": "Telegram1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "active": false,
  "settings": {
    "executionOrder": "v1"
  },
  "versionId": "743e1c3e-c3ef-4b78-8ee8-3b6176158660",
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "416b0f0305782c235cfdb75ecf61044f6b3a48058a61053a7c8de3ac165a114e"
  },
  "id": "ZxPUTfttUvSSTvKO",
  "tags": []
}
```

Workflow 3

```json
{
  "name": "My workflow",
  "nodes": [
    {
      "parameters": {
        "updates": [
          "message"
        ],
        "additionalFields": {
          "download": false
        }
      },
      "type": "n8n-nodes-base.telegramTrigger",
      "typeVersion": 1.2,
      "position": [
        0,
        0
      ],
      "id": "f68ef9b6-9879-4c03-87ee-4aa1163f6a3c",
      "name": "Telegram Trigger",
      "webhookId": "508d9a6c-8401-476a-a8b3-419b977aa888",
      "credentials": {
        "telegramApi": {
          "id": "vORWe2WtU8tfdR4n",
          "name": "Telegram account"
        }
      }
    },
    {
      "parameters": {
        "chatId": "=5570155356",
        "text": "{{$node[\"OpenAI1\"].json[\"output\"] || \"⚠️ OpenAI did not return a response. Please check if the file is valid text.\"}}",
        "additionalFields": {}
      },
      "type": "n8n-nodes-base.telegram",
      "typeVersion": 1.2,
      "position": [
        1260,
        380
      ],
      "id": "446f1ac2-f7bd-4e00-9784-350c90166563",
      "name": "Telegram1",
      "webhookId": "f157399c-3e10-41a3-944c-20538e3fccd2",
      "credentials": {
        "telegramApi": {
          "id": "Rtla4b5oJuuIqjIX",
          "name": "Telegram account 2"
        }
      }
    },
    {
      "parameters": {
        "jsCode": "// Safely extract from either raw Buffer or base64\nlet fileText = \"\";\n\nconst binary = items[0].binary.data;\n\nif (binary.data && typeof binary.data === \"string\") {\n  // Case: base64-encoded string\n  fileText = Buffer.from(binary.data, 'base64').toString('utf-8');\n} else {\n  // Case: raw binary Buffer\n  fileText = Buffer.from(binary).toString('utf-8');\n}\n\n// Output to console and result\nconsole.log(\"Decoded fileText:\", fileText);\n\nreturn [{\n  json: {\n    fileText,\n    preview: fileText.slice(0, 100) // first 100 chars\n  }\n}];"
      },
      "type": "n8n-nodes-base.code",
      "typeVersion": 2,
      "position": [
        460,
        360
      ],
      "id": "984b9c47-10b9-4001-98a9-32def4203b21",
      "name": "Function Code"
    },
    {
      "parameters": {
        "assignments": {
          "assignments": [
            {
              "id": "4e79c0e2-41f5-4fa6-ba01-a37117e9369b",
              "name": "file_id",
              "value": "={{$json[\"message\"][\"document\"][\"file_id\"]}}",
              "type": "string"
            },
            {
              "id": "e40bcba2-e794-4861-8a8b-4359ae3235b3",
              "name": "file_name",
              "value": "={{$json[\"message\"][\"document\"][\"file_name\"]}}",
              "type": "string"
            }
          ]
        },
        "options": {}
      },
      "type": "n8n-nodes-base.set",
      "typeVersion": 3.4,
      "position": [
        220,
        0
      ],
      "id": "8e92db34-0382-4e9d-bab7-8364878bf561",
      "name": "Set Node"
    },
    {
      "parameters": {
        "url": "={{`https://api.telegram.org/file/bot8160868043:AAG5KT-Td1LcXwAYOX9YUgUBLJRhVLf_2Ik/${$json[\"result\"][\"file_path\"]}`}}",
        "options": {
          "response": {
            "response": {
              "responseFormat": "file"
            }
          }
        }
      },
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [
        660,
        0
      ],
      "id": "7e8daf8c-3f86-44e0-8144-bd88f9958b63",
      "name": "HTTP Request"
    },
    {
      "parameters": {
        "resource": "file",
        "fileId": "={{$node[\"Telegram Trigger\"].json[\"message\"][\"document\"][\"file_id\"]}}"
      },
      "type": "n8n-nodes-base.telegram",
      "typeVersion": 1.2,
      "position": [
        440,
        0
      ],
      "id": "83f4f2f5-dfdc-4fe8-bb52-ed625ea1f53e",
      "name": "Get a file",
      "webhookId": "ca0f2e3c-da92-4119-83e0-1bb665a8696a",
      "credentials": {
        "telegramApi": {
          "id": "Rtla4b5oJuuIqjIX",
          "name": "Telegram account 2"
        }
      }
    },
    {
      "parameters": {
        "modelId": {
          "__rl": true,
          "value": "gpt-3.5-turbo",
          "mode": "list",
          "cachedResultName": "GPT-3.5-TURBO"
        },
        "messages": {
          "values": [
            {
              "content": "={{`Check whether the word \"vxlan\" is present in the following file:\\n\\n${$json[\"fileText\"]}`}}"
            }
          ]
        },
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.openAi",
      "typeVersion": 1.8,
      "position": [
        660,
        360
      ],
      "id": "4d6ec14b-07e1-458f-860d-60a29db138b9",
      "name": "Message a model",
      "credentials": {
        "openAiApi": {
          "id": "ArdkzaC7xaIBxFjD",
          "name": "OpenAi account"
        }
      }
    }
  ],
  "pinData": {},
  "connections": {
    "Telegram Trigger": {
      "main": [
        [
          {
            "node": "Set Node",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function Code": {
      "main": [
        [
          {
            "node": "Message a model",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Set Node": {
      "main": [
        [
          {
            "node": "Get a file",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "HTTP Request": {
      "main": [
        [
          {
            "node": "Function Code",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get a file": {
      "main": [
        [
          {
            "node": "HTTP Request",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Message a model": {
      "main": [
        [
          {
            "node": "Telegram1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "active": false,
  "settings": {
    "executionOrder": "v1"
  },
  "versionId": "743e1c3e-c3ef-4b78-8ee8-3b6176158660",
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "416b0f0305782c235cfdb75ecf61044f6b3a48058a61053a7c8de3ac165a114e"
  },
  "id": "ZxPUTfttUvSSTvKO",
  "tags": []
}
```


Hi Jason / team, any update on the above query? Do you have a suggestion?

It appears the above post is marked as Solution.

If it isn’t:

  • please format your workflows appropriately with code tags.
  • include the file that is sent from the Telegram node and is “garbled” after downloading

```json
{
  "name": "My workflow",
  "nodes": [
    {
      "parameters": {
        "updates": [
          "message"
        ],
        "additionalFields": {
          "download": false
        }
      },
      "type": "n8n-nodes-base.telegramTrigger",
      "typeVersion": 1.2,
      "position": [
        0,
        0
      ],
      "id": "f68ef9b6-9879-4c03-87ee-4aa1163f6a3c",
      "name": "Telegram Trigger",
      "webhookId": "508d9a6c-8401-476a-a8b3-419b977aa888",
      "credentials": {
        "telegramApi": {
          "id": "vORWe2WtU8tfdR4n",
          "name": "Telegram account"
        }
      }
    },
    {
      "parameters": {
        "chatId": "=5570155356",
        "text": "{{$node[\"OpenAI1\"].json[\"output\"] || \"⚠️ OpenAI did not return a response. Please check if the file is valid text.\"}}",
        "additionalFields": {}
      },
      "type": "n8n-nodes-base.telegram",
      "typeVersion": 1.2,
      "position": [
        1260,
        380
      ],
      "id": "446f1ac2-f7bd-4e00-9784-350c90166563",
      "name": "Telegram1",
      "webhookId": "f157399c-3e10-41a3-944c-20538e3fccd2",
      "credentials": {
        "telegramApi": {
          "id": "Rtla4b5oJuuIqjIX",
          "name": "Telegram account 2"
        }
      }
    },
    {
      "parameters": {
        "jsCode": "// Safely extract from either raw Buffer or base64\nlet fileText = \"\";\n\nconst binary = items[0].binary.data;\n\nif (binary.data && typeof binary.data === \"string\") {\n  // Case: base64-encoded string\n  fileText = Buffer.from(binary.data, 'base64').toString('utf-8');\n} else {\n  // Case: raw binary Buffer\n  fileText = Buffer.from(binary).toString('utf-8');\n}\n\n// Output to console and result\nconsole.log(\"Decoded fileText:\", fileText);\n\nreturn [{\n  json: {\n    fileText,\n    preview: fileText.slice(0, 100) // first 100 chars\n  }\n}];"
      },
      "type": "n8n-nodes-base.code",
      "typeVersion": 2,
      "position": [
        460,
        360
      ],
      "id": "984b9c47-10b9-4001-98a9-32def4203b21",
      "name": "Function Code"
    },
    {
      "parameters": {
        "assignments": {
          "assignments": [
            {
              "id": "4e79c0e2-41f5-4fa6-ba01-a37117e9369b",
              "name": "file_id",
              "value": "={{$json[\"message\"][\"document\"][\"file_id\"]}}",
              "type": "string"
            },
            {
              "id": "e40bcba2-e794-4861-8a8b-4359ae3235b3",
              "name": "file_name",
              "value": "={{$json[\"message\"][\"document\"][\"file_name\"]}}",
              "type": "string"
            }
          ]
        },
        "options": {}
      },
      "type": "n8n-nodes-base.set",
      "typeVersion": 3.4,
      "position": [
        220,
        0
      ],
      "id": "8e92db34-0382-4e9d-bab7-8364878bf561",
      "name": "Set Node"
    },
    {
      "parameters": {
        "url": "={{`https://api.telegram.org/file/bot8160868043:AAG5KT-Td1LcXwAYOX9YUgUBLJRhVLf_2Ik/${$json[\"result\"][\"file_path\"]}`}}",
        "options": {
          "response": {
            "response": {
              "responseFormat": "file"
            }
          }
        }
      },
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [
        660,
        0
      ],
      "id": "7e8daf8c-3f86-44e0-8144-bd88f9958b63",
      "name": "HTTP Request"
    },
    {
      "parameters": {
        "resource": "file",
        "fileId": "={{$node[\"Telegram Trigger\"].json[\"message\"][\"document\"][\"file_id\"]}}"
      },
      "type": "n8n-nodes-base.telegram",
      "typeVersion": 1.2,
      "position": [
        440,
        0
      ],
      "id": "83f4f2f5-dfdc-4fe8-bb52-ed625ea1f53e",
      "name": "Get a file",
      "webhookId": "ca0f2e3c-da92-4119-83e0-1bb665a8696a",
      "credentials": {
        "telegramApi": {
          "id": "Rtla4b5oJuuIqjIX",
          "name": "Telegram account 2"
        }
      }
    },
    {
      "parameters": {
        "modelId": {
          "__rl": true,
          "value": "gpt-3.5-turbo",
          "mode": "list",
          "cachedResultName": "GPT-3.5-TURBO"
        },
        "messages": {
          "values": [
            {
              "content": "={{`Check whether the word \"vxlan\" is present in the following file:\\n\\n${$json[\"fileText\"]}`}}"
            }
          ]
        },
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.openAi",
      "typeVersion": 1.8,
      "position": [
        660,
        360
      ],
      "id": "4d6ec14b-07e1-458f-860d-60a29db138b9",
      "name": "Message a model",
      "credentials": {
        "openAiApi": {
          "id": "ArdkzaC7xaIBxFjD",
          "name": "OpenAi account"
        }
      }
    }
  ],
  "pinData": {},
  "connections": {
    "Telegram Trigger": {
      "main": [
        [
          {
            "node": "Set Node",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Function Code": {
      "main": [
        [
          {
            "node": "Message a model",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Set Node": {
      "main": [
        [
          {
            "node": "Get a file",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "HTTP Request": {
      "main": [
        [
          {
            "node": "Function Code",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get a file": {
      "main": [
        [
          {
            "node": "HTTP Request",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Message a model": {
      "main": [
        [
          {
            "node": "Telegram1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "active": false,
  "settings": {
    "executionOrder": "v1"
  },
  "versionId": "743e1c3e-c3ef-4b78-8ee8-3b6176158660",
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "416b0f0305782c235cfdb75ecf61044f6b3a48058a61053a7c8de3ac165a114e"
  },
  "id": "ZxPUTfttUvSSTvKO",
  "tags": []
}
```
![Screenshot 2025-07-27 at 2.03.17 PM|690x440](upload://j7yWIit0uG15iOK3j5zkQwt8BAx.png)

What about

  • include the file that is sent from the Telegram node and is “garbled” after downloading

Nevermind, I see how the file gets “garbled”.

This is probably what you are trying to do:

If this was helpful please mark the answer as solution.

Hi Jabbson, any possible solution?

Hi, did you see the workflow in the previous post?

Hi Jabbson,

It did work. I have an additional query: if I change the file from a .txt file to a .pcap file, which is a raw packet capture, what changes will be required?

I don’t know what changes will be required. What changes do you expect to be required?

The output of the Function Code node is binary data. I am not sure what format the .pcap file has to be converted to so that it can be readable by OpenAI.

I’d assume it has to be converted to a text format…
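
To make that concrete: a .pcap is a structured binary capture format, so decoding it as UTF-8 in the Code node will always look garbled; it would have to be turned into text first (for example with a packet-analysis tool such as tshark) before an LLM can read it. A minimal sketch just to illustrate that these are structured bytes rather than text, assuming the classic libpcap format (not pcapng) and that `buffer` came from something like `this.helpers.getBinaryDataBuffer(0, 'data')`:

```javascript
// Sketch: inspect the 24-byte libpcap global header of a .pcap buffer.
function describePcapHeader(buffer) {
  const magic = buffer.readUInt32LE(0);
  // 0xa1b2c3d4 = native byte order, 0xd4c3b2a1 = byte-swapped capture
  const littleEndian = magic === 0xa1b2c3d4;
  const bigEndian = magic === 0xd4c3b2a1;
  if (!littleEndian && !bigEndian) {
    return { isPcap: false };
  }
  const read16 = littleEndian ? buffer.readUInt16LE.bind(buffer) : buffer.readUInt16BE.bind(buffer);
  const read32 = littleEndian ? buffer.readUInt32LE.bind(buffer) : buffer.readUInt32BE.bind(buffer);
  return {
    isPcap: true,
    versionMajor: read16(4),   // file format version, usually 2.4
    versionMinor: read16(6),
    snapLength: read32(16),    // max captured length per packet
    linkType: read32(20),      // 1 = Ethernet
  };
}
```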

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.