Is it possible to speed up the response time of an n8n agent bot?

I built a chatbot workflow in n8n. It has a chat message node, an AI Agent node, a model, and a tool node. The tool node only loads a file from an AWS server, and queries are answered based on the file it loads. I have enabled streaming responses, but the bot's response time is still very long. The file is not very big, yet it takes a long time before the streamed text output starts. What could be the reason the response time is so long?

@minshi-veyt Your chatbot is slow because, for every question, it first waits to download the file from AWS, then forces the AI to read the entire file into its context, and only then starts answering. Each of those steps adds latency before the first token of the stream arrives.
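One common mitigation (not stated in the thread, just a sketch) is to fetch the file once and reuse it for subsequent questions instead of re-downloading it on every query. The snippet below is a minimal in-memory cache with a TTL in plain JavaScript; `downloadFile` is a hypothetical stand-in for the real AWS fetch. In an n8n Code node you could persist the cached contents across executions via workflow static data instead of a local variable.

```javascript
// Minimal sketch: cache the downloaded file in memory with a TTL,
// so repeated questions reuse it instead of re-downloading.
// `downloadFile` is a hypothetical stand-in for the real AWS fetch.

function makeCachedLoader(downloadFile, ttlMs = 5 * 60 * 1000) {
  let cached = null;  // last downloaded file contents
  let fetchedAt = 0;  // timestamp of the last download

  return async function load() {
    const now = Date.now();
    if (cached !== null && now - fetchedAt < ttlMs) {
      return cached;  // cache hit: skip the network round trip
    }
    cached = await downloadFile();  // cache miss: download once
    fetchedAt = now;
    return cached;
  };
}

// Example: the "download" runs only once for repeated queries.
let downloads = 0;
const load = makeCachedLoader(async () => {
  downloads += 1;
  return "file contents";
});
```

This doesn't make the model read the file any faster, but it removes the per-question download from the critical path, which is usually the easiest win.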