This model’s maximum context length is 128000 tokens. However, your messages resulted in 162104 tokens (161963 in the messages, 141 in the functions). Please reduce the length of the messages or functions.
‘Tokens’ refers to the amount of text the OpenAI chat model can accept. The error is basically telling you that you’re sending too much text; the OpenAI model you’re using only supports 128,000 tokens (roughly 96,000 words’ worth of text).
Whatever data you’re sending to the LLM is simply too large.
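If you want to keep the workflow from hitting that limit, one option is to trim the text before it ever reaches the OpenAI node. Here is a minimal sketch you could adapt for a Code node; it uses the rough ~4 characters-per-token heuristic (an approximation, not an exact count), and the 100,000-token budget and variable names are just placeholders for illustration:

```typescript
// Rough sketch: cut the input down to an approximate token budget before
// it is sent to the model. The 4-chars-per-token ratio is only a heuristic.
const MAX_TOKENS = 100_000;          // placeholder budget, below the 128k limit
const APPROX_CHARS_PER_TOKEN = 4;    // rough average for English text

function truncateToTokenBudget(text: string, maxTokens: number = MAX_TOKENS): string {
  const maxChars = maxTokens * APPROX_CHARS_PER_TOKEN;
  return text.length <= maxChars ? text : text.slice(0, maxChars);
}

// Example: shorten an incoming message collected from earlier workflow steps.
const incoming = "...very long text from previous nodes...";
const trimmed = truncateToTokenBudget(incoming);
console.log(`Sending roughly ${Math.ceil(trimmed.length / APPROX_CHARS_PER_TOKEN)} tokens`);
```

Truncating is the bluntest fix; trimming chat history or summarising earlier messages before passing them on achieves the same goal with less information loss.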
Thank you very much for your reply. I appreciate you letting me know it is an OpenAI issue and not an n8n one.
May I know how to change the settings so that I use fewer tokens with OpenAI?
Right now I am not able to send text. For example, when I send my prompt as text through Telegram, I get the error message, and when I click the link it redirects me to an n8n page asking me to top up €26.
However, I am able to send audio messages or images through Telegram with no issue.