This model’s maximum context length is 128000 tokens. However, your messages resulted in 141942 tokens (141902 in the messages, 40 in the functions). Please reduce the length of the messages or functions.
Can a filter solve this problem?
Hey @Ali_115, hope all is good. Welcome to the community.

A filter would most definitely help here, provided it significantly reduces the number of rows returned.
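To make the idea concrete, here is a minimal sketch of trimming a large function result before it goes back into the messages. The ~4-characters-per-token heuristic and the `MAX_RESULT_TOKENS` budget are illustrative assumptions, not exact values; for precise counts you would use a proper tokenizer.

```python
MAX_RESULT_TOKENS = 4000  # assumed budget for the function result


def estimate_tokens(text: str) -> int:
    # Rough heuristic: roughly 4 characters per token for English text.
    return len(text) // 4


def filter_rows(rows: list[str], max_tokens: int = MAX_RESULT_TOKENS) -> list[str]:
    """Keep rows until the estimated token budget is exhausted."""
    kept, used = [], 0
    for row in rows:
        cost = estimate_tokens(row)
        if used + cost > max_tokens:
            break
        kept.append(row)
        used += cost
    return kept


# Example: 10,000 rows would blow past the context window,
# so only a budget-sized prefix is kept.
rows = [f"row {i}: " + "x" * 100 for i in range(10_000)]
trimmed = filter_rows(rows)
print(len(trimmed))
```

The same budgeting approach works for any tool output: estimate the cost of each piece, keep the most relevant pieces first (here simply the earliest rows), and stop before the total would exceed the model's context limit.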
This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.