I am using the conversational agent node. I have a spreadsheet with about 15k rows of investment data, and I'd like to chat with the agent and ask questions about my portfolio.
I have a workflow tool attached to my workflow that fetches the Google Sheet data. When I use a ChatGPT model as the AI model, it keeps returning an error saying I'm hitting ChatGPT's rate limits.
Is there a better way to do this? I'm not sure how to proceed without hitting rate limits, because the data contains a large number of characters.
If each tool call is returning a lot of data, that could be what's triggering the rate limit, since the limits apply to tokens as well as requests. Reducing the amount of data the tool returns should help.
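One way to shrink what the tool returns is to pre-aggregate the rows before they reach the model, so the prompt carries a short summary instead of all 15k rows. This is only a sketch: the column names (`ticker`, `value`) are hypothetical stand-ins for whatever your sheet actually contains.

```python
# Sketch: collapse raw spreadsheet rows into a compact per-ticker summary
# so the tool output stays small. Column names are hypothetical examples.
from collections import defaultdict

def summarize_portfolio(rows):
    """Sum the 'value' column per 'ticker' and return a short text summary."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["ticker"]] += float(row["value"])
    # A few summary lines cost far fewer tokens than thousands of raw rows
    return "\n".join(f"{t}: {v:.2f}" for t, v in sorted(totals.items()))

rows = [
    {"ticker": "AAPL", "value": "100.5"},
    {"ticker": "MSFT", "value": "50.0"},
    {"ticker": "AAPL", "value": "25.0"},
]
print(summarize_portfolio(rows))
```

You could also filter rows inside the tool based on the user's question (e.g. only return rows for the ticker they asked about) rather than always returning the whole sheet.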
The OpenAI Cookbook has a Python notebook explaining how to avoid rate limit errors, along with an example Python script for staying under rate limits while batch-processing API requests.
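The core idea in that notebook is retrying with exponential backoff. A minimal sketch of the pattern, where `RateLimitError` is a stand-in for whatever exception your client library actually raises:

```python
# Retry-with-exponential-backoff sketch: wait longer after each rate-limit
# error before retrying. RateLimitError is a hypothetical placeholder for
# the real exception raised by your API client.
import random
import time

class RateLimitError(Exception):
    pass

def retry_with_backoff(func, max_retries=5, base_delay=1.0):
    """Call func(); on a rate-limit error, sleep exponentially longer and retry."""
    for attempt in range(max_retries):
        try:
            return func()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            # Random jitter spreads out retries from concurrent callers
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```

You would wrap each API call in `retry_with_backoff`, e.g. `retry_with_backoff(lambda: client.chat.completions.create(...))`, though exact client call names depend on your library version.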