Describe the problem/error/question
prompt is too long: 207607 tokens > 200000 maximum
I don’t know exactly how the AI Agent is built, but according to the documentation it has your workflow and at least some prior chat messages as context, which usually get added to your prompt’s token count. Do you have a lot of nodes in your workflow, or maybe very big Code nodes or similar?
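If you want a rough sense of whether your workflow JSON alone could blow past the limit, a quick back-of-the-envelope check helps. This is just a sketch: the ~4 characters per token heuristic is a common approximation, not the model's actual tokenizer, and the `CONTEXT_LIMIT` value is taken from the error message above.

```python
import json

CONTEXT_LIMIT = 200_000  # maximum prompt size, per the error message

def estimate_tokens(text: str) -> int:
    """Very rough estimate: ~4 characters per token for English text/JSON."""
    return len(text) // 4

def workflow_token_estimate(workflow: dict) -> int:
    """Estimate how many tokens a workflow's JSON would add to the prompt."""
    return estimate_tokens(json.dumps(workflow))

# Hypothetical workflow with one very large Code node (~1 MB of code)
workflow = {
    "nodes": [
        {"name": "Code", "type": "n8n-nodes-base.code",
         "parameters": {"jsCode": "x" * 1_000_000}},
    ]
}

estimate = workflow_token_estimate(workflow)
print(f"~{estimate} tokens; over limit: {estimate > CONTEXT_LIMIT}")
```

If the estimate for your real workflow JSON (exported via "Download") is anywhere near the limit on its own, trimming large Code nodes or pinned data is probably the first thing to try.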
If you’re using the agent on a different workflow, does the message show up there as well?
It happens, but it’s fairly rare. I’ve had it happen twice with just simple workflows. It thinks for a while, tries to validate a few workflows, then just errors out. I would just close the editor (after saving any changes), then come back and have it try again. This usually does the trick for me. There’s a similar question to this: Prompt is too long: 203416 tokens > 200000 maximum - #6 by KhemOptimal
There you can see the small workflow where this happened to me. The OP of that question has a small workflow as well.