Prompt is too long: 203416 tokens > 200000 maximum

OK, @grandcazino. I've run into this recently too. There's not much we can do, unfortunately; I guess that's why the AI is in beta. I just closed the editor, came back in, and had it try again. The model can get into a long thinking episode and use up a lot of tokens, and 200,000 tokens is the context limit for many LLMs.

Khem