Prompt is too long: 203416 tokens > 200000 maximum

(See subject line.)

It appears you've hit the maximum token limit of 200k.

If you remove 3416 tokens from your prompt, problem solved :sweat_smile:

How can I remove tokens if it's the built-in n8n wizard that's writing the prompt?

I’ve seen quite a few n8n AI wizard errors, but this one is new. Are you able to get past this? Of course, the AI needs context to give you recommendations. So, were you working on a very long conversation with the wizard? Do you have a large workflow with many AI nodes, such as various agents, with large prompts?

Khem

Here is my process https://s01.pic4net.com/di-4VK5LN.png
*The AI generates 5 quotes.

OK, @grandcazino. I've encountered this one recently too. There's not much we can do, unfortunately; I guess that's why the AI is in beta. I just closed the editor, came back in, and had it try again. The wizard can get into a long thinking episode and use up lots of tokens, and 200,000 tokens is the context limit for a good number of LLMs.
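If you want a rough sense of whether a workflow's prompt material is anywhere near the 200k limit before handing it to the wizard, a quick heuristic is ~4 characters per token for English text. This is only a sketch, not the model's real tokenizer, and the function names here are just illustrative:

```python
# Rough sketch: estimate whether prompt text fits under a 200k token limit.
# The ~4 characters-per-token ratio is a common heuristic for English text;
# an exact count would require the model's own tokenizer.
TOKEN_LIMIT = 200_000  # limit reported in the error message above

def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token)."""
    return len(text) // 4

def fits(text: str, limit: int = TOKEN_LIMIT) -> bool:
    """True if the estimated token count is within the limit."""
    return estimate_tokens(text) <= limit

# Example: a prompt of 820,000 characters estimates to ~205,000 tokens,
# which would overshoot the limit, matching the error in this thread.
big_prompt = "x" * 820_000
print(estimate_tokens(big_prompt), fits(big_prompt))
```

Since the wizard builds its own context from the workflow, the practical use of an estimate like this is deciding what to trim yourself: shortening agent prompts, removing unused AI nodes, or starting a fresh wizard conversation so the accumulated history is dropped.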

Khem

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.