OpenAI node model gets a 431 error for large inputs

Describe the problem/error/question

I'm experimenting with analyzing big documents (5-10K tokens) and use gpt-3.5-turbo-1106 for that. I found that when I input that much text, the node shows an error:
[screenshot of the 431 error on the node]

The workflow still works despite the error (HTTP 431 is "Request Header Fields Too Large"). I checked several input sizes to find the breaking point; it's somewhere around 12,000 characters. Check out the two nodes in the screenshots below, and the chunking sketch after them.
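
As a stopgap until the node is fixed, one workaround is to split the document into chunks that stay under the breaking point before it reaches the OpenAI node. Here is a minimal sketch for an n8n Code node in "Run Once for All Items" mode; the `text` field name and the 11,000-character chunk size are my assumptions, adjust them to your data:

```javascript
// Split each incoming item's `text` field into chunks that stay
// below the ~12,000-character level where the 431 error appears.
// Field name and limit are assumptions; adjust to your workflow.
const MAX_CHARS = 11000;

const out = [];
for (const item of $input.all()) {
  const text = item.json.text ?? '';
  for (let i = 0; i < text.length; i += MAX_CHARS) {
    out.push({ json: { text: text.slice(i, i + MAX_CHARS) } });
  }
}
return out;
```

Each chunk then flows through the OpenAI node as a separate item, so no single request exceeds the size that triggers the error.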

What is the error message (if any)?

[screenshot of the error message]

Please share your workflow

Information on your n8n setup

  • n8n version: 1.21.1
  • Database (default: SQLite): default
  • n8n EXECUTIONS_PROCESS setting (default: own, main): default
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker Compose
  • Operating system: Ubuntu 22

Hey @artildo,

Thanks for reporting this one. This is something we are aware of but have not yet fixed. Our internal reference for this issue is PAY-997 if you need it for your notes.

I have no ETA on a fix, but I have just bumped the issue with the info from this post as well.



New version [email protected] was released, which includes GitHub PR 9384.