OpenAI output ends because of length. How to fix (continue & merge text)?

Describe the problem/error/question

Hi all.

I’m creating blog posts with the OpenAI node and so far so good.
What I’m trying to do is write as long a blog post as possible.
The issue is that when I run the OpenAI node, it generates only half of the article: the output hits the maximum token limit and gets cut off.

How do I tell OpenAI to continue writing the output (if there is more), and then merge the text together? Any thoughts?

Thanks for a fantastic community!

@giulioandreini do you maybe have any input on this one? :slight_smile:

Hi @Mec ,
this sounds like a problem related to OpenAI rather than n8n itself.

A couple of ideas you could investigate:

  1. Investigate whether, using the ChatGPT operation, it is possible to send the command "continue" to have it write the second part of the article. This usually works in the ChatGPT interface, but I’m not sure it works with the API (out of the box it has no memory of what it wrote in the previous message, so you would need to send the earlier exchange back yourself). You can probably find some discussion about it in the OpenAI community.
  2. You could start by asking it to produce the structure of the article and define the title of each section. You could set a fixed number of sections (e.g. 5). Then you could iterate over each section and ask it to write the text of that section (providing a prompt with the context of the article and the title of the section). At the end, you could assemble all the sections into the complete article.
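For idea 1, the useful signal is the `finish_reason` field the chat completions API returns with each choice: when it equals `"length"`, the reply was truncated by the token limit, and you can append the partial answer to the message history and ask the model to continue. Here is a minimal sketch of that loop; `call_model` is a hypothetical stand-in for your actual OpenAI call (or the OpenAI node in n8n), which you would supply yourself:

```python
def generate_full_article(call_model, prompt, max_rounds=5):
    """Keep asking the model to continue until it stops on its own.

    `call_model(messages)` is a placeholder for the real API call;
    it must return a (text, finish_reason) tuple.
    """
    messages = [{"role": "user", "content": prompt}]
    parts = []
    for _ in range(max_rounds):
        text, finish_reason = call_model(messages)
        parts.append(text)
        if finish_reason != "length":
            break  # the model finished the article on its own
        # Send the partial answer back as context, then ask it to go on.
        messages.append({"role": "assistant", "content": text})
        messages.append({"role": "user", "content": "continue"})
    return "".join(parts)
```

The merge step is then just the concatenation at the end; `max_rounds` is a safety cap so a model that keeps hitting the limit cannot loop forever.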

Just a couple of ideas, hope it helps :slight_smile:


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.