Data that changes with each round of the workflow - help me

I have a problem. Every time the workflow runs, the data changes and I end up getting an “undefined” error.

The data that is changing by itself every interval is “chapterTitle” and “prompt”.

Can’t you use a Set node for most of this?

It also looks like you are using the wrong keys in your code: it should, for example, be item.json.chapterTitle and not item.json.chapter_title.
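In the Code node that would look roughly like this (just a sketch; the surrounding logic and the exact set of fields are assumed from what you described):

```js
// Rough sketch of a Code node passing the fields along under the expected keys.
// Field names are assumed from the thread; adjust to match your items.
return items.map(item => ({
  json: {
    chapterTitle: item.json.chapterTitle, // not item.json.chapter_title
    prompt: item.json.prompt,
  },
}));
```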

It’s not the wrong keys; it’s because the “input” keys change every round of the workflow, like I said above! Before it was chapter_title, and then it changed by itself!

n8n does not just change things randomly, especially not switching user keys from camelCase to snake_case; that would require explicitly programmed logic. So whatever generates that incoming data is probably doing it for some reason, and I would look there. For us to be able to help, you would have to share your workflow, ideally one we can run ourselves, so we can see the problem you are having and debug it.

Hi, here is part of the flow, check it out!

Hey @Edgard_Neto,

For me, your OpenAI prompt is not returning JSON, which could be where the workflow falls over. I have updated the prompt and the Code node to set the values you are using from the Set1 node, and it is working as expected… I have run this 10 times with no issues.
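The idea is to ask GPT to reply with a JSON object and then map that reply onto the keys the rest of the workflow expects. A rough sketch of such a Code node (the path to the reply text depends on your OpenAI node’s output, and the field names are assumptions, so adjust to match your data):

```js
// Sketch only: parse the GPT reply as JSON and set the fields used downstream.
// The reply may live under message.content or text depending on the node/mode.
const reply = items[0].json.message?.content ?? items[0].json.text;
const data = JSON.parse(reply); // assumes the prompt asks GPT to answer with a JSON object

return [
  {
    json: {
      chapterTitle: data.chapterTitle,
      prompt: data.prompt,
    },
  },
];
```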


I understand! I’ll send you the complete flow then, for you to check! See below:

Did you try the example I sent to see if it works?


Hi, so far it works! Only the large text does not appear; only the small excerpts show up.

See the images below:

Image of the text posted on the blog:


Hey @Edgard_Neto,

That looks to be because chapterText is not returned by the Code node; it is only using the title and prompt.
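As a rough sketch (assuming the same item structure as before), returning it alongside the other fields would look like this:

```js
// Sketch: also return chapterText so the long text is passed on to the blog post.
return items.map(item => ({
  json: {
    chapterTitle: item.json.chapterTitle,
    prompt: item.json.prompt,
    chapterText: item.json.chapterText, // this field was missing from the return
  },
}));
```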


@Jon One last doubt: the text is generating two “Chapter” headings. Would this be a wrong configuration in the ChatGPT prompt?

Also, a conclusion appears at the end of every subheading.

ChatGPT prompt:

Code 1 data:

Other ChatGPT node:

Workflow:

Hey @Edgard_Neto,

That is probably down to how the data is being put together, or to the response from GPT. You would need to share the full output from the GPT node for chapter 2.