I’m using a Basic LLM node with an attached Structured Parser. I’m getting an error, but I can’t identify which part is failing the validation; the node doesn’t show where the error is. When I check the output, it seems to be compliant with the schema.
Could you share your workflow, or just the “Basic LLM” node (ideally with pinned data, if you can)? This might be related to the model and prompt, but I can’t say for sure!
To give you more info: the error I was seeing was caused by the token limit, so the output from the chat model was being cut off abruptly.
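To illustrate what was happening with that cutoff, here is a minimal sketch in plain TypeScript (this is not n8n’s internal parser, and the sample reply is made up), just to show why a truncated reply fails strict parsing even though the visible text looks almost right:

```typescript
// Minimal sketch (not n8n's actual parser): a reply cut off by the
// token limit is no longer valid JSON, so any strict parse fails.
const fullReply = '{"title": "My report", "summary": "All good"}';
const truncatedReply = fullReply.slice(0, 30); // simulates the token-limit cutoff

try {
  JSON.parse(truncatedReply);
} catch (err) {
  // This is the kind of failure the parser surfaces as a validation
  // error, even though the output "looks" compliant at a glance.
  console.error("Parse failed:", (err as Error).message);
}
```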
But now I’m getting this (the node is not in an error state, but it is returning an empty parsed JSON).
You can see that the input to the parser is plain text (with lots of newlines) and the output is empty. The validation happened before this point and the schema is compliant, so why am I not getting a JSON object?
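For context, here is roughly how I’d reproduce the symptom outside n8n (a rough sketch in plain TypeScript; the sample reply and the `extractJson` helper are just my own illustration, not the node’s real code). A reply padded with newlines or wrapped in a markdown fence is not valid JSON on its own, so a strict parse yields nothing unless the JSON block is extracted first:

```typescript
// Rough sketch (not the node's real code; the sample reply is made up).
const modelReply = '\n\n```json\n{"title": "My report", "items": [1, 2, 3]}\n```\n\n';

function extractJson(text: string): unknown | null {
  // Strip code fences and surrounding whitespace before parsing.
  const cleaned = text.replace(/```(?:json)?/g, "").trim();
  try {
    return JSON.parse(cleaned);
  } catch {
    return null; // mirrors the "empty parsed json" symptom
  }
}

console.log(extractJson(modelReply)); // { title: "My report", items: [1, 2, 3] }
```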