ISSUE
I’m using the Ollama Model node connected to the Information Extraction node.
When I execute, I get two outputs from the Ollama node.
They start with two different inputs based on the “format_instructions” prompt.
For this test case, I haven’t enabled any prompt in the Information Extraction node, and the Output Format option is not enabled in the Ollama node either.
The input of the first output reads:
System: You are an expert extraction algorithm.
Only extract relevant information from the text.
If you do not know the value of an attribute asked to extract, you may omit the attribute's value.
You must format your output as a JSON value that adheres to a given "JSON Schema" instance.
"JSON Schema" is a declarative language that allows you to annotate and validate JSON documents.
For example, the example "JSON Schema" instance {{"properties": {{"foo": {{"description": "a list of test words", "type": "array", "items": {{"type": "string"}}}}}}, "required": ["foo"]}}}}
would match an object with one required property, "foo". The "type" property specifies "foo" must be an "array", and the "description" property semantically describes it as "a list of test words". The items within "foo" must be strings.
Thus, the object {{"foo": ["bar", "baz"]}} is a well-formatted instance of this example "JSON Schema". The object {{"properties": {{"foo": ["bar", "baz"]}}}} is not well-formatted.
Your output will be parsed and type-checked according to the provided schema instance, so make sure all fields in your output match the schema exactly and there are no trailing commas!
Here is the JSON Schema instance your output must adhere to. Include the enclosing markdown codeblock:
```json
{"type":"object","properties":{"comercio":{"type":"string"},"fecha":{"type":"string","format":"date"},"items":...
```
The input of the second output of the Ollama node reads the same, except for the beginning:
Instructions:
--------------
You must format your output as a JSON value that adheres to a given "JSON Schema" instance.
"JSON Schema" is a declarative language that allows you to annotate and validate JSON documents.
...
I’m not sure if I’m missing something here; the process is indeed slow since the LLM is churning through tokens.
Any help is welcome.
By the way, for the Ollama base URL I had to replace localhost with http://127.0.0.1:11434 to get it to work.
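In case it helps, here’s a minimal sketch I used to check which address the Ollama server actually answers on (it assumes the default port 11434 and the standard /api/tags endpoint):

```python
# Minimal sketch: check whether Ollama responds on localhost vs. 127.0.0.1.
# Assumes the default port 11434 and the standard /api/tags endpoint.
import urllib.request
import urllib.error

for base in ("http://localhost:11434", "http://127.0.0.1:11434"):
    try:
        with urllib.request.urlopen(f"{base}/api/tags", timeout=3) as resp:
            print(f"{base}: reachable (HTTP {resp.status})")
    except (urllib.error.URLError, OSError) as exc:
        print(f"{base}: not reachable ({exc})")
```

On my machine only the 127.0.0.1 address responded, which is why I changed the base URL in the node credentials.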
n8n 1.75.2, running locally on Win11.
ollama cli 0.5.7 / lmstudio 0.3.8-2
llama3.2:3b-instruct-q8_0