Describe the problem/error/question
I'm trying to run a template using Ollama LLMs, but I keep hitting this issue.
I'm new to the program and unsure what my options are and how to make the proper changes.
What is the error message (if any)?
It seems the agent cannot keep its connection to the Ollama LLM.
I can see my activity in Ollama, but it gets interrupted later on.
OutputParserException [Error]: Failed to parse. Text: "```json
{{
"action": "Categorize References",
"action_input": {
"references": [
{"title": "Müller (1996)", "category": ["Distributed Artificial Intelligence"]},
{"title": "Norman and Jennings (1997)", "category": ["Multi-Agent Systems"]},
{"title": "Wooldridge (1997)", "category": ["Distributed Artificial Intelligence"]},
{"title": "Mora, et al. (1992)", "category": ["Workflow Management"]},
{"title": "Pruitt (1981)", "category": ["Negotiation and Cooperation"]}
],
"categories": [
{"name": "Distributed Artificial Intelligence"},
{"name": "Multi-Agent Systems"},
{"name": "Workflow Management"},
{"name": "Negotiation and Cooperation"},
{"name": "Agent Architecture and Ontologies"}
]
}
}}
```". Error: SyntaxError: Expected property name or '}' in JSON at position 1 (line 1 column 2)
at ChatConversationalAgentOutputParser.parse (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/langchain/dist/agents/chat_convo/outputParser.cjs:62:19)
at OutputFixingParser.parse (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/langchain/dist/output_parsers/fix.cjs:84:40)
at processTicksAndRejections (node:internal/process/task_queues:105:5)
at AgentExecutor._call (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/langchain/dist/agents/executor.cjs:432:26)
at AgentExecutor.invoke (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/langchain/dist/chains/base.cjs:58:28)
at Object.conversationalAgentExecute (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/agents/ConversationalAgent/execute.ts:105:19)
at Object.execute (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/Agent.node.ts:393:11)
at Workflow.runNode (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/n8n-workflow/src/Workflow.ts:1382:8)
at /home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/n8n-core/src/WorkflowExecute.ts:1167:27
at /home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/n8n-core/src/WorkflowExecute.ts:1887:11 {
llmOutput: undefined,
observation: undefined,
sendToLLM: false
} undefined
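For what it's worth, the parse failure above is reproducible outside n8n: the model wrapped its JSON in doubled braces (`{{ … }}`), which `JSON.parse` rejects at position 1 ("Expected property name or '}'"), exactly as in the log. A minimal sketch of detecting and repairing that (the repair regex is my own workaround, not something LangChain does):

```javascript
// Simplified version of the model output from the log above: the payload
// starts with "{{" instead of "{", so plain JSON.parse fails at position 1.
const raw = '{{ "action": "Categorize References", "action_input": {} }}';

let parsed;
try {
  parsed = JSON.parse(raw);
} catch (e) {
  // Hypothetical repair: strip one level of doubled braces, then retry.
  const repaired = raw.replace(/^\s*\{\{/, '{').replace(/\}\}\s*$/, '}');
  parsed = JSON.parse(repaired);
}
console.log(parsed.action); // "Categorize References"
```

In practice this usually means the model is echoing the prompt template's literal `{{ }}` placeholders; a stronger model, or a system prompt telling it to emit raw JSON with single braces, tends to avoid the problem at the source.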
Warning: TT: undefined function: 21
TypeError: fetch failed
at node:internal/deps/undici/undici:13185:13
at processTicksAndRejections (node:internal/process/task_queues:105:5)
at post (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)
at Ollama.processStreamableRequest (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)
at ChatOllama._streamResponseChunks (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/@langchain/ollama/dist/chat_models.cjs:753:24)
at ChatOllama._generate (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/@langchain/ollama/dist/chat_models.cjs:686:26)
at async Promise.allSettled (index 0)
at ChatOllama._generateUncached (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/@langchain/core/dist/language_models/chat_models.cjs:186:29)
at LLMChain._call (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/langchain/dist/chains/llm_chain.cjs:162:37)
at LLMChain.invoke (/home/ru/.npm/_npx/a8a7eec953f1f314/node_modules/langchain/dist/chains/base.cjs:58:28) {
[cause]: Error: connect ECONNREFUSED 127.0.0.1:11434
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1611:16) {
errno: -111,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '127.0.0.1',
port: 11434
}
} undefined
Error: connect ECONNREFUSED 127.0.0.1:11434
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1611:16) {
errno: -111,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '127.0.0.1',
port: 11434
} undefined
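The `ECONNREFUSED 127.0.0.1:11434` above means nothing is accepting connections on Ollama's default port from wherever n8n is running. Assuming a local npm install on Ubuntu (as in the setup info below), a few quick checks:

```shell
# 1. Is anything listening on Ollama's default port?
ss -ltn | grep 11434 || echo "nothing listening on 11434"

# 2. Ask the Ollama API directly; a healthy server returns a JSON
#    list of installed models.
curl -s --max-time 5 http://127.0.0.1:11434/api/tags \
  || echo "Ollama is not reachable on 127.0.0.1:11434"

# 3. If it is not running, start it. On Ubuntu, Ollama is usually
#    installed as a systemd service:
#      sudo systemctl start ollama
#    or run it in the foreground:
#      ollama serve
```

Note that if n8n ran inside Docker instead, `127.0.0.1` would refer to the container itself, and the base URL in the Ollama credential would need to point at the host (e.g. `http://host.docker.internal:11434`); with npm on the host that shouldn't apply here.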
Please share your workflow
Information on your n8n setup
- n8n version: 1.62.5
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app): npm
- Operating system: Ubuntu