Describe the problem/error/question
Hello, I'm new to n8n and was trying to run both n8n and Ollama locally on my Raspberry Pi 5. I was able to install Ollama manually, but needed to install Docker for n8n to run. From there I was able to get the credentials to load using port 5678 (port 11434 wasn't working), but now it won't load any of my models.
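A quick way to narrow this down (a sketch, assuming Ollama runs directly on the Pi with its default port 11434, and n8n runs in a Docker container) is to check where Ollama is actually reachable from:

```shell
# On the Pi itself: confirm Ollama is up and can list installed models
curl http://localhost:11434/api/tags

# From inside the n8n container, "localhost" refers to the container,
# not the Pi, so the same URL will fail there. The Ollama credential
# in n8n would need the host's address instead, for example:
curl http://host.docker.internal:11434/api/tags

# Note: host.docker.internal only resolves on Linux if the container
# was started with the host-gateway mapping, e.g.:
#   docker run --add-host=host.docker.internal:host-gateway ... docker.n8n.io/n8nio/n8n
# Alternatively, the Pi's LAN IP (e.g. http://192.168.x.x:11434) can be used.
```

Port 5678 is n8n's own web UI port, not Ollama's, which may be why the model list stays empty even though the credential appears to save.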
What is the error message (if any)?
Please share your workflow
(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)

{
  "nodes": [
    {
      "parameters": {
        "modelId": {
          "__rl": true,
          "mode": "list",
          "value": ""
        },
        "messages": {
          "values": [
            {}
          ]
        },
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.ollama",
      "typeVersion": 1,
      "position": [
        208,
        0
      ],
      "id": "e4992bce-01c6-4ba0-8b86-c81259cf1dce",
      "name": "Message a model",
      "credentials": {
        "ollamaApi": {
          "id": "Ybo5jA5nffNvZbT3",
          "name": "Ollama account"
        }
      }
    }
  ],
  "connections": {},
  "pinData": {},
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "ef745addbf293584f98bf2e6407d65c250695aff40221083333e1214abd84ea6"
  }
}
Share the output returned by the last node
Information on your n8n setup
- n8n version:
- Database (default: SQLite):
- n8n EXECUTIONS_PROCESS setting (default: own, main):
- Running n8n via (Docker, npm, n8n cloud, desktop app):
- Operating system:
