Describe the problem/error/question
I'm trying to connect my running LocalAI instance to my running n8n instance, both via Docker. LocalAI itself is up and working fine with my Nextcloud.
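For context, this is roughly how the two containers are laid out (service names, ports and image tags are illustrative, not my exact Compose file). Both sit on the same Docker network, so n8n should be able to reach LocalAI by its service name rather than localhost:

```yaml
# Sketch of the Compose layout (illustrative names/ports, not my exact config)
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n:1.54.4
    ports:
      - "5678:5678"       # n8n UI on the host
    networks:
      - ai-net
  localai:
    image: localai/localai:latest
    ports:
      - "8080:8080"       # LocalAI's default API port
    networks:
      - ai-net
networks:
  ai-net:                  # shared network so n8n can resolve "localai" by name
```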
What is the error message (if any)?
After setting up the credentials and trying different base URLs, the available models are not listed for selection (screenshot 1).
It works locally with Ollama via Docker on my local rig without any issues, but that is Ollama, not local-ai.io.
To prove that my LocalAI instance is working, I attached the working connection, including the list of models, from my Nextcloud settings.
I tried several different base URLs but couldn't make it work.
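Since I'm not sure which base URL n8n expects, this is the kind of sanity check I can run against LocalAI's OpenAI-compatible API from the host (assuming the default port 8080; adjust host/port to your setup):

```bash
# List the models LocalAI exposes via its OpenAI-compatible API
# (default port 8080 assumed).
curl http://localhost:8080/v1/models

# Note: inside the n8n container, "localhost" points at the container itself,
# so the credential's base URL would need the LocalAI service name or host IP,
# e.g. http://localai:8080/v1 (the service name here is illustrative).
```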
Do you know where I can find documentation or get a helping hand? Thanks!
Please share your workflow
(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)
Share the output returned by the last node
See the first screenshot; the models are not listed.
Information on your n8n setup
- n8n version: 1.54.4
- Database (default: SQLite): PostgreSQL 14
- n8n EXECUTIONS_PROCESS setting (default: own, main): own
- Running n8n via (Docker): Docker Compose
- Operating system: Ubuntu 22.04.4 LTS