Updating Ollama Models via the Ollama WebUI

Hi everyone!

I just installed the software and I'm exploring its features; it looks great so far! However, I ran into a little snag. It came with the Llama 3.1 model by default, and I decided to upgrade to the 70B model through the Ollama WebUI admin panel. The panel indicates the download was successful, but the model isn't appearing in my n8n workflow; I can only see the default one. Also, I didn't install Ollama for Windows, since I assumed it would run inside Docker. Any tips or advice would be greatly appreciated!

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

I got it: I used a separate WebUI Docker container. Thank you all!


Hi @Ablaka_Team

Thanks for posting here and welcome to the community! :partying_face:

Glad you figured it out!
We might actually add a port mapping in the Docker Compose file to expose the Ollama port on the host, so that it's more obvious which Ollama instance is being used :wink:
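For anyone hitting the same issue: the port mapping mentioned above might look something like the fragment below. This is only a sketch; the service and volume names (`ollama`) are assumptions, not taken from the actual compose file shipped with the starter kit.

```yaml
# Illustrative Docker Compose fragment (service/volume names are assumptions)
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"   # expose Ollama's API on the host so it's clear which instance n8n talks to
    volumes:
      - ollama:/root/.ollama   # persist downloaded models across container restarts
volumes:
  ollama:
```

With the port exposed, you can point n8n's Ollama credentials at `http://localhost:11434` from the host, or at `http://ollama:11434` from another container on the same compose network.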

Here is the same container with WebUI browser access; if you need to pull another Ollama model, it can be done via the web interface.
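If others want to reproduce this setup, Open WebUI publishes an image that bundles Ollama in the same container. The command below is a sketch based on that image; adjust ports, volumes, and GPU flags for your environment.

```shell
# Sketch: run Open WebUI with a bundled Ollama in a single container
# (image tag per Open WebUI's docs; volume names here are just examples)
docker run -d -p 3000:8080 \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:ollama
```

Then browse to `http://localhost:3000` and pull additional models (e.g. a 70B variant) from the admin panel. Note that this Ollama instance lives inside the container, so n8n will only see its models if n8n is pointed at this container rather than at some other Ollama install.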

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.