Issue connecting with Ollama locally

Describe the problem/error/question

I am unable to connect my local Ollama models to my n8n dashboard using the suggested tricks.
I found great information in this topic (Link) and I think I am on the right track.
However, I am not using the Docker version of n8n; I am using the cloud version and would like to run it locally later. (My processor does not support Docker/containerization, sadly.)

I am getting some response from http://172.17.0.1:11434, but I cannot really manage to connect to it.
Ollama itself is running well, and I tried changing the OLLAMA_HOST and OLLAMA_ORIGINS settings, but saw no benefit.
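
For what it's worth, here is the minimal check I run locally (plain Python, standard library only, as a rough sketch) to confirm Ollama answers on that address; `/api/tags` simply lists the installed models:

```python
import json
from urllib.request import urlopen

# The address I am pointing n8n at; adjust host/port to your setup.
OLLAMA_URL = "http://172.17.0.1:11434"

try:
    # GET /api/tags lists the locally installed models.
    with urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        models = json.load(resp).get("models", [])
        print("Ollama reachable; installed models:", [m["name"] for m in models])
except OSError as exc:  # covers connection refused, timeouts, DNS errors
    print("Could not reach Ollama:", exc)
```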

Information on your n8n setup

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own
  • Running n8n via (Docker, npm, n8n cloud, desktop app): n8n cloud
  • Operating system: Ubuntu 22.04

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

This means that the n8n container is running on n8n’s cloud infrastructure, which does not have access to your local computer.
It might be possible to create a public Cloudflare tunnel to your local Ollama instance, but it would probably be easier to install n8n locally via npm if you can't run the container.
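
Purely as a sketch (the trycloudflare hostname and the model name below are placeholders, not real values): after starting a quick tunnel with something like `cloudflared tunnel --url http://localhost:11434`, you could confirm end-to-end generation works through the public URL before wiring it into the n8n Ollama credentials:

```python
import json
from urllib.request import Request, urlopen

# Hypothetical quick-tunnel URL printed by cloudflared; yours will differ.
BASE_URL = "https://example-tunnel.trycloudflare.com"

# One-off, non-streaming generation request; "llama3" is a placeholder
# for whichever model you have pulled locally.
payload = json.dumps({
    "model": "llama3",
    "prompt": "Say hello in one word.",
    "stream": False,
}).encode()

req = Request(f"{BASE_URL}/api/generate", data=payload,
              headers={"Content-Type": "application/json"})
with urlopen(req, timeout=60) as resp:
    print(json.load(resp)["response"])
```

If that prints a reply, pointing the base URL of your n8n Ollama credential at the tunnel address should work the same way.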

Just out of curiosity, what processor is that?

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.