I am unable to connect my local Ollama models to my n8n dashboard using the suggested tricks.
I found great information in this topic (Link) and I think I am on the right track.
However, I am not using the Docker version of n8n; I am using the cloud version and would like to run it locally later. (My processor does not support Docker/containerization, so sad.)
I am getting some response from http://172.17.0.1:11434, but I can't actually manage to connect to it.
Ollama itself is running fine, and I have tried changing the OLLAMA_HOST and OLLAMA_ORIGINS settings, but it made no difference.
This means that your n8n instance is running on n8n's cloud infrastructure, which has no access to your local computer, so it cannot reach an Ollama server on localhost or on your Docker bridge address.
It might be possible to expose your local Ollama instance through a public Cloudflare tunnel, but since you can't run containers, it might be easier to install n8n locally via npm instead.
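As a rough sketch of both options (assuming you have `cloudflared` and Node.js installed; the generated tunnel hostname is random, not something you choose):

```shell
# Option 1: expose local Ollama via a temporary Cloudflare quick tunnel.
# This prints a public https://<random>.trycloudflare.com URL that you can
# paste as the Ollama base URL in your n8n cloud credentials.
cloudflared tunnel --url http://localhost:11434

# Option 2: run n8n locally without Docker, via npm.
npm install -g n8n
n8n start
# The editor UI is then available at http://localhost:5678,
# and it can reach Ollama directly at http://localhost:11434.
```

Note that quick tunnels are unauthenticated while they are up, so for anything beyond a short test you would want a named tunnel with access controls.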