I’m trying to use the Ollama node with a Llama3 model I’ve already pulled and am serving on the same server as my n8n instance, but when I try to set up the model, none of my models show up.
Are you running either n8n or Ollama in a Docker container? It looks like you are getting an HTML page back rather than something from the API, so my first thought is that one of them is in Docker and you are using localhost to reach the other, which won’t work because localhost inside a container refers to the container itself, not the host.
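If that’s the case, pointing the credential’s base URL at the host instead of localhost usually fixes it. Here’s a minimal sketch you could run from inside the n8n container (e.g. with `npx tsx check-ollama.ts`) to see what the node is actually getting back. It assumes Ollama’s default port 11434 and its `/api/tags` endpoint (which lists pulled models), and it uses `host.docker.internal`, which resolves to the host on Docker Desktop; on Linux you’d start the container with `--add-host=host.docker.internal:host-gateway` or use the host’s IP instead:

```ts
// Connectivity check against the Ollama API from inside a container.
// Requires Node 18+ for the built-in fetch.
// host.docker.internal is an assumption: swap in whatever address
// actually reaches your Ollama host from inside the container.
const baseUrl = process.env.OLLAMA_BASE_URL ?? "http://host.docker.internal:11434";

async function listModels(): Promise<void> {
  // GET /api/tags is Ollama's "list local models" endpoint
  const res = await fetch(`${baseUrl}/api/tags`);
  const contentType = res.headers.get("content-type") ?? "";

  // If HTML comes back, you're hitting a web server (often n8n itself
  // via localhost), not the Ollama API.
  if (!contentType.includes("application/json")) {
    const body = await res.text();
    throw new Error(`Expected JSON, got ${contentType}: ${body.slice(0, 120)}`);
  }

  const data = (await res.json()) as { models: { name: string }[] };
  console.log("Models Ollama reports:", data.models.map((m) => m.name));
}

listModels().catch((err) => console.error("Could not reach Ollama:", err));
```

If that prints your models, put the same base URL into the Ollama credentials in n8n in place of `http://localhost:11434` and the model dropdown should populate.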