Ollama LLM node - running successfully

Describe the problem/error/question

I'm running Llama2 via Ollama.ai and Docker.
Does anyone have it up and running inside the LangChain integration?

localhost:11434 is accessible. I'm running everything on my notebook, but it doesn't connect.



Hey @Kool_Baudrillard,

Are you using the correct base URL for your Ollama instance? Don't forget that localhost will be local to the container, not your host, so if you are running n8n in Docker as well, chances are your URL wouldn't be http://localhost:11434 and could instead be a different address, if that is what the container is configured to use.
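A common way to handle this is to point the containerized n8n at the host's Ollama via `host.docker.internal`. This is a sketch, assuming n8n runs in Docker and Ollama runs directly on the host; the container name and ports are illustrative:

```shell
# On Docker Desktop (Mac/Windows), host.docker.internal resolves to the host,
# so the Ollama base URL inside n8n would be http://host.docker.internal:11434
docker run -it --rm --name n8n -p 5678:5678 n8nio/n8n

# On Linux, the host-gateway mapping has to be added explicitly:
docker run -it --rm --name n8n \
  --add-host=host.docker.internal:host-gateway \
  -p 5678:5678 n8nio/n8n
```

With either variant, the Ollama credential in n8n would use `http://host.docker.internal:11434` instead of `http://localhost:11434`.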

Hi, neither localhost nor 127… works.

But I'm able to curl -X POST and get a response.
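For reference, a quick check like this against the Ollama HTTP API (the `llama2` model name is taken from the original post) confirms the server itself is responding; if this works from the host but the n8n node still fails, the problem is almost certainly the address n8n uses from inside its container, not Ollama:

```shell
# Minimal request to Ollama's generate endpoint (assumes it listens on 11434).
curl -X POST http://localhost:11434/api/generate \
  -H "Content-Type: application/json" \
  -d '{"model": "llama2", "prompt": "Say hello", "stream": false}'
```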

Hey @Kool_Baudrillard,

If you are running both n8n and Ollama in Docker containers, you will need to use the Docker network address for the container. This will be different from what your host machine uses, but depending on the command you used for the Ollama container, you might be able to use the IP of your host and connect that way.
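One way to set this up is to put both containers on a user-defined Docker network so they can reach each other by name. A sketch, with container and network names that are assumptions, not from the thread:

```shell
# Create a shared network and start both containers on it.
docker network create ollama-net
docker run -d --name ollama --network ollama-net -p 11434:11434 ollama/ollama
docker run -d --name n8n --network ollama-net -p 5678:5678 n8nio/n8n

# The base URL inside n8n would then be http://ollama:11434

# To look up a container's IP on its networks instead:
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ollama
```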

I added these to my ollama.service file:


The origin being my n8n instance.
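The exact lines added weren't shared, but the environment variables typically set in an ollama.service override for this kind of origin/binding issue look like the following. The values here are illustrative, not the poster's actual settings:

```shell
# Sketch: edit the systemd unit override (sudo systemctl edit ollama.service)
# and add environment variables like these under [Service]:
#
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"
#   Environment="OLLAMA_ORIGINS=*"
#
# Then reload and restart so Ollama picks them up:
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

`OLLAMA_HOST` controls which address Ollama binds to, and `OLLAMA_ORIGINS` controls which origins are allowed to call the API.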


Ok, weird.

I'm running it on a Mac with the Ollama.ai app.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.