Ollama credentials

Describe the problem/error/question

I have the same problem as described in the topic "Ollama llm node - successful running".

What is the error message (if any)?

I get an error that there are no credentials for Ollama, but I don't see any way to create them.

Please share your workflow

{
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "7c910e7c72e40eaf2e824ae7f52bbfc4aac0f28f8c729f61d1c665e3d81377fb"
  },
  "nodes": [
    {
      "parameters": {
        "prompt": "={{ $json.job }}",
        "messages": {
          "messageValues": [
            {
              "message": "which jobs are best for me?"
            }
          ]
        }
      },
      "id": "f6c1675b-0008-4fac-8532-2eca2108519c",
      "name": "Basic LLM Chain",
      "type": "@n8n/n8n-nodes-langchain.chainLlm",
      "typeVersion": 1.3,
      "position": [
        1640,
        380
      ]
    },
    {
      "parameters": {
        "options": {}
      },
      "id": "63fa42ca-fd49-4519-91ab-13d143646f97",
      "name": "Ollama Model",
      "type": "@n8n/n8n-nodes-langchain.lmOllama",
      "typeVersion": 1,
      "position": [
        1640,
        600
      ]
    },
    {
      "parameters": {
        "pollTimes": {
          "item": [
            {
              "mode": "everyHour",
              "minute": 7
            }
          ]
        },
        "filters": {
          "labelIds": [
            "Label_7059639856305040568"
          ],
          "q": ""
        }
      },
      "id": "6d47a154-d501-4752-8386-bd428e09a1dc",
      "name": "Gmail Trigger",
      "type": "n8n-nodes-base.gmailTrigger",
      "typeVersion": 1,
      "position": [
        760,
        380
      ],
      "credentials": {
        "gmailOAuth2": {
          "id": "Xk1QKZtCuSfuqu1v",
        }
      }
    },
    {
      "parameters": {
        "operation": "get",
        "messageId": "={{ $json.id }}",
        "simple": false,
        "options": {}
      },
      "id": "40db2252-d663-49d3-8c91-d710f733b66b",
      "name": "Gmail",
      "type": "n8n-nodes-base.gmail",
      "typeVersion": 2.1,
      "position": [
        980,
        380
      ],
      "credentials": {
        "gmailOAuth2": {
          "id": "Xk1QKZtCuSfuqu1v",

    ],
    "Gmail": [
      {
        "id": "18d5149312709fa3",
        "threadId": "18d5149312709fa3",
        "labelIds": [
          "UNREAD",
          "IMPORTANT",
          "Label_7059639856305040568",
          "Label_4735651768623896433",
          "CATEGORY_UPDATES",
          "INBOX"
        ],
        "sizeEstimate": 105312,
        "headers": {
          KQK3WWago45zzBaOFJi6_VYNMbT-MVL5qd0wDEiAJZF0-Q67HhXI0CDeyldbxVoUNtHpplayyWT9uXfmSABoQzbofAgvD516gTpj8K5Mvkg%3D%3D&co=US&hl=en&tmtk=1hl8ki6k6gplv806&subId=64760c8e26ad8c18bcebfeb7&rgtk=1hl8ki862lrqr803&from=ja</a></p>"
      }
    ]
  }
}

Share the output returned by the last node

Information on your n8n setup

  • n8n version: 1.27.2
  • Database (default: SQLite): default
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker/easypanel
  • Operating system: Ubuntu Digital Ocean droplet

Here’s a video (no sound):

Hey @Josh_Fialkoff,

The video isn't loading. Can you share a screenshot? When I open the node, I see an option to add a credential, which asks for the Ollama host. Were you expecting more than that?

The video shows the Ollama URL of localhost:11434 and a browser resolving that URL with a status message that Ollama is running.
But despite that, the node doesn't complete successfully.

Hey @Josh_Fialkoff,

Are you running Ollama in the same Docker container as n8n? When you use localhost, it resolves to the localhost address of whatever is making the request: localhost inside an n8n container is that container's own localhost, while localhost on your machine is your machine.

You would need to make sure that Ollama is configured to listen on an IP address that n8n can reach, then use that address to connect.
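For reference, one way to do that (a sketch assuming a default install on the default port 11434) is to start the Ollama server with the OLLAMA_HOST environment variable set so it listens on all interfaces instead of only 127.0.0.1, then confirm it is reachable from wherever n8n runs; the <ollama-host-ip> placeholder below is whatever address your Ollama machine has:

  # Bind the Ollama API to all interfaces (assumes the default port 11434)
  OLLAMA_HOST=0.0.0.0 ollama serve

  # From the machine or container running n8n, check that the API answers
  curl http://<ollama-host-ip>:11434/api/tags

The base URL in the n8n Ollama credential would then be http://<ollama-host-ip>:11434.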

Thanks for your help, @Jon!

Ollama is run locally on my Mac.

n8n is running on a Digital Ocean Ubuntu droplet via EasyPanel.

So, the problem is that Ollama needs to be on a static IP?

I am working on getting Ollama to listen on a specific IP, @Jon. Thanks again!

Hi, no, I had Ollama and n8n running locally on my Mac in Docker.

I use https://orbstack.dev for running containers, so it took some tweaking to get them both communicating. It turned out that running Ollama via their own desktop app and calling its API worked better.
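For anyone with the same local setup, a sketch of what usually works (assuming Docker Desktop or OrbStack defaults, where containers can reach the host via host.docker.internal) is to point the n8n Ollama credential at the host rather than at localhost:

  # Base URL for the n8n Ollama credential when Ollama runs natively on the Mac
  # and n8n runs in a container on that same Mac
  http://host.docker.internal:11434

since localhost inside the n8n container refers to the container itself, not the Mac.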

Hey @Josh_Fialkoff,

Running it like that will introduce more issues, as the Digital Ocean container likely won't be able to access your Mac.

In your case, you would need to open the port or use something like ngrok to expose your Ollama service; then you will be able to connect to it. If you had n8n and Ollama on the same machine, even in containers, it takes a little work due to how container networking and the default Ollama settings behave, but it isn't that bad.
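For example, a quick sketch of the ngrok route (assuming Ollama is on its default port) would be something like:

  # Create a public tunnel to the local Ollama API on port 11434
  ngrok http 11434

and then use the forwarding URL ngrok prints as the base URL in your Ollama credential in n8n.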