How I Solved the "Forbidden" Error When Integrating n8n Cloud with Local Ollama Using an Intermediary Proxy

Hi everyone,

I wanted to share a solution that helped me resolve a “Forbidden” error when trying to connect the native Ollama Language Model node in n8n (cloud-hosted) to my locally running Ollama instance.

The Problem

My local Ollama instance (listening on port 11434) was rejecting requests from n8n Cloud with a 403 Forbidden error. Ollama validates the Origin header of incoming requests, and since n8n Cloud doesn’t automatically send the header Ollama expects, every request was refused.
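
For context, you can reproduce this kind of rejection outside of n8n. Below is a minimal sketch (assuming Node 18+ for the built-in fetch, saved as an .mjs file so top-level await works; the Origin value is just a made-up example):

const base = 'http://localhost:11434';

// A plain localhost request is accepted on a default Ollama install.
const plain = await fetch(`${base}/api/tags`);
console.log('no Origin header:', plain.status); // expect 200

// The same request with an Origin that Ollama doesn't allow gets rejected.
const rejected = await fetch(`${base}/api/tags`, {
  headers: { Origin: 'https://some-unrelated-site.example' },
});
console.log('disallowed Origin:', rejected.status); // expect 403 Forbidden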

The Solution: Using an Intermediary Proxy

I created an intermediary proxy using Node.js, Express, and the http-proxy-middleware package. This proxy intercepts requests from n8n, injects the correct Origin header, and forwards the requests to my local Ollama instance.

Proxy Configuration Code

Below is the complete code that worked for me:

const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

const proxyOptions = {
  target: 'http://localhost:11434', // Target: local Ollama instance
  changeOrigin: true,
  onProxyReq: (proxyReq, req, res) => {
    // Inject the Origin header required by Ollama
    proxyReq.setHeader('Origin', 'https://bushido-academy.up.railway.app');
  },
  logLevel: 'debug'
};

// Use '/' so that all requests are forwarded directly to the target
app.use('/', createProxyMiddleware(proxyOptions));

const PORT = process.env.PORT || 3001;
app.listen(PORT, () => {
  console.log(`Proxy running on port ${PORT}. Forwarding to http://localhost:11434`);
});
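
To try it yourself, install the two dependencies and start the script (I'm calling the file proxy.js purely for illustration):

npm install express http-proxy-middleware
node proxy.js

One version note: the options above follow the http-proxy-middleware v2 API. If you end up on v3, the request hooks moved under an on object (on: { proxyReq: ... }) and logLevel was replaced by a logger option, so the configuration needs small adjustments.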

How It Works

  1. Local Ollama Instance:
    My Ollama service runs locally on port 11434.

  2. The Proxy:
    The proxy runs on another port (in my case, 3001). When a request is sent to the proxy, it injects the Origin header with the value https://bushido-academy.up.railway.app (which is the URL used by my n8n Cloud instance) before forwarding the request to Ollama.

  3. Exposing the Proxy Publicly:
    I then used ngrok to expose the proxy publicly. For example, running:

    ngrok http 3001
    

    provided a public URL like https://552f-2804-14c-10d-802e-7059-bf93-c75f-a90e.ngrok-free.app.

  4. Configuring n8n:
    In n8n, I set the Base URL in the Ollama credential used by the Language Model node to the public URL of the proxy (e.g., https://552f-2804-14c-10d-802e-7059-bf93-c75f-a90e.ngrok-free.app). This way, when n8n sends a request, it goes to the proxy, which adds the proper header and forwards it on to my local Ollama instance. A quick end-to-end check is sketched below.
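
Once everything is wired up, you can confirm the whole chain (ngrok → proxy → Ollama) before pointing n8n at it. A minimal sketch, again assuming Node 18+ and run as an .mjs file; substitute your own ngrok URL for the placeholder:

// The request travels: here -> ngrok -> local proxy (Origin injected) -> Ollama.
const proxyUrl = 'https://YOUR-SUBDOMAIN.ngrok-free.app'; // replace with your URL

const res = await fetch(`${proxyUrl}/api/tags`);
console.log(res.status); // expect 200 once the proxy injects the header
console.log(await res.json()); // lists the models installed locally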

The Outcome

After configuring the proxy and updating my n8n credential, the “Forbidden” error was resolved, and my workflows now successfully interact with the Ollama API.

I hope this solution helps anyone facing a similar issue! If you have any questions or suggestions, feel free to comment.

Happy automating!
