Tunnel a Local n8n Instance for Remote API Requests & Webhooks

If you’re running n8n locally, things usually work great until you need to trigger workflows from external services via API requests or webhooks.

Whether it’s Stripe, GitHub, internal tools, or a SaaS that only communicates via API or webhook requests, the problem is always the same:

  • You are experimenting with n8n on your laptop or desktop

  • External services can’t reach it

  • Exposing your local n8n setup normally requires a full deployment

This post shows a simple, dev-friendly way to expose your local n8n instance using Inspectr, with full visibility into incoming API & webhook traffic.


The Common Local n8n Webhook Problem

In a typical local n8n setup:

  • n8n runs on http://localhost:5678

  • Webhook nodes expect a public URL

  • External services retry silently or fail without useful errors

When something goes wrong, it’s often unclear:

  • Did the API request/webhook arrive?

  • Were headers correct?

  • Did the payload match what n8n expects?

Inspectr sits in front of n8n, so you can see exactly what’s happening before the request hits your workflow.


What Is Inspectr?

Inspectr is an open-source, lightweight API and webhook inspection and tunneling tool.

It allows you to:

  • Expose a public HTTPS endpoint for your local n8n

  • Run with a single command; no complex setup or account registration

  • Forward requests to localhost

  • Inspect headers, payloads, and responses in real time

  • Debug webhook issues without changing your n8n setup

It’s especially useful during local development and testing.


How It Works

Flow:

  1. External service sends a webhook

  2. Inspectr receives it via a public URL

  3. Inspectr forwards it to your local n8n webhook

  4. You see the full request & response instantly


Step 1: Run n8n Locally (Standard Setup)

If you already run n8n locally, you can skip this.

Using npx:

npx n8n

Or using Docker:

docker run -it --rm \
  -p 5678:5678 \
  -e N8N_HOST=localhost \
  -e N8N_PORT=5678 \
  -e N8N_PROTOCOL=http \
  -e WEBHOOK_TUNNEL_URL=http://localhost:8080 \
  n8nio/n8n
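
If you prefer Docker Compose, the docker run flags above translate one-to-one. This is a sketch mirroring that command; save it as docker-compose.yml and run `docker compose up`:

```yaml
services:
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    environment:
      - N8N_HOST=localhost
      - N8N_PORT=5678
      - N8N_PROTOCOL=http
      - WEBHOOK_TUNNEL_URL=http://localhost:8080
```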

Your n8n editor and workflows are now available at:

http://localhost:5678

Step 2: Start Inspectr and Expose Your Local n8n

Run Inspectr and point it to your local n8n instance:

npx @inspectr/inspectr --backend=http://localhost:5678 --expose

Or, with a preferred channel name (which determines the public URL):

npx @inspectr/inspectr --backend=http://localhost:5678 --expose --channel=n8n-demo --channel-code=n8ndemo123

Inspectr will output a public HTTPS URL, for example:

https://n8n-demo.in-spectr.dev

This URL is now reachable from the internet.


Step 3: Use the Inspectr URL in Your n8n Webhook Node

In your n8n workflow:

  • Create or open a Webhook node

  • Use the Inspectr URL as the base URL

Example:

https://n8n-demo.in-spectr.dev/webhook-test/inspectr

When the public endpoint receives an API or webhook request, Inspectr transparently forwards it to:

http://localhost:5678/webhook-test/inspectr

No changes needed inside n8n.
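
Since the mapping preserves the path, you can assemble the public webhook URL ahead of time from the channel name and the Webhook node’s path. A small sketch using the values from this post:

```shell
# Assemble the public webhook URL from the Inspectr channel name.
# CHANNEL matches the --channel flag; WEBHOOK_PATH is whatever your Webhook node uses.
CHANNEL="n8n-demo"
WEBHOOK_PATH="webhook-test/inspectr"
PUBLIC_URL="https://${CHANNEL}.in-spectr.dev/${WEBHOOK_PATH}"
echo "$PUBLIC_URL"
```

Share this URL with the external service, and every delivery will show up in Inspectr before reaching n8n.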


Optional: Inspect Webhook Traffic in Real Time

When a webhook is triggered, Inspectr shows:

  • Request headers

  • Raw JSON body

  • HTTP method & path

  • Response status & timing

This makes it much easier to:

  • Debug payload mismatches

  • Verify authentication headers

  • Understand retries or failures

You’re no longer guessing what reached n8n.
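
When you want to reproduce a delivery yourself, replaying it with curl is often enough. The sketch below posts a test payload to a throwaway local stub standing in for n8n (assumptions: port 5678 is free and python3 is available); in real use you would target the public Inspectr URL instead of localhost:

```shell
# Throwaway local stub standing in for n8n's webhook endpoint.
# Assumptions: port 5678 is free and python3 is available.
python3 - <<'EOF' &
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and discard the body, then acknowledge receipt.
        self.rfile.read(int(self.headers.get('Content-Length', 0)))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'received')
    def log_message(self, *args):
        pass  # keep the output clean

# Serve exactly one request, then exit.
HTTPServer(('127.0.0.1', 5678), Handler).handle_request()
EOF
sleep 1

# Replay a webhook delivery by hand; against the real setup you would use
# the public Inspectr URL (https://n8n-demo.in-spectr.dev/...) here.
RESPONSE=$(curl -s -X POST http://localhost:5678/webhook-test/inspectr \
  -H 'Content-Type: application/json' \
  -d '{"event":"test"}')
echo "$RESPONSE"
```

Sending the same request through the Inspectr URL lets you compare what the external service sends against your hand-crafted payload.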


Why Use Inspectr for n8n Webhooks?

Compared to generic tunnels, Inspectr gets a tunnel up and running with a single command, entirely from your machine, and adds:

  • Built-in request inspection

  • Designed for webhook payloads

  • No account required

  • Minimal setup

  • Works well with retries & replays

It’s not meant for production ingress or as a replacement for n8n Cloud; it’s just there to make local development less painful.


Summary

If you’re running n8n locally and dealing with API/webhook workflows, Inspectr gives you:

  • A public endpoint in seconds

  • Full visibility into incoming requests

  • Zero changes to your existing workflows

Docs & example:
https://inspectr.dev/docs/examples/expose-n8n-workflow/

If you’re already using a different setup (Cloudflare Tunnel, a reverse proxy, etc.), I’d be curious to hear what works best for you and where it falls short.

Hopefully, this was helpful.

I use a service (I won’t mention it, but users can figure it out from the UI)… which has some “limits”.

Is your product similar?

P.S. Since I am limited, I use localhost to edit my workflow, and when I test I switch to the URL provided by the tunnel (so I don’t hit the limits)…

Cheers!

Since the traffic is just routed to your local instance, there are no request or traffic limits.

You can just give it a spin; it takes just one command and no registration.

I use Tailscale Funnel since it makes the instance public, and you can also set it to only work with devices you’re signed into Tailscale on, but I just use Funnel since it’s easier. I was able to showcase my n8n AI email workflow to my class a few weeks back since it was public.