I'm a first-timer trying n8n and Ollama using the self-hosted AI starter kit. I'm not sure whether my n8n instance can call webhooks in general, or the Ollama API endpoint specifically. When I call the internally hosted Ollama API endpoint, the request fails or times out.
My only guess is that it's related to our VPN (Tailscale), which we use to connect to our on-premise server: the API endpoints are only accessible from whitelisted machines. But even when I call from a whitelisted machine, I'm not sure the endpoint can tell who is calling at the network level.
Any ideas or thoughts on how to proceed?
I wonder if it's blocking access on certain ports. Can you see the logs in the browser dev console (Chrome) when making the requests? Are there any more details you can share?
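One way to narrow down whether the timeout is a DNS problem, a closed port, or a firewall/VPN ACL silently dropping packets is a small TCP probe. This is just a sketch: the host name `ollama` and port `11434` are the starter kit's usual defaults, so adjust them to match your actual compose service name or Tailscale address, and run it from the machine (or container) that n8n runs on.

```python
import socket

def probe(host: str, port: int, timeout: float = 3.0) -> str:
    """Return a coarse diagnosis of TCP reachability for host:port."""
    try:
        # create_connection handles name resolution and IPv4/IPv6 for us
        with socket.create_connection((host, port), timeout=timeout):
            return "reachable"          # port open; next, check the HTTP layer
    except socket.gaierror:
        return "dns-failure"            # name doesn't resolve from here
    except socket.timeout:
        return "timeout"                # typical of a firewall/VPN ACL dropping traffic
    except OSError:
        return "refused-or-unreachable" # host answered but nothing is listening

# Defaults assumed from the starter kit; change to your setup.
print(probe("ollama", 11434))
```

If this prints `timeout`, the request is most likely being dropped at the network level (which fits the Tailscale-whitelist theory); `dns-failure` suggests the service name isn't resolvable from where n8n runs; `reachable` means the network path is fine and the problem is higher up (auth, wrong path, etc.).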