Help with request logs for AI model

Hi,

Is there a way to log the HTTP requests sent to my local Ollama model?

Hello @Orionpax,

You can run a reverse proxy (Traefik, nginx, etc.) between n8n and Ollama to log all activity.
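As a rough sketch of that approach, here is a minimal nginx reverse-proxy config that logs each request body sent to Ollama. It assumes Ollama is on its default port 11434 and picks 11435 and the log path arbitrarily; adjust these to your setup, then point n8n's Ollama base URL at the proxy port.

```nginx
# Hypothetical example: nginx in front of a local Ollama instance.
# Ports and log path are assumptions, not n8n/Ollama requirements.
log_format ollama_log '$remote_addr [$time_local] "$request" '
                      'status=$status body="$request_body"';

server {
    listen 11435;                       # n8n talks to this port instead of 11434
    access_log /var/log/nginx/ollama.log ollama_log;

    location / {
        proxy_pass http://127.0.0.1:11434;  # forward to Ollama
        proxy_http_version 1.1;             # Ollama streams responses
        proxy_buffering off;                # pass streamed tokens through unbuffered
    }
}
```

Note that `$request_body` is only populated once nginx has read the body for `proxy_pass`, which is the case here; for very large prompts you may also want to raise `client_max_body_size`.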

1 Like

@Orionpax Alternatively, call the model through an HTTP Request node in your workflow and add a logging step to capture all requests and responses automatically.

2 Likes

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.