I’m having some trouble using OpenRouter as a chat model option for the AI chains, since there isn’t a node specifically for OpenRouter, and when I try using an HTTP Request node I can never seem to get the auth header to register correctly. Anyone have any ideas on how to fix this?
Could you share your workflow (by pasting in the workflow JSON) or screenshots, especially around how you’re sending the request with the HTTP Request node?
Come on, guys! OpenRouter is 100% compatible with the OpenAI REST API structure.
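For reference, here’s a minimal sketch of what that compatible request looks like in plain Python (stdlib only). The key name and model are placeholders; the main point is the `Authorization: Bearer <key>` header, which is usually what fails to register when it’s misspelled or missing the `Bearer ` prefix:

```python
import json
import urllib.request

# Hypothetical key for illustration -- replace with your real OpenRouter key.
OPENROUTER_API_KEY = "sk-or-..."

# OpenRouter mirrors the OpenAI chat-completions body, so the same payload
# works -- only the base URL and the API key change.
payload = {
    "model": "openai/gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    "https://openrouter.ai/api/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        # The auth header must be exactly "Authorization: Bearer <key>".
        "Authorization": f"Bearer {OPENROUTER_API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would actually send it; omitted here so the
# snippet runs without a real key.
print(req.get_header("Authorization"))
```

In the HTTP Request node, that translates to a header named `Authorization` with the value `Bearer <your key>`.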
I don’t want to use the HTTP Request node, as I lose the streaming response functionality.
A very simple option is to implement this in the OpenAI credentials: the base URL points to OpenAI by default, but if we could set it to another endpoint, we could load OpenRouter models.
@TheKore thanks for your collaboration and your PR
Personally, I’m just amazed by OpenRouter:
they offer multiple LLMs,
and they even let you filter models by their data collection policies.