The idea is:
Add a node for models hosted on WatsonX.AI so they can be used in LLM chains and agents.
My use case:
We host our custom models on WatsonX.AI and want to use them for inference within n8n. Since they do not fully conform to the OpenAI API spec, simply changing the base_url of the OpenAI node is not an option.
I think it would be beneficial to add this because:
IBM provides data-zone-specific LLMs in the EU, which is valuable for customers located there.
Any resources to support this?
https://ibm.github.io/watsonx-ai-python-sdk/fm_model_inference.html
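To give an idea of what such a node would wrap, here is a minimal sketch of a text-generation call using the linked ibm-watsonx-ai Python SDK. The endpoint URL, model ID, and project ID are illustrative placeholders; exact parameter names should be verified against the ModelInference documentation above.

```python
# Minimal sketch of an inference call with the ibm-watsonx-ai SDK
# (values below are illustrative placeholders, not real credentials).
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

credentials = Credentials(
    url="https://eu-de.ml.cloud.ibm.com",  # example EU data-zone endpoint
    api_key="<IBM_CLOUD_API_KEY>",
)

model = ModelInference(
    model_id="<your-custom-or-foundation-model-id>",
    credentials=credentials,
    project_id="<WATSONX_PROJECT_ID>",
)

# Single text-generation request; an n8n node would expose the prompt and
# generation parameters (max tokens, temperature, ...) as node fields.
response = model.generate_text(
    prompt="Summarize the following support ticket: ...",
    params={"max_new_tokens": 200},
)
print(response)
```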
Are you willing to work on this?
We don’t have the development skills for this.