Integrating oTTODev with MaxAI.me for Prompt Interaction

I currently have an unlimited plan with the MaxAI.me platform, which allows me to use different AI models from various providers. However, this plan only permits access to the MaxAI.me chat interface and does not allow direct API usage.

At the same time, I am using oTTODev, a fork of Bolt.new, which supports various integrations, such as local LLMs, LLMs accessed via APIs, and other features.

What I want to achieve is the following workflow:

  1. When I send a prompt from oTTODev, the prompt should be sent to MaxAI.me chat.
  2. MaxAI.me chat should process the prompt and generate a response.
  3. The response from MaxAI.me chat should then be sent back to oTTODev, where it will be displayed as the output.

I am wondering if this workflow can be implemented using n8n (a workflow automation tool). If so, how can I set it up to achieve this functionality?
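
To make it concrete, this is roughly what I imagine on the oTTODev side: the prompt gets POSTed to an n8n webhook, and whatever the workflow returns is shown as the output. The webhook URL, payload shape, and `reply` field below are placeholders I made up; the MaxAI.me part of the workflow is exactly the piece I don't know how to do.

```ts
// Rough sketch only: the webhook URL, request payload, and "reply" field
// are placeholders, not part of oTTODev's or n8n's actual API.
const N8N_WEBHOOK_URL = "https://my-n8n.example.com/webhook/ottodev-prompt";

interface RelayResponse {
  reply: string; // assumed field name returned by the n8n workflow
}

export async function relayPrompt(prompt: string): Promise<string> {
  const res = await fetch(N8N_WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok) {
    throw new Error(`n8n webhook returned HTTP ${res.status}`);
  }
  const data = (await res.json()) as RelayResponse;
  return data.reply; // displayed in oTTODev as the model output
}
```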

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @J11

Sorry for the late response here, and welcome to the community! I’m not very familiar with oTTODev and how it operates, but if you’re able to capture the prompt and send it to n8n, you should be able to communicate with MaxAI’s chat.

However, if they don’t allow API access, that suggests they don’t want this kind of automated use of their chat, so you might be breaking their Terms of Service by doing so.

I suggest asking their support whether, and how, you can operate this way.
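
If their support does give you the green light, the general pattern in n8n would be a Webhook node that receives the prompt, something in the middle that talks to MaxAI, and a Respond to Webhook node that sends the answer back to oTTODev. Below is a minimal sketch of a Code node sitting in the middle (the Code node takes plain JavaScript); the field names are assumptions, and the MaxAI step is deliberately left as a placeholder, since there is no supported API to call on a chat-only plan.

```js
// Sketch of a Code node between a Webhook node and a Respond to Webhook node.
// Field names are assumptions; the MaxAI step is left as a placeholder because
// there is no supported API to call on a chat-only plan.
const items = $input.all();

// Depending on the Webhook node settings, the body may be nested under "body".
const prompt = items[0].json.body?.prompt ?? items[0].json.prompt;

// Placeholder: this is where the MaxAI step would go, if one were allowed.
const reply = `No MaxAI call wired up yet. Prompt received: ${prompt}`;

return [{ json: { reply } }];
```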

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.