How can I share my ai agent workflow as OpenAI-compatible API?

Hello, @Vincentdu-cn !

Can you please specify your question?

Hi, you can use the OpenAI node and change your base URL in the credential settings.

Hello @Vincentdu-cn,

Could you please provide more details about what you mean exactly by “share AI agent workflow”?

Thanks for the reply, but you misunderstood my question. LOL

Thanks for the reply. By “share AI agent workflow” I mean being able to access n8n’s AI Agent capabilities through an OpenAI-compatible API from third-party AI software or frameworks. That way I can tweak the AI Agent’s tools and MCP servers inside n8n workflows to give the AI all kinds of capabilities, without my front-end software needing to support tools or MCP itself.

I would also like to know this.
Just to reiterate, the question is: how can we expose an OpenAI-compatible endpoint that acts as the input to an n8n workflow and returns its output?
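Concretely, this is what it would look like from the client side, as a sketch rather than anything n8n ships today: the third-party app keeps using the standard OpenAI SDK and only swaps the base URL. The host, webhook path, and model name below are hypothetical.

```typescript
// Hypothetical client-side call: nothing here is n8n-specific except the URL.
// The OpenAI SDK appends /chat/completions to baseURL, so the matching n8n
// webhook path would have to be .../openai-agent/v1/chat/completions.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://your-n8n-host/webhook/openai-agent/v1", // hypothetical webhook base path
  apiKey: "any-token-your-workflow-validates", // sent as a Bearer token; check it in the workflow if you need auth
});

const completion = await client.chat.completions.create({
  model: "n8n-agent", // the workflow can ignore this or use it for routing
  messages: [{ role: "user", content: "What tools can you use?" }],
});

console.log(completion.choices[0].message.content);
```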

Just to bump this, I would like to know too. Putting Open WebUI in front as a proxy feels wastefully complicated.

To share your AI agent workflow as an OpenAI-compatible API, you can expose the workflow with a Webhook node and format the input and output to match OpenAI’s API structure: accept a JSON body with a `messages` array (like OpenAI’s Chat Completions API) and return a response in the `choices` format. Then deploy the workflow to a public URL (self-hosted n8n behind a reverse proxy, or n8n Cloud) and document the endpoint so users can call it just as they would the OpenAI API.
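To make the response-shaping step concrete, here is a minimal sketch of a Code node that could sit between the AI Agent node and a Respond to Webhook node (with the Webhook node set to respond via the Respond to Webhook node). The output field name, model name, and wiring are assumptions, so adjust them to your workflow.

```typescript
// n8n Code node, "Run Once for All Items" mode.
// Upstream: Webhook node (POST) -> AI Agent node.
// Downstream: Respond to Webhook node returning this JSON as the HTTP body.
const agentReply = $input.first().json.output ?? ""; // AI Agent output field; the name may differ in your setup

return [
  {
    json: {
      id: `chatcmpl-${Date.now()}`,
      object: "chat.completion",
      created: Math.floor(Date.now() / 1000),
      model: "n8n-agent", // echo whatever model name your clients expect
      choices: [
        {
          index: 0,
          message: { role: "assistant", content: agentReply },
          finish_reason: "stop",
        },
      ],
      // Real token counts aren't available here, so report zeros unless you compute them.
      usage: { prompt_tokens: 0, completion_tokens: 0, total_tokens: 0 },
    },
  },
];
```

On the input side, the incoming `messages` array typically arrives under `body.messages` on the Webhook node’s output and can be mapped into the Agent node’s prompt. Note that this sketch only covers non-streaming responses; supporting `stream: true` (server-sent events) would need extra work.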