Hi everyone,
I’m working on launching a new multi-modal AI Assistants SaaS service and I’m exploring n8n as the workflow engine behind it. My goal is to build a generic workflow that each of my SaaS users can use to automate various tasks (e.g., connecting to APIs, orchestrating AI models, etc.).
Initially, I’m expecting about 10,000 users within the first six months, but I plan (and hope!) to scale beyond 100,000 users as the product grows. I’d love to get some insights from the community on the following questions:
- **Hosting Options**
  - Would n8n’s cloud/hosted version be sufficient for this level of usage, or would it be better to self-host n8n in my own infrastructure for more control over resources and scaling?
  - If self-hosting, are there recommended best practices or existing guides for clustering n8n to handle potentially large volumes of concurrent workflow executions?
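For context on the self-hosting question, here’s a rough sketch of the queue-mode setup I’m considering, assuming n8n’s queue mode with Redis as the broker and Postgres as the shared database (hostnames and passwords below are placeholders):

```shell
# Main instance: serves the UI/API and enqueues executions (sketch, not production-ready)
export DB_TYPE=postgresdb
export DB_POSTGRESDB_HOST=postgres        # placeholder hostname
export DB_POSTGRESDB_DATABASE=n8n
export DB_POSTGRESDB_USER=n8n
export DB_POSTGRESDB_PASSWORD=change-me   # placeholder
export EXECUTIONS_MODE=queue              # hand executions off to workers via Redis
export QUEUE_BULL_REDIS_HOST=redis        # placeholder hostname
export N8N_ENCRYPTION_KEY=change-me       # must match across main and all workers
n8n start

# Worker instances: scale these horizontally; each pulls jobs from the Redis queue
# (same DB_*, QUEUE_BULL_REDIS_* and N8N_ENCRYPTION_KEY values as the main instance)
n8n worker --concurrency=10
```

My understanding is that the main instance stays a single process while the workers scale out horizontally, so Postgres and Redis sizing become the real bottlenecks — happy to be corrected on any of this.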
- **Scalability**
  - Is n8n’s architecture built to handle the kind of load I might see from tens of thousands of active users (eventually 100k+)?
  - Are there any known limitations or bottlenecks to be aware of when running n8n at scale (e.g., concurrency limits, database constraints, etc.)?
- **Performance Tips**
  - For those who’ve scaled n8n to a large user base, do you have any tips or configurations that helped ensure smooth performance (e.g., caching strategies, job queuing, or horizontal scaling setups)?
- **Implementation Approach**
  - Has anyone used n8n in a multi-tenant SaaS context before? Any suggestions or potential pitfalls when creating a “generic workflow” template that each SaaS user can customize?
  - Would you recommend separate n8n instances per customer, or a single, more robust instance that handles all customers with strict resource isolation?
I’d appreciate any experiences, insights, or guidance you can share. I’m particularly focused on ensuring I choose the right hosting approach (cloud vs. self-hosted) and understanding how to best scale n8n to handle many users and potentially large workflow volumes.
Thanks in advance for your help, and I look forward to hearing your thoughts!