Hi everyone,
I am running Docker to use the AI Starter Kit, and I'm setting up an AI chatbot. I upload PDFs from Google Drive to Qdrant, and the chatbot then retrieves the information from Qdrant and answers.
I want to deploy it on DigitalOcean. I'm still weighing the options, but DigitalOcean seems straightforward to use. What do I need to consider for deployment?
The n8n docs say to secure and harden the instance before using it in production. What should I consider to secure and harden it?
Thank you!
Information on n8n setup
- n8n version: 1.88.0
- Database: SQLite
- n8n EXECUTIONS_PROCESS setting: own
- Running n8n via: Docker
- Operating system: Windows 11 Pro
Hey YuQinghao,
After spending some time going through the docs myself and researching what other people seem to be doing, here's what I've compiled as a v0.1 of how I would "secure and harden" n8n in a production environment like DigitalOcean:
To deploy the AI Starter Kit on DigitalOcean:
- Use a Droplet with at least 4 vCPU / 8GB RAM for AI workloads.
- Install Docker & Docker Compose, clone the repo, and run `docker-compose up -d` (see the shell sketch after this list).
- Use a DigitalOcean Volume for persistent storage (PDFs, embeddings).
- Set up a domain + HTTPS using Caddy (included) or Nginx + Let’s Encrypt.
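If it helps, here is a rough shell sketch of those first steps on a fresh Ubuntu Droplet. The repo URL and the `.env` handling are assumptions based on the standard self-hosted AI Starter Kit; check its README for the exact variables and any compose profiles it expects:

```bash
# Install Docker (Docker's convenience script also installs the compose plugin)
curl -fsSL https://get.docker.com | sudo sh

# Clone the starter kit (assumed repo URL; swap in your fork if you use one)
git clone https://github.com/n8n-io/self-hosted-ai-starter-kit.git
cd self-hosted-ai-starter-kit

# Create/edit .env with your credentials before starting
# (the repo's README lists the variables it expects)
nano .env

# Start everything in the background
sudo docker compose up -d
```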
To secure and harden n8n:
- Force HTTPS — Caddy (included in the starter kit) handles this automatically once your domain points at the Droplet.
- Enable auth — n8n 1.x ships with built-in user management, so set up the owner account with strong credentials on first login (the old N8N_BASIC_AUTH_ACTIVE variable was removed in v1.0). Related .env settings are sketched after this list.
- Limit access — Use DigitalOcean Cloud Firewalls to restrict ports and source IPs (a host-level ufw sketch also follows the list).
- Keep it updated — Regularly update Docker images and apply security patches.
- Back up data — Use volumes and snapshots.
- Review workflows — Make sure no sensitive data is exposed.
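For the HTTPS and auth points above, a minimal sketch of the n8n-related entries in `.env` might look like the following. The variable names (N8N_HOST, N8N_PROTOCOL, WEBHOOK_URL, N8N_ENCRYPTION_KEY, N8N_SECURE_COOKIE) are standard n8n settings; the domain and key values are placeholders:

```env
# Public URL n8n advertises for webhooks and OAuth callbacks (placeholder domain)
N8N_HOST=n8n.example.com
N8N_PROTOCOL=https
WEBHOOK_URL=https://n8n.example.com/

# Pin the key n8n uses to encrypt stored credentials, so it survives container rebuilds
N8N_ENCRYPTION_KEY=<long-random-string>

# Keep session cookies HTTPS-only (this is the default, shown here for clarity)
N8N_SECURE_COOKIE=true
```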
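On the "limit access" point, Cloud Firewalls are configured from the DigitalOcean control panel (or doctl), but a host-level firewall on the Droplet is cheap extra insurance. A sketch with ufw, assuming only SSH and Caddy's HTTP/HTTPS ports should be reachable from outside:

```bash
sudo ufw default deny incoming   # block everything inbound by default
sudo ufw allow OpenSSH           # keep SSH access (port 22)
sudo ufw allow 80/tcp            # HTTP, needed for Let's Encrypt challenges/redirects
sudo ufw allow 443/tcp           # HTTPS, where Caddy serves n8n
sudo ufw enable
```

One caveat: Docker publishes ports by writing its own iptables rules, which can bypass ufw, so verify from outside that only 22/80/443 actually respond and keep the Cloud Firewall in place as the primary layer.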
The biggest thing to recognize is that LLM-powered endpoints can be exploited by bad actors if you're not careful.
@tsellhorn As… the industry seems to be phasing out Docker (in favor of a standards-based approach like Podman), have you seen anyone using Podman for this setup in your research?
That would also allow using k8s (if it really needs to be scaled up).