Sharing three workflows I built for a live AI workshop. They fire
automatically during a presentation at configured slide numbers,
but they work perfectly as standalone pipelines too.
Email Pipeline
Classifies incoming emails via Claude, drafts a reply, routes
escalations to the right person, logs everything to Sheets.
Key nodes: Gmail Trigger → Claude (Haiku) → Code (parse JSON) →
IF (escalation check) → Gmail → Google Sheets
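The Code node's main job is defensive JSON parsing, since Claude
occasionally wraps its answer in markdown fences. A minimal Python
sketch of that parse plus the escalation check (the field names
`category`, `escalate`, and `reply` are my assumptions, not
necessarily the workflow's actual schema):

```python
import json
import re

def parse_classification(raw: str) -> dict:
    """Parse Claude's JSON reply, tolerating markdown code fences."""
    # Strip leading/trailing ```json fences if the model added them
    cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", raw.strip())
    data = json.loads(cleaned)
    # Assumed schema: category label, escalation flag, drafted reply
    return {
        "category": data.get("category", "general"),
        "escalate": bool(data.get("escalate", False)),
        "reply": data.get("reply", ""),
    }

result = parse_classification(
    '```json\n{"category": "billing", "escalate": true, "reply": "Hi..."}\n```'
)
```

The `escalate` flag is what the downstream IF node would branch on.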
Meeting Pipeline
Takes any meeting transcript, extracts action items + decisions +
the biggest risk, builds a follow-up email, and sends it to all
attendees pulled from a Sheets roster.
Key nodes: Form Trigger + Webhook → Claude → Parse →
Get Attendees (Sheets) → Gmail + Slack + Sheets log
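To make the extract-and-send step concrete, here's a rough Python
sketch of how the follow-up email could be assembled from Claude's
extraction plus the roster lookup (field names like `action_items`
and `biggest_risk` are stand-ins for whatever the actual prompt
returns):

```python
def build_followup(summary: dict, attendees: list[str]) -> dict:
    """Compose the follow-up email from the extracted meeting summary."""
    lines = ["Action items:"]
    lines += [f"- {item}" for item in summary["action_items"]]
    lines.append("")
    lines.append("Decisions:")
    lines += [f"- {d}" for d in summary["decisions"]]
    lines.append("")
    lines.append(f"Biggest risk: {summary['biggest_risk']}")
    return {
        "to": ", ".join(attendees),       # attendees come from the Sheets roster
        "subject": "Meeting follow-up",
        "body": "\n".join(lines),
    }
```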
Evidence Intelligence Engine
This one’s more involved. Research question goes in, a structured
evidence brief comes out.
Claude first decomposes the question into a search plan.
Perplexity runs web + academic search in parallel. Claude evaluates
the evidence quality; if it's insufficient, the queries get refined
and run again (max 2 iterations). Final synthesis written to a Google Doc
and posted to Slack.
Key nodes: Form Trigger → Claude (Opus) → Perplexity ×2 →
Merge → Claude (evaluator) → IF (quality gate) → Claude (writer) →
Google Docs → Slack + Sheets
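Since the quality-gate loop is the part people usually ask about,
here's its control flow written out in plain Python. In n8n this is
an IF node routing back to the search step; the function names and
the `sufficient`/`refined_queries` fields below are my stand-ins for
the actual node wiring:

```python
MAX_ITERATIONS = 2  # matches the workflow's cap

def run_evidence_loop(question, plan_fn, search_fn, evaluate_fn, write_fn):
    """Decompose -> search -> evaluate; refine queries and retry at most once."""
    queries = plan_fn(question)                    # Claude (Opus): search plan
    evidence = []
    for _attempt in range(MAX_ITERATIONS):
        evidence = search_fn(queries)              # Perplexity web + academic, merged
        verdict = evaluate_fn(question, evidence)  # Claude evaluator
        if verdict["sufficient"]:
            break                                  # quality gate passed
        queries = verdict["refined_queries"]       # gate failed: loop back with better queries
    return write_fn(question, evidence)            # Claude writer -> final brief
```

The cap matters: without it, a stubborn evaluator could loop the
workflow (and the API bill) indefinitely.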
All three JSON files are in the repo under /n8n.
Import → reconnect credentials (Anthropic, Gmail, Sheets,
Slack, Perplexity) → toggle Active → done.
Full repo (includes the Python orchestrator that triggers these
during a live presentation):
Let me know if you have questions on the Evidence Engine loop —
the quality gate logic took a few iterations to get right.