Hey everyone!
I’m working on an automated social media content generation workflow for several brands I manage as part of my business. Right now, I’m using n8n combined with Google Sheets as my database of example posts, categorized by brand.
How it works: When I want to generate a new post (via Telegram), the workflow searches the Google Sheet for matching examples (based on brand, etc.), builds an enriched prompt, and sends it to GPT-4, which generates two new posts in the same style.
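To give an idea of the prompt-building step, it's roughly something like this in a Code node (simplified sketch; the node name "Telegram Trigger" and field names like brand and post_text are placeholders, not my exact schema):

```ts
// Simplified version of the Code node that builds the enriched prompt.
// The brand requested via Telegram (node name is a placeholder)
const requestedBrand = $('Telegram Trigger').first().json.message.text.trim();

// Rows coming from the Google Sheets node, one item per example post
const examples = $input.all()
  .map(item => item.json)
  .filter(row => row.brand === requestedBrand)
  .slice(0, 5); // cap how many examples go into the prompt

const prompt = [
  `You write social media posts for the brand "${requestedBrand}".`,
  `Match the tone and structure of these examples:`,
  ...examples.map((row, i) => `Example ${i + 1}:\n${row.post_text}`),
  `Now write two new posts in the same style.`,
].join('\n\n');

return [{ json: { prompt } }];
```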
Question: I want to improve this system to make it smarter, more scalable, and capable of learning over time. I'm not sure whether I should stick with my current logic-based workflow in n8n or move to a RAG (Retrieval-Augmented Generation) setup using Supabase + pgvector, which might retrieve more relevant examples as the library of posts grows.
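To make the comparison concrete, this is roughly what the retrieval step would look like with Supabase + pgvector (just a sketch of what I have in mind, not something I've built: the table/function names like match_posts, the embedding model, and the env vars are all assumptions):

```ts
import { createClient } from '@supabase/supabase-js';
import OpenAI from 'openai';

// Rough sketch of the retrieval step in a RAG setup.
const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_KEY!);
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function findSimilarPosts(brand: string, request: string, limit = 5) {
  // Embed the request text with the same model used to index the example posts
  const { data } = await openai.embeddings.create({
    model: 'text-embedding-3-small',
    input: request,
  });
  const queryEmbedding = data[0].embedding;

  // `match_posts` would be a Postgres function that does
  // ORDER BY embedding <=> query_embedding LIMIT match_count, filtered by brand
  const { data: matches, error } = await supabase.rpc('match_posts', {
    query_embedding: queryEmbedding,
    filter_brand: brand,
    match_count: limit,
  });
  if (error) throw error;
  return matches; // rows with post_text + similarity, fed into the prompt as examples
}
```

The rest of the workflow (Telegram trigger, prompt assembly, GPT-4 call) would stay basically the same; only the "search the Google Sheet" step would be swapped for this similarity search, which is what makes me unsure whether the extra infrastructure is worth it.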
Has anyone here implemented something similar?
Thanks in advance