[Hiring] Expert to complete Automated YouTube News Workflow (n8n + OpenAI)

Detailed Project Overview

1. Completed Work:

  • Infrastructure: I have a self-hosted n8n environment running via Docker/Portainer (gbn-automation stack) on a VPS. My instance is live and stable.

  • Workflow Architecture: The core logic is already mapped out in n8n. I have placed the following nodes: Schedule Trigger, RSS Read, Limit, OpenAI (Message a Model), and Google Sheets (Append Row).

  • Integrations: Basic credentials for Google Sheets and OpenAI have been initiated.

2. Assistance Needed:

  • Data Mapping & Connections: I am experiencing some UI/UX difficulties in connecting the RSS Read output to the Limit node and ensuring the data schema flows correctly into the OpenAI node.

  • Prompt Engineering: I need help refining the OpenAI prompt to generate high-quality, YouTube-ready news summaries from the RSS descriptions.

  • Data Formatting: Ensuring the output from the AI is correctly parsed into specific columns in Google Sheets (e.g., Date, Title, Summary, Source URL).

  • Error Handling: Implementing “Wait” or “Error” nodes to ensure the workflow doesn’t break if an RSS feed is temporarily down or the AI API hits a limit.

  • Future Expansion: Guidance on eventually connecting this to a video creation tool or the YouTube API directly once the script-saving process is perfect.

Objective: I am looking for an expert to spend a few hours with me (or asynchronously) to “clean up” the connections, verify the data mapping, and ensure the workflow runs 100% reliably from start to finish.


Hi,

I’ve reviewed your current setup. The difficulties you’re facing with connecting RSS to OpenAI usually stem from how n8n handles item arrays (JSON mapping). I can fix your data schema, refine your prompts for YouTube-ready scripts, and make the workflow 100% reliable.

How I will help you:

  • Data Mapping: I’ll implement Item Lists / Split in Batches nodes to ensure every RSS entry is correctly parsed and sent to OpenAI without data loss.

  • Prompt Engineering: I will design a structured System Prompt for OpenAI to generate high-retention YouTube scripts (Hooks, Key Facts, CTAs) instead of dry summaries.

  • Reliability: I’ll add Error Trigger nodes and “Wait” logic to handle API rate limits and dead RSS feeds gracefully.

  • Future-Proofing: I can advise on connecting this to InVideo / HeyGen API or YouTube API for a fully autonomous video factory.
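For illustration, a structured system prompt along these lines could guide the model toward hooks, key facts, and CTAs (the wording and field names here are my own assumptions, not taken from the poster's workflow):

```javascript
// Hypothetical system prompt for the OpenAI node; all field names are
// illustrative, not taken from the actual workflow.
const systemPrompt = [
  "You are a scriptwriter for a YouTube news channel.",
  "Given an RSS item (title, description, link), return ONLY valid JSON:",
  '{ "hook": "...", "key_facts": ["..."], "cta": "...", "source_url": "..." }',
  "The hook is one attention-grabbing sentence; key_facts is 3-5 short points;",
  "the cta is a single call to action. No text outside the JSON object."
].join("\n");
```

A fixed schema like this is what makes the later Sheets mapping predictable.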

Terms:

  • Format: I can work asynchronously (you send the JSON, I return it fixed) or via a live session.

  • Estimated Time: 2–4 hours of focused work.

  • Rate: $40

Portfolio & Contact:


Well, that's great! Can we have a WhatsApp call (+8801618884435) so I can explain my project requirements?

I can help, drawing on my prior experience in AI automation. I have also built my own SaaS, cvscrenner.com, and kick-started an AI blog-post machine that writes one SEO-formatted blog post every day. I hope this knowledge and experience can be useful to you.

Hey,

You don’t have a “connection issue” — you have a pipeline design problem.

What you’ve built is a working demo, but it’s not structured for clean data flow or consistent AI output. That’s why you’re hitting friction between RSS → OpenAI → Sheets.

I’ve built similar pipelines, and the fix is straightforward once the architecture is corrected:

  • Proper item handling (Split in Batches + normalized JSON structure)

  • Controlled prompt design (so output is predictable and column-ready)

  • Clean parsing layer before Google Sheets (no messy AI dumps)

  • Fail-safe logic for RSS failures + API limits (so it doesn’t silently break)

More importantly — if your goal is YouTube, the current “summary → sheet” flow is incomplete.

You actually want:
RSS → Structured Script (Hook + Segments + CTA) → Storage → Video/Voice Layer

Otherwise you’ll just generate boring summaries that don’t convert.
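As a sketch of that structured-script idea, one RSS item would map to a script object rather than a flat summary (the function and field names below are hypothetical, not the poster's actual workflow):

```javascript
// Hypothetical: turn one RSS item into a structured script object
// (hook + segments + CTA) instead of a flat summary. Names are illustrative.
function buildScript(item) {
  return {
    hook: `Why "${item.title}" matters today`,   // opening line for retention
    segments: [item.description],                // body copy from the feed
    cta: "Subscribe for daily news breakdowns.", // closing call to action
    source_url: item.link,                       // kept for the storage layer
  };
}
```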

I can:

  • Fix your current workflow fast

  • Turn your prompts into YouTube-ready scripts (not generic summaries)

  • Structure the output so it’s ready for video automation later

Portfolio (relevant builds + demos):
https://muhammad-ai-automations.notion.site/Muhammad-Bin-Zohaib-AI-Automation-Projects-29da292a241380f889c2e337a134c010

If you send your workflow JSON, I’ll point out exactly where it’s breaking and what needs to change.

– Muhammad

Hello @ngncore

I made this video specifically for you: [Hiring] Expert to complete Automated YouTube News Workflow (n8n + OpenAI) - ngncore | Loom

I’ve been building n8n workflows for 6- and 7-figure businesses for 2 years, and the video shows a few of them.

Also, here is my portfolio: Fran's Portfolio - Google Slides

Shoot me a message and let’s get started. [email protected]

Fran

P.S.: The video is 2 minutes. Worth it.

the rss → openai data mapping issue is almost always the same thing: you’re not referencing the rss output fields explicitly inside the openai message. use {{ $json.title }} and {{ $json.description }} (or content depending on the feed) directly in your prompt template, not just a passthrough.
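as a rough simulation of how those expressions resolve (the regex below only mimics n8n's behavior for simple `$json.field` references; it is not n8n's actual expression engine):

```javascript
// illustrative only: mimic n8n's {{ $json.field }} interpolation for a prompt.
const rssItem = {
  title: "Example headline",
  description: "Short feed summary text",
  link: "https://example.com/article",
};

const template =
  "Summarize this news item for YouTube.\n" +
  "Title: {{ $json.title }}\n" +
  "Description: {{ $json.description }}";

// replace each simple expression with the matching field from the incoming item
const resolved = template.replace(
  /\{\{\s*\$json\.(\w+)\s*\}\}/g,
  (_, key) => rssItem[key] ?? ""
);
```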

for clean sheets output, tell the model to return json with a fixed schema — something like { "hook": "...", "summary": "...", "source_url": "..." }. set response_format to json if you're on gpt-4o, then map the json keys straight to columns in the append row node. way more reliable than parsing free text after.
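a minimal sketch of that mapping step (the column names follow the thread; the helper itself is hypothetical):

```javascript
// sketch: parse the model's fixed-schema json reply and map it to the
// google sheets columns mentioned in the job post. helper name is illustrative.
function toSheetRow(modelReply, date) {
  const parsed =
    typeof modelReply === "string" ? JSON.parse(modelReply) : modelReply;
  return {
    Date: date,
    Title: parsed.hook ?? "",
    Summary: parsed.summary ?? "",
    "Source URL": parsed.source_url ?? "",
  };
}
```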

error handling: set Continue on Error on the openai node so a rate limit hit doesn’t kill the whole run. the Error Trigger is good for notifications but won’t save a mid-run failure on its own.
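the routing decision behind that wait-vs-fail logic can be sketched like this (which status codes count as retryable is my assumption, following common http conventions):

```javascript
// sketch: classify a failed api/feed call as retryable (route to a Wait node
// and try again) or fatal (surface via the Error Trigger). assumed thresholds.
function shouldRetry(statusCode) {
  // 429 = rate limited; 5xx = transient server or feed failure
  return statusCode === 429 || (statusCode >= 500 && statusCode < 600);
}
```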

if the architecture gets more complex when you add the video layer, feel free to reach out.

Hey — this is exactly the kind of project we do at Neonotics.

We build and debug n8n + OpenAI workflows for clients. Self-hosted n8n on Docker/VPS is our default environment, so no ramp-up needed.

A few questions to scope this quickly:

- What’s the current workflow doing vs. what it should be doing?

- Where is it breaking — is it a node config issue, API auth, or logic flow?

- What’s your target output format (YouTube video descriptions, scripts, news summaries)?

We can typically complete and test a workflow like this in 2–4 hours once we have access. Happy to do a quick async review of your current workflow first so you know exactly what’s needed before committing.

— Neonotics

Hi! This is a strong match for my background. I work with n8n, OpenAI, RSS workflows, and Google Sheets automations, so I can help clean up your existing setup and make it run reliably.

From your overview, the main need is fixing the node connections and data mapping, improving the prompt for better YouTube-ready summaries, formatting the output into clean Google Sheets columns, and adding error handling so the workflow does not break on feed or API issues.

Book a quick call: Calendly - Automaxion

Hey! This is exactly what I’ve been building.

I recently completed an automated news channel workflow in n8n — it handles content discovery, summarisation via OpenAI, and publishing. Happy to share a screen recording of how it works.

I’m a sysadmin with hands-on n8n experience (self-hosted), comfortable with webhooks, REST APIs, and multi-step agent flows. I also have a badge for completing the official n8n course.

What’s the current bottleneck in your workflow — the content sourcing, the AI summarisation, or the publishing step? I’d like to understand the specific gap before proposing anything.

Hey Mihail, this is right in our wheelhouse — we build and complete n8n + OpenAI workflows regularly, including content automation pipelines. We’d be happy to take a look at what you have so far and get it across the finish line. Feel free to DM us with the details or share what’s currently blocking you.

Hi @ngncore,

It sounds like you have a great foundation built already! Getting the self-hosted Docker instance live and the core nodes mapped out is usually the hardest part. I can definitely jump in and help you clean up the data flow to get this running 100% reliably.

Here is how we can tackle your specific roadblocks:

  • Data Mapping & Connections: I will help you properly configure the item loops (likely using the Item Lists node or precise expression syntax) so the RSS Read output passes cleanly through the Limit node and iterates correctly into OpenAI.

  • Prompt Engineering & Data Formatting: To ensure your Google Sheets append perfectly every time, we will refine the OpenAI prompt to use Structured Outputs (JSON mode). This forces the AI to always return exactly what you need (Date, Title, Summary, Source URL) in a predictable format, eliminating parsing errors.

  • Error Handling: I can set up robust error management using the Error Trigger node and Wait nodes to catch API rate limits or RSS downtime, ensuring the workflow pauses and retries rather than crashing.

  • Future Expansion: When you are ready, adding a node to push these summaries directly to a video creation API or the YouTube API is very straightforward once this foundational data structure is solid.
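For reference, a JSON-mode request along these lines is what the Structured Outputs step above amounts to (the `response_format` field follows the OpenAI Chat Completions API; the model name and message wording are assumptions):

```javascript
// Hypothetical request body for JSON-mode structured output. The
// response_format field follows the Chat Completions API; everything
// else (model choice, message wording) is illustrative.
const requestBody = {
  model: "gpt-4o",
  response_format: { type: "json_object" },
  messages: [
    {
      role: "system",
      content:
        'Return ONLY a JSON object with keys "date", "title", "summary", "source_url".',
    },
    { role: "user", content: "Title: ...\nDescription: ..." },
  ],
};
```

With this in place, the Sheets node can map each key directly to a column instead of parsing free text.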

As an AI Automation Specialist and R&D Lead, I architect complex, multi-step n8n automation workflows integrating AI models and external APIs daily. I am also fully comfortable navigating Docker-based environments.

I am available to hop on a quick screen share to do this live, or we can handle it asynchronously, whichever you prefer.

You can check out some of my automation work here:

Let me know if you would like to connect!

Hi @ngncore,

YouTube automation with n8n + OpenAI is interesting. I have built similar AI content pipelines on n8n.

What I can bring:

  • Experience building n8n workflows that integrate OpenAI for content processing
  • Webhook orchestration, HTTP nodes, and API integrations
  • Production-ready workflow design with error handling and monitoring
  • Fast turnaround: first working update within 12-24h

Happy to review what you have built so far and propose a path to completion. DM or email: [email protected]

Fabrizio — Altiora