Is it possible? Google Chat Bot → Google Drive folder (with nested files) → Supabase via n8n

Hi everyone,

I have a Google Chat bot built in Google Apps Script and connected to n8n via a webhook.

  • For each incoming message in a Chat thread, the bot sends the text + space.name (used as sessionId) to an n8n webhook.

  • n8n processes it (using an OpenAI agent) and sends a reply back to the same thread.

  • This works fine for regular text messages (rough sketch of the Apps Script side right after this list).
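
For reference, the Apps Script side is roughly this (simplified: the webhook URL is a placeholder, and the { reply } response shape is just what my Respond to Webhook node returns, so treat both as assumptions):

```typescript
// Google Apps Script Chat app (V8 runtime; TypeScript-flavored, plain JS works the same).
// WEBHOOK_URL is a placeholder for the real n8n webhook endpoint.
const WEBHOOK_URL = 'https://your-n8n-host/webhook/chat-bot';

function onMessage(event: any) {
  // Forward the message text plus the space name (used as sessionId) to n8n.
  const payload = {
    text: event.message.text,
    sessionId: event.space.name,
  };

  const response = UrlFetchApp.fetch(WEBHOOK_URL, {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify(payload),
    muteHttpExceptions: true,
  });

  // Assumes n8n responds with { "reply": "..." }.
  const body = JSON.parse(response.getContentText());

  // Returning { text } posts the reply back into the same thread.
  return { text: body.reply };
}
```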

What works now:

I can already save individual Google Drive files to Supabase (either manually or via a Google Drive trigger outside of the bot context).

The problem / main question:

I want the bot to be able to handle Google Drive folder links, not just files.

  • A user could share a Drive folder in Google Chat (sometimes with nested subfolders and a variety of file types: Docs, Sheets, PDFs, images, etc.).

  • This folder might represent a whole project’s data.

  • The bot should process the link, recursively list all files inside (including subfolders), and store them in Supabase (Storage for binaries + metadata in Postgres, or another recommended approach); see the recursion sketch after this list.

  • Later, I want to be able to query the bot about that project, based on the stored files’ content.
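
This is roughly the recursion I have in mind, whether it ends up in an n8n Code node or a loop of HTTP Request nodes (plain Drive API v3 calls; the access-token handling is simplified and only a sketch):

```typescript
// Sketch: recursively list every file under a Drive folder via the Drive v3 API.
// Token handling is simplified; in n8n this would come from the Google Drive
// credential or an OAuth2 / service-account credential on an HTTP Request node.
const ACCESS_TOKEN = process.env.GOOGLE_ACCESS_TOKEN!;
const FOLDER_MIME = 'application/vnd.google-apps.folder';

interface DriveFile {
  id: string;
  name: string;
  mimeType: string;
  size?: string;
  path: string; // relative path built up while walking, useful as metadata later
}

async function listFolder(folderId: string, path = ''): Promise<DriveFile[]> {
  const results: DriveFile[] = [];
  let pageToken: string | undefined;

  do {
    const params = new URLSearchParams({
      q: `'${folderId}' in parents and trashed = false`,
      fields: 'nextPageToken, files(id, name, mimeType, size)',
      pageSize: '1000',
      supportsAllDrives: 'true',
      includeItemsFromAllDrives: 'true',
    });
    if (pageToken) params.set('pageToken', pageToken);

    const res = await fetch(`https://www.googleapis.com/drive/v3/files?${params}`, {
      headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
    });
    const data = await res.json();

    for (const f of data.files ?? []) {
      if (f.mimeType === FOLDER_MIME) {
        // Subfolder: recurse and keep extending the relative path.
        results.push(...(await listFolder(f.id, `${path}${f.name}/`)));
      } else {
        results.push({ ...f, path: `${path}${f.name}` });
      }
    }
    pageToken = data.nextPageToken;
  } while (pageToken);

  return results;
}
```

In n8n terms I picture this either as a single Code node doing the whole walk, or as an HTTP Request node in a loop that feeds subfolder IDs back into a queue; I'm not sure which is more idiomatic, which is part of question 2 below.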

What I’m unsure about:

  1. Accessing shared folders: Should I use OAuth 2.0 or a service account for this? I’d prefer a service account if it can access files/folders shared with it via a link.

  2. n8n implementation: Is it even possible in n8n to efficiently handle folder recursion with potentially large and deep hierarchies of files? How would you structure such a workflow?

  3. Best nodes/approach: Should I use the built-in Google Drive node for listing/downloading, or go with HTTP Request + Drive API for more control?

  4. Storing in Supabase: For mixed file types and potentially large files, is it better to use Supabase Storage for the binaries and keep metadata in Postgres, or store everything in Postgres directly? (There’s a sketch of what I’m leaning toward after this list.)

  5. Processing different file types: How would you handle exporting Google Docs/Sheets to usable formats (text, CSV, PDF) before uploading to Supabase? (Rough export + retry sketch after this list.)

  6. Performance: Any advice for handling large folders (rate limits, batching, retries)?
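
For question 4, this is the direction I’m leaning; just a sketch with supabase-js, where the bucket (project-files), table (project_files) and column names are all made up:

```typescript
import { createClient } from '@supabase/supabase-js';

// Placeholders: project URL, service-role key, bucket and table names are assumptions.
const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!,
);

async function storeFile(
  projectId: string,
  file: { id: string; name: string; mimeType: string; path: string },
  content: Buffer,
) {
  // 1. Binary goes to Supabase Storage (Drive file names may need sanitizing
  //    before being used as Storage object keys).
  const storagePath = `${projectId}/${file.path}`;
  const { error: uploadError } = await supabase.storage
    .from('project-files')
    .upload(storagePath, content, { contentType: file.mimeType, upsert: true });
  if (uploadError) throw uploadError;

  // 2. Metadata (and later extracted text / embeddings) goes to Postgres.
  const { error: insertError } = await supabase.from('project_files').insert({
    project_id: projectId,
    drive_file_id: file.id,
    name: file.name,
    mime_type: file.mimeType,
    storage_path: storagePath,
  });
  if (insertError) throw insertError;
}
```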
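
And for questions 5 and 6, roughly how I picture the export + retry step (the export MIME mappings are from the Drive docs; the backoff numbers are arbitrary):

```typescript
// Sketch: use "export" for Google-native files and a plain download for
// everything else, with simple exponential backoff on rate-limit responses.
const EXPORT_AS: Record<string, string> = {
  'application/vnd.google-apps.document': 'text/plain',
  'application/vnd.google-apps.spreadsheet': 'text/csv', // exports the first sheet only
  'application/vnd.google-apps.presentation': 'application/pdf',
};

async function fetchWithBackoff(url: string, token: string, retries = 5) {
  for (let attempt = 0; attempt < retries; attempt++) {
    const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
    // 429 is rate limiting; 403 can also be a rate limit (or a permission error,
    // which real code should distinguish by inspecting the error body).
    if (res.status !== 429 && res.status !== 403) return res;
    await new Promise((resolve) => setTimeout(resolve, 1000 * 2 ** attempt)); // 1s, 2s, 4s, ...
  }
  throw new Error(`Gave up after ${retries} attempts: ${url}`);
}

async function downloadFile(fileId: string, mimeType: string, token: string): Promise<Buffer> {
  const base = `https://www.googleapis.com/drive/v3/files/${fileId}`;
  const exportMime = EXPORT_AS[mimeType];
  const url = exportMime
    ? `${base}/export?mimeType=${encodeURIComponent(exportMime)}` // Google-native (exports are capped at ~10 MB)
    : `${base}?alt=media&supportsAllDrives=true`;                 // regular binary download
  const res = await fetchWithBackoff(url, token);
  if (!res.ok) throw new Error(`Drive returned ${res.status} for ${fileId}`);
  return Buffer.from(await res.arrayBuffer());
}
```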

Flexibility requirements:

  • Must support any file size and type (within reason).

  • Should work for both files and folders.

  • Should be able to handle nested folders.

Has anyone built something like this in n8n, or can anyone share a best-practice workflow/architecture for it?
I’m mostly looking to understand whether this is possible in n8n and how to structure the solution.

Thanks in advance!