Newbie here.
I've set up a workflow to pull documents from a Google Drive folder and submit them to Airparser. A proof of concept worked on the trial version, but now that I'm moving to a test production stage on the licensed version, it won't automatically pick up files from Google Drive to process. Scenario:
Test Objective: Process 54 files (mixture of PDF and DOC)
I had the Poll time set to once per day before I uploaded the files to Google Drive.
I uploaded 54 files into the target Google Drive folder. The files had various time/date stamps; 4 happened to be dated the day of the test.
I manually ran Execute Workflow and it processed only 1 file, which happened to be the file with the newest time/date stamp (dated the same day as my test).
I set the Poll time to Every Minute and waited several minutes. It didn't process any more files.
I manually ran Execute Workflow again, and it processed the same single file again.
To debug, I deleted 7 files (all the files starting with 'b') from the Google folder and waited 5 minutes. In that time it didn't process any of the other 46 files. I re-uploaded the 7 files to Google Drive (none of which had a time/date stamp from the day of testing) and waited 50 minutes for the node to trigger automatically, which it never did.
Upon reading community support documents, I found one that appeared related (Google Drive Trigger does not work?) where someone suggested: "If you want the trigger to activate when a file is uploaded, shouldn't you use 'Folder Updated' instead of 'File Updated'?" That resolved that person's problem. I tried it, waited a couple of hours, and so far the node hasn't processed any more documents.
Thanks for your help.
Welcome to n8n, @rstomp! I can see the issue with your Google Drive Trigger setup. The Google Drive Trigger only detects NEW changes after it's activated, not files that already existed in the folder.
The 54 files you uploaded BEFORE activating the trigger were already present when the trigger first polled, so they were not treated as new changes.
The one file that did get processed is probably the only file recent enough to match the trigger's detection logic.
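To see why only one of the 54 files surfaced, here's a rough sketch of how timestamp-based polling behaves. This is a simplified model of the idea, not n8n's actual source code, and all file names and dates are made up:

```python
from datetime import datetime

def poll(files, last_checked):
    """Return names of files modified after the stored checkpoint.

    files: list of (name, modified_time) tuples.
    This mimics the assumed trigger behaviour: anything with a
    modification time at or before the checkpoint is ignored.
    """
    return [name for name, modified in files if modified > last_checked]

# Hypothetical activation time of the workflow
activated_at = datetime(2024, 5, 1, 9, 0)

files = [
    ("old_report.pdf", datetime(2024, 3, 12)),          # pre-existing timestamp
    ("older_memo.doc", datetime(2024, 2, 2)),           # pre-existing timestamp
    ("new_upload.pdf", datetime(2024, 5, 1, 10, 30)),   # dated the day of the test
]

# Only the file newer than the checkpoint is surfaced
print(poll(files, activated_at))
```

With 54 files and only one timestamped after activation, only that one file passes the filter, which matches what you observed.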
The Solution: Process Existing Files
You will need to use a different method to process the files that are already in the folder:
Use Schedule Trigger + Google Drive List (Recommended for Batch Processing)
Replace your Google Drive Trigger with this setup:
Schedule Trigger (runs every X minutes/hours)
↓
Google Drive: List Files in Folder
↓
Filter: Only unprocessed files
↓
Loop through each file
↓
Google Drive: Download File
↓
Airparser: Submit Document
↓
Mark file as processed (move to "Processed" folder or add to tracking)
Node 1: Schedule Trigger
Set to run every hour (or whatever frequency you need)
This will check the folder regularly
Node 2: Google Drive - List
Operation: List
Folder: Select your target folder
Options:
Limit: 100 (or however many files you want to process per run)
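The "Filter: Only unprocessed files" step above is the part people usually get wrong, so here's a minimal sketch of the idea in Python. It assumes you track processed file IDs somewhere (in n8n this could be a "Processed" folder or a workflow data store); the field names and IDs below are hypothetical:

```python
def select_unprocessed(listed_files, processed_ids, limit=100):
    """Keep only files whose ID we haven't seen, capped at `limit` per run."""
    fresh = [f for f in listed_files if f["id"] not in processed_ids]
    return fresh[:limit]

# Hypothetical output of the "List Files in Folder" step
listed = [
    {"id": "a1", "name": "invoice1.pdf"},
    {"id": "b2", "name": "contract.doc"},
    {"id": "c3", "name": "invoice2.pdf"},
]
already_done = {"b2"}  # IDs recorded from previous runs

for f in select_unprocessed(listed, already_done):
    print(f["name"])  # these would flow on to Download File -> Airparser
```

The key point is that the filter keys on file ID rather than timestamp, so it works the same for old and new files.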
Alternatively, if you aren't sure which files are new, create a dedicated folder on Drive and attach a trigger to it: whenever something changes there, the workflow fires, downloads the changed file, and passes it on to the rest of your flow.
Hey, welcome! The Google Drive Trigger works off timestamps, so it only picks up files that were created/modified after the last time it polled; it doesn't retroactively scan everything already sitting in the folder. That's why you only got the one newest file each time. Switching to "Folder Updated" won't fix this either, since it uses the same timestamp logic under the hood. For processing a batch of existing files like your 54, you're better off using a Schedule Trigger connected to a Google Drive node set to "Search Files" targeting that folder; that way it actually lists everything in there and you can filter and process however you want. Once you've cleared the backlog, you can switch back to the trigger for ongoing new uploads. Just make sure the workflow is active before files land in the folder.
This is a really common trap with the Google Drive trigger — ran into this exact thing when I built a document processing workflow for my own business.
The core issue: n8n’s Google Drive trigger uses a polling mechanism that tracks files by modification timestamp. When you upload files with old timestamps, n8n sees them as “already processed” because their modified time is before the polling checkpoint it stored when you first activated the workflow.
Why only 1 file was picked up: That was the only file with a timestamp newer than when you activated the workflow. The trigger stores a “last checked” time and only surfaces files modified after that point.
How to fix this for your existing 54 files:
Reset the polling checkpoint: Deactivate the workflow → delete its execution history → reactivate. This wipes n8n’s “last seen” timestamp so it starts fresh and should pick up everything in the folder.
Better for bulk imports: Don't rely on the trigger for bulk uploads at all. Use an HTTP Request node calling the Google Drive API directly, with a query like `'FOLDER_ID' in parents and mimeType != 'application/vnd.google-apps.folder'`, to pull all files and pipe them through. Full control, no timestamp games.
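For reference, here's roughly what that HTTP request looks like when built by hand. This is a sketch of constructing the Drive API v3 `files.list` URL only; `FOLDER_ID` is a placeholder for your real folder ID, authentication is omitted, and no network call is made:

```python
from urllib.parse import urlencode

FOLDER_ID = "FOLDER_ID"  # placeholder: replace with your actual Drive folder ID

# Drive query language: files inside the folder, excluding subfolders
query = (
    f"'{FOLDER_ID}' in parents "
    "and mimeType != 'application/vnd.google-apps.folder'"
)

params = {
    "q": query,
    "fields": "files(id,name,mimeType)",  # keep the response small
    "pageSize": 100,
}

url = "https://www.googleapis.com/drive/v3/files?" + urlencode(params)
print(url)  # paste-equivalent of what the HTTP Request node would send
```

In n8n you'd put the base URL in the HTTP Request node and supply `q`, `fields`, and `pageSize` as query parameters, with your Google credential attached for auth.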
For future uploads: Once you sort out the backfill, the trigger will work correctly for new files going forward.
My setup: I use the Drive trigger for ongoing new files, but I have a separate “backfill” workflow with an HTTP node + Drive API that I can fire manually when I need to process historical files. Way cleaner than fighting the trigger’s timestamp logic.
The Airparser integration itself should be fine — this is 100% a Drive trigger timing issue, not an Airparser problem.
Thanks for your suggestions. Appreciated. I understand the timing logic better now.
If I deactivate the "Watch Google Drive Folder" node, delete its execution history, then reactivate it, won't I still face the same issue? All 54 of my files are timestamped OLDER than the time at which I will reactivate the node.
I have a few thousand more old files to process, but will soon start introducing new files as part of my updated business process. As such, I like your option 2, the "backfill" workflow, and will investigate it.
Hi @OMGItsDerek, #2 worked, although I had to add a Split Out node (I had DOCs and PDFs). My initial batch of 54 files all processed right through Airparser. Thanks again.