Download limits & subfolders

I have created a simple workflow: at a certain time of day, it connects to Google Drive, downloads all the files from a folder (in a loop), and uploads them to an FTP location. This works fine.

However, I have hit a roadblock: if the folder has more than 1000 files, the workflow stops. How do I get past this limit? These are small files, averaging 20 KB to 150 KB each.
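For context, the Drive API behind the node returns at most 1000 files per page and expects the caller to follow nextPageToken until it is empty; the n8n Google Drive node wraps this same call. Here is a minimal sketch of the paging loop, assuming a raw OAuth access token rather than n8n's credential handling:

```typescript
// Minimal sketch: page through a Drive folder with files.list and pageToken.
// ACCESS_TOKEN handling is assumed to happen elsewhere; the 1000 cap is per
// page, not per folder, so the loop keeps going until no token comes back.
const DRIVE_API = "https://www.googleapis.com/drive/v3/files";

interface DriveFile {
  id: string;
  name: string;
  mimeType: string;
}

async function listAllFiles(folderId: string, accessToken: string): Promise<DriveFile[]> {
  const files: DriveFile[] = [];
  let pageToken: string | undefined;

  do {
    const params = new URLSearchParams({
      q: `'${folderId}' in parents and trashed = false`,
      pageSize: "1000", // the maximum the API allows per page
      fields: "nextPageToken, files(id, name, mimeType)",
    });
    if (pageToken) params.set("pageToken", pageToken);

    const res = await fetch(`${DRIVE_API}?${params}`, {
      headers: { Authorization: `Bearer ${accessToken}` },
    });
    if (!res.ok) throw new Error(`Drive API error: ${res.status}`);

    const body = (await res.json()) as { nextPageToken?: string; files: DriveFile[] };
    files.push(...body.files);
    pageToken = body.nextPageToken; // undefined once the last page is reached
  } while (pageToken);

  return files;
}
```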

Also, if the folder contains subfolders, the workflow throws an error and stops.

How do I create a workflow that, after connecting to Google Drive, downloads all the content of a specific folder (including subfolders and their files) and uploads everything in a single step to an FTP location? This is similar to what I am already doing with the file list, but this time it should include all files and folders/subfolders.
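Stripped of the n8n nodes, the pipeline is just list, download, upload. Below is a sketch assuming the basic-ftp npm package, placeholder FTP credentials, and the listAllFiles() helper from the previous snippet; inside n8n the equivalent shape is Google Drive (Download) → Loop Over Items → FTP (Upload).

```typescript
// Sketch of the pipeline outside n8n, assuming the "basic-ftp" npm package
// and the listAllFiles() helper from the previous snippet. Host, user, and
// password are placeholders.
import { Client } from "basic-ftp";
import { Readable } from "stream";

async function mirrorFolderToFtp(folderId: string, accessToken: string): Promise<void> {
  const ftp = new Client();
  await ftp.access({
    host: "ftp.example.com", // placeholder FTP credentials
    user: "user",
    password: "secret",
  });

  try {
    for (const file of await listAllFiles(folderId, accessToken)) {
      // Download one file at a time so only one 20-150 KB body is in memory.
      const res = await fetch(
        `https://www.googleapis.com/drive/v3/files/${file.id}?alt=media`,
        { headers: { Authorization: `Bearer ${accessToken}` } },
      );
      if (!res.ok) throw new Error(`Download failed for ${file.name}: ${res.status}`);

      const body = Buffer.from(await res.arrayBuffer());
      await ftp.uploadFrom(Readable.from([body]), file.name);
    }
  } finally {
    ftp.close();
  }
}
```

Handling one file per iteration keeps memory usage flat, which matters once the folder grows past a few hundred files.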

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @life.tripod,

Interesting use case! If this fails around a certain file threshold, it could be related to memory and compute resources in general. Are you self-hosting n8n or using our cloud? If the former, you have more control here.

Regardless, this document walks through some suggestions you can use: Memory-related errors | n8n Docs

As for the second question, check out this reply: Iterating through google drive subfolders - #2 by ihortom
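The usual approach there is a recursive walk: list a folder's children, recurse into anything whose MIME type marks it as a folder, and carry the relative path along for the FTP side. A sketch reusing the listAllFiles() helper from above:

```typescript
// Sketch of a recursive folder walk, reusing listAllFiles() from above.
// Google Drive marks folders with a dedicated MIME type, so the recursion
// just follows anything that carries it.
const FOLDER_MIME = "application/vnd.google-apps.folder";

interface DriveEntry {
  id: string;
  path: string; // relative path for the FTP side, e.g. "reports/2024/jan.csv"
}

async function walkFolder(
  folderId: string,
  accessToken: string,
  prefix = "",
): Promise<DriveEntry[]> {
  const entries: DriveEntry[] = [];

  for (const file of await listAllFiles(folderId, accessToken)) {
    if (file.mimeType === FOLDER_MIME) {
      // Descend into the subfolder, extending the relative path as we go.
      entries.push(...(await walkFolder(file.id, accessToken, `${prefix}${file.name}/`)));
    } else {
      entries.push({ id: file.id, path: `${prefix}${file.name}` });
    }
  }

  return entries;
}
```

On the FTP side, basic-ftp's ensureDir() can recreate the matching directory tree before each uploadFrom().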

Hope this helps!

I'm curious as well, not about the 1000-file limit but about uploading all documents contained in a Google Drive folder, including files in subfolders, to add them all to a vector store.

Something like this: [workflow screenshot not preserved]

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.