I have created a simple workflow: at a set time each day, it connects to Google Drive, downloads all the files from a folder (in a loop), and uploads them to an FTP location. This works fine.
However, I have hit a roadblock: if the folder has more than 1000 files, the workflow stops. How do I get past this limit? These are small files, averaging 20-150 KB each.
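For reference, stopping at exactly 1000 items looks like the Google Drive API's per-page cap rather than a size problem: `files.list` returns at most 1000 results per call, and further pages have to be requested with `nextPageToken`. Here is a minimal sketch of that pagination against the raw API using the `googleapis` Node client (the auth setup and `folderId` are placeholders, and this is an illustration of the mechanism, not the n8n node itself):

```typescript
import { google, drive_v3 } from 'googleapis';

// Placeholder auth: a service account or OAuth2 client with Drive read scope.
const auth = new google.auth.GoogleAuth({
  scopes: ['https://www.googleapis.com/auth/drive.readonly'],
});
const drive = google.drive({ version: 'v3', auth });

// files.list returns at most 1000 results per page; to get everything,
// keep following nextPageToken until no token comes back.
async function listAllFiles(folderId: string): Promise<drive_v3.Schema$File[]> {
  const files: drive_v3.Schema$File[] = [];
  let pageToken: string | undefined;
  do {
    const res = await drive.files.list({
      q: `'${folderId}' in parents and trashed = false`,
      pageSize: 1000, // the API's per-page maximum
      fields: 'nextPageToken, files(id, name, size)',
      pageToken,
    });
    files.push(...(res.data.files ?? []));
    pageToken = res.data.nextPageToken ?? undefined;
  } while (pageToken);
  return files;
}
```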
Also, if the folder has subfolders, the system gives me an error message and stops.
How do I create a workflow that, after connecting to Google Drive, downloads all the content of a specific folder (including subfolders and their files) and uploads everything in a single step to an FTP location? This is similar to what I am doing with the file list now, but this time including all files and folders/subfolders.
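For what it's worth, outside of n8n the usual pattern is a recursive listing: query each folder for its children, recurse into anything with the folder MIME type, and build up a relative path as you go. A sketch with the `googleapis` client, under the same placeholder-auth assumptions as above (`collectTree` is an illustrative helper, not an n8n node):

```typescript
import { google, drive_v3 } from 'googleapis';

const auth = new google.auth.GoogleAuth({
  scopes: ['https://www.googleapis.com/auth/drive.readonly'],
});
const drive = google.drive({ version: 'v3', auth });

const FOLDER_MIME = 'application/vnd.google-apps.folder';

// Walk a folder tree depth-first, collecting every non-folder file together
// with the relative path it should get on the FTP side.
async function collectTree(
  folderId: string,
  prefix = '',
): Promise<Array<{ id: string; path: string }>> {
  const out: Array<{ id: string; path: string }> = [];
  let pageToken: string | undefined;
  do {
    const res = await drive.files.list({
      q: `'${folderId}' in parents and trashed = false`,
      fields: 'nextPageToken, files(id, name, mimeType)',
      pageSize: 1000, // paginate here too, so large subfolders are not cut off
      pageToken,
    });
    for (const f of res.data.files ?? []) {
      if (f.mimeType === FOLDER_MIME) {
        // Recurse into the subfolder, extending the relative path.
        out.push(...(await collectTree(f.id!, `${prefix}${f.name}/`)));
      } else {
        out.push({ id: f.id!, path: `${prefix}${f.name}` });
      }
    }
    pageToken = res.data.nextPageToken ?? undefined;
  } while (pageToken);
  return out;
}
```

Each collected file could then be fetched with `drive.files.get({ fileId, alt: 'media' })` and pushed to the FTP server, recreating the `path` prefix as directories on the FTP side to preserve the folder structure.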
Interesting use case! If it fails around a certain file count, it could be related to memory and compute resources in general. Are you self-hosting n8n or using our cloud? If the former, you have more control here.
I'm curious as well, not about overcoming the 1000-file limit, but about uploading all documents contained in a Google Drive folder, including files in subfolders, so that all of them can be added to a vector store.