VERY URGENT: Adding a Delay within "Google Drive: Copy File" Module regarding API limits

Hi guys,

This one is really important, as it currently limits my ability to scale this process, which was the only reason I built it in the first place.
Normally, adding a delay to a workflow to match API limits wouldn’t be a tough job, but here the limit seems to be hit inside a single module. Of the entire workflow, basically only the Google Drive Copy File node causes problems: for every row in the Google Sheet it creates a new Google Slides presentation from the template, which is then filled in with values and so on … the rest is not important here.

The problem seems to be that these documents are created too quickly, so even with just 9 rows I was already getting 400 errors about API limits.

Does anyone have an idea how to throttle the requests here, so that it doesn’t copy all the templates within what feels like one second?

[Screenshot of the workflow]

  • n8n version: 1.69.2
  • Database (default: SQLite): SQLite
  • n8n EXECUTIONS_PROCESS setting (default: own, main): own, main
  • Running n8n via (Docker, npm, n8n cloud, desktop app): self-hosted on Google Cloud
  • Operating system: Windows 10

Usually the answer to rate limits is to use the Loop over Items node to split the calls into batches with a pause in between. If you check out the link, you may be able to figure it out from the docs and add that to your workflow - I’ll see if I can find some good examples for you.

Here’s an example - check out the noted area. The loop collects the input and, with the batch size set to ten, passes the first ten items to the (in this case) calendar delete node. When those are done, that triggers the Wait node to add a short delay. When that’s done, it signals the Loop Over Items node to start on the next batch. When the queue is empty, the loop triggers output on the ‘Done’ connection.
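If it helps to see the same pattern as plain code, here is a rough TypeScript sketch of what the Loop Over Items + Wait combination is doing. The batch size of ten and the two-second pause are just example values, and `handler` stands in for whatever call your workflow makes per item:

```typescript
// Sketch of the Loop Over Items + Wait pattern: process items in
// fixed-size batches, pausing between batches to stay under rate limits.
const BATCH_SIZE = 10; // the Batch Size setting on the loop node
const DELAY_MS = 2000; // example pause; tune it to the API's limits

const sleep = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms));

async function processInBatches<T>(
  items: T[],
  handler: (item: T) => Promise<void>, // e.g. the copy or delete call
): Promise<void> {
  for (let i = 0; i < items.length; i += BATCH_SIZE) {
    const batch = items.slice(i, i + BATCH_SIZE); // take the next batch off the queue
    await Promise.all(batch.map(handler));        // the node doing the work on this batch
    if (i + BATCH_SIZE < items.length) {
      await sleep(DELAY_MS);                      // the Wait node between batches
    }
  }
  // Queue empty: this is the point where the 'Done' output fires.
}
```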


@nickv Thanks for the reply.

Well, in my case the Batch Size would be the number of rows in the Google Sheet, so is there an elegant way to do that?

But besides that, would just putting it in front of the Google Drive Copy module do it?

Greetings

Ah - the batch size setting is how many items get processed at a time, not the size of the queue.

Say you have 28 rows coming in as items. The first time they hit the loop node, the first 10 (or whatever batch size you set) get passed out on the loop output and processed. When control returns to the loop node, it takes the next ten off the stack, and so on until they are all gone.
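If the arithmetic helps, here is a tiny sketch (TypeScript, purely for illustration) of how 28 items split up with a batch size of 10:

```typescript
// 28 items with a batch size of 10 come out as batches of 10, 10, and 8.
function batchSizes(total: number, batchSize: number): number[] {
  const sizes: number[] = [];
  for (let remaining = total; remaining > 0; remaining -= batchSize) {
    sizes.push(Math.min(batchSize, remaining));
  }
  return sizes;
}

console.log(batchSizes(28, 10)); // [10, 10, 8]
```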


So I simply set this up in front of the Google Drive Copy File module, select one as the batch size, and it automatically passes just one file per whatever wait I set after the copy? How does it know what is defined as one file?

Yes, that is basically it.

How does it know what is defined as one file?

You’ll find that data like this, passed from previous nodes, is usually in the form of a list. For example:

[
  {
    "filename": "foo-report"
  },
  {
    "filename": "bar-report"
  },
  {
    "filename": "baz-report"
  }
]

This is one output, but it contains a list of three distinct items. Loop Over Items does exactly that: it reads the list, takes ‘n’ items (where n is the batch size you set), and passes them through to the output. Most nodes handle data this way, processing the whole list in one go, and that is what was causing your problem; the Loop node is different in that it doesn’t.
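To make that concrete with the list above: with a batch size of 1, each pass of the loop hands exactly one item onward. A rough sketch of that selection step (again TypeScript, just illustrative):

```typescript
// What the loop does with the list above when the batch size is 1:
// each pass takes the next single item and passes only that one on.
const items = [
  { filename: "foo-report" },
  { filename: "bar-report" },
  { filename: "baz-report" },
];

const batchSize = 1;
for (let i = 0; i < items.length; i += batchSize) {
  const batch = items.slice(i, i + batchSize); // one item per pass
  console.log("passing on:", batch);           // the Wait node then delays before the next pass
}
```

That is also the answer to “what is one file”: one item in the incoming list, so with a batch size of 1 the downstream node sees a single row’s data at a time.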
