Storing data temporarily to avoid hitting API rate limits?

Hey guys,

I have a workflow that makes an API request returning a list of 15 items (per request/execution), which I then add to Google Sheets.

I’m running this request in a loop with different input each time, so every iteration fetches 15 new items and adds them to a Google Sheet. The issue I’m facing is that after around 30 seconds Google Sheets returns an error saying I’ve hit the 60 requests/minute read limit.

The loop runs dozens of times per minute. I know I could use a Wait node, but that doesn’t feel like the most efficient approach: I want the loop to keep running and “accumulate” data, then send it to Google Sheets every 100 items or so instead of every 15.

For this to happen, I need to “store” those items somewhere, keep the loop running and adding more items, then once we reach, say, 100 items, write them to the Google Sheet, clear that “storage”, and repeat (roughly what the sketch below tries to do).
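Something like this in a Code node inside the loop is what I have in mind. It’s only a rough, untested sketch of the idea: `BATCH_SIZE` and the `buffer` key are arbitrary names I made up, and I’m assuming the node is set to “Run Once for All Items”:

```javascript
// Buffer each iteration's 15 items in workflow static data and only
// emit items once ~100 have accumulated.
// Caveat: $getWorkflowStaticData only persists between executions in
// production (non-manual) runs; within a single execution it still
// works as an in-memory accumulator across loop iterations.
const staticData = $getWorkflowStaticData('global');
staticData.buffer = staticData.buffer ?? [];

// Append this iteration's items to the buffer.
staticData.buffer.push(...$input.all().map(item => item.json));

const BATCH_SIZE = 100; // flush threshold (placeholder)

if (staticData.buffer.length < BATCH_SIZE) {
  // Not enough items yet: emit nothing so the Google Sheets node is skipped.
  return [];
}

// Flush: emit all buffered items to the Google Sheets node, then reset.
const batch = staticData.buffer.map(json => ({ json }));
staticData.buffer = [];
return batch;
```

One thing I’d still need to handle is a final flush of whatever is left in the buffer (fewer than 100 items) after the loop’s last iteration.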

Is there a way to do this in n8n? If not, what would you suggest in this case? If I use a Wait node, it just stops the rest of the workflow from running and creates a bottleneck.
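As a fallback, I also wondered whether I could simply let the loop finish, take everything from its “done” output, and group it into chunks of 100 in a Code node, then write each chunk with a single request. Again just a sketch; the chunk size and field names are my own assumptions:

```javascript
// Code node on the loop's "done" output: turn N collected items into
// one output item per chunk of 100 rows.
const all = $input.all();
const BATCH_SIZE = 100; // assumed chunk size

const batches = [];
for (let i = 0; i < all.length; i += BATCH_SIZE) {
  const rows = all
    .slice(i, i + BATCH_SIZE)
    // One array of cell values per row; assumes key order matches the
    // sheet's column order, which may need to be made explicit.
    .map(item => Object.values(item.json));
  batches.push({ json: { values: rows } });
}
return batches;
```

Each output item’s `values` array could then be sent by an HTTP Request node to the Sheets API’s `POST https://sheets.googleapis.com/v4/spreadsheets/<spreadsheetId>/values/<range>:append?valueInputOption=RAW` endpoint (placeholders mine), so one API call writes a whole chunk instead of one row at a time.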

Appreciate your help!

It looks like your topic is missing some important information. Could you provide the following, if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

We’ve created a new category for help with designing workflows, and I’ve moved your question there: Help me Build my Workflow. Find out how this category works by reading this topic.
