Large API call, 3000 records to populate Google sheet daily or more frequently

Hi all,

I was wondering if it is viable to use n8n to pull about 3000 records multiple times a day to populate (and overwrite) a Google Sheet. The idea is to have an auto-updating Google Sheet of our latest CRM data, so that the reports in the other sheet tabs, which are built on the data in the first sheet, also refresh daily.

As I understand it, with Integromat for example, while I can pull 3000 records from the CRM API in one call, I'll need to post each item one by one into the Google Sheet, so that ends up being 3000 or more operations per update, which is not viable.

Would the same issue occur with n8n, or is there a way to dump a large number of records into a Google Sheet at once in one operation? Any advice on how to achieve this would be great. Thanks!

Hi @jay377

Not sure about the Google API.
But n8n would have no issue with that number of records, as long as the amount of data stays within n8n's RAM limits.
I know of an example where 50k items were pushed from one system to another, which worked on n8n Cloud, but that was reaching n8n Cloud's RAM limits.
So 3k records should be easily doable. Just keep in mind not to use too many nodes, as every node the data passes through makes n8n use more RAM.
If you are on a self-hosted instance or the Desktop app, you should have some more headroom to play with large data sets.

Hope this helps.


Hey @jay377,

You should be ok. Looking at the code for the node, if you are doing an update we use the batch method, so it should try to send all the data in one hit.
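To illustrate what "one hit" means here (this is a rough sketch with the googleapis client, not the node's actual implementation; the spreadsheet ID, range, and credentials file are placeholders), all the rows travel in a single write request rather than one request per row:

```typescript
import { google } from "googleapis";

// Authenticate with a service account that has edit access to the sheet.
const auth = new google.auth.GoogleAuth({
  keyFile: "service-account.json", // placeholder credentials file
  scopes: ["https://www.googleapis.com/auth/spreadsheets"],
});

const sheets = google.sheets({ version: "v4", auth });

// Overwrite the sheet with all rows in a single write call,
// instead of appending the 3000 rows one by one.
async function overwriteSheet(rows: string[][]): Promise<void> {
  // Clear the old data first so the sheet is a clean snapshot.
  await sheets.spreadsheets.values.clear({
    spreadsheetId: "YOUR_SPREADSHEET_ID", // placeholder
    range: "CRM Data!A:Z",
  });

  // One values.update request carries every row in its body.
  await sheets.spreadsheets.values.update({
    spreadsheetId: "YOUR_SPREADSHEET_ID", // placeholder
    range: "CRM Data!A1",
    valueInputOption: "RAW",
    requestBody: { values: rows },
  });
}
```

So a full 3k-row refresh is still only a couple of calls against the Sheets API, no matter how many rows are in the batch.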


Great, thanks to both of you, appreciate it. That's great news.

So just to clarify, technically in terms of costs and operations/workflow executions: pulling 3k records from an API and then using the Google Sheets node, I can batch post/save the data into the Google Sheet all in one go, so this will end up being just one operation/execution each time I need to do a 3k-row update? That's awesome.


Hey @jay377,

If you are using Cloud, an execution is counted as a workflow start, so if you start a workflow and it processes 3000 records, it is still only one execution. If you were to use a trigger that launches the workflow once for each record, it would be a very different story.

If you are planning to use Cloud, it would be worth signing up for a trial and giving it a bash, or you could use the Desktop version to build out the workflow just to see how it would work. Then you can export the workflow and import it into the Cloud version (or just copy/paste the nodes).


Ok, that sounds perfect, thanks, will give it a try. Fortunately we have no need to handle individual rows; we only need to regularly dump the entire content of a CRM database (via its API) into a Google Sheet, so this sounds like it will work ideally for us.


If the load is too high and you can make the flow keep track of what has already been processed, you can set up an Interval node.

Every interval triggers the flow, the flow fetches the rows that are still open, and once the insert into Sheets completes it marks those rows as processed.

This way I process 60,000 big JSON requests with a MySQL table in between as a process table.
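A rough sketch of that process-table pattern (not the poster's actual setup; the `process_queue` table, its columns, and the connection details are made up for illustration), using the mysql2 client:

```typescript
import mysql, { RowDataPacket } from "mysql2/promise";

// Hypothetical process table (not from the thread):
// CREATE TABLE process_queue (id INT PRIMARY KEY, payload JSON, status VARCHAR(10));

async function processNextBatch(): Promise<void> {
  const db = await mysql.createConnection({
    host: "localhost",
    user: "n8n",
    password: "secret", // placeholder credentials
    database: "sync",
  });

  // 1. Fetch only the rows that have not been processed yet.
  const [result] = await db.execute(
    "SELECT id, payload FROM process_queue WHERE status = 'open' LIMIT 1000"
  );
  const openRows = result as RowDataPacket[];

  // 2. ...insert this batch into Google Sheets here...

  // 3. Mark the batch as done so the next interval run skips it.
  const ids = openRows.map((row) => row.id);
  if (ids.length > 0) {
    await db.query(
      "UPDATE process_queue SET status = 'done' WHERE id IN (?)",
      [ids]
    );
  }

  await db.end();
}
```

Running something like `processNextBatch` behind an Interval trigger gives the effect described above: each run picks up only the still-open rows and marks them done once the Sheets insert succeeds.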

If you want I can send you an example.

UPDATE: I found Split In Batches and successfully did it. Wonderful!

Hi, I'm new here. My use case is a bit different from yours, but I'm also calling an API and iterating through rows, so I hope to get an answer here. Thanks in advance for any help.

I have a Google Sheets file of “codes”, and an API that I query for info based on each code. I have 8000 rows of data to update quickly.

I also came from Make. In Make I would:

  • Search for all rows, as long as they have a code in there.

  • Iterate through each code, construct the API URL to query, use an HTTP GET call to get the info I want.

  • Update the pieces of info I want from the call’s response.

Now I’m trying to do the same thing in n8n.

  • Read the Sheet
  • Do a HTTP request
  • Lookup the row to update
  • Set the new info
  • Update the actual row

It works, but only for 1 row. Can you tell me how to do this for multiple rows? How do I iterate?
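For illustration only: the UPDATE above mentions the Split In Batches node, which loops over the sheet rows in chunks and runs the HTTP Request and row-update steps for each batch. The same per-row iteration can also be sketched in an n8n Code node; the `code` column name and the API endpoint below are placeholders, not from this thread:

```typescript
// n8n Code node, "Run Once for All Items" mode - rough sketch only.
// Each row read from Google Sheets arrives as one item in `items`.
const results = [];

for (const item of items) {
  const code = item.json.code; // the "code" column from the sheet (placeholder name)

  // Build the API URL per row and fetch the info for that code.
  // this.helpers.httpRequest is n8n's built-in HTTP helper for Code nodes.
  const response = await this.helpers.httpRequest({
    method: "GET",
    url: `https://api.example.com/lookup/${code}`, // placeholder endpoint
    json: true,
  });

  // Keep the code so the row can be matched and updated afterwards.
  results.push({ json: { code, info: response } });
}

return results;
```

Each object returned here becomes one item, so a following Google Sheets node can update every matching row rather than just the first one.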