I was wondering if it would be viable to use n8n to pull about 3,000 records multiple times a day to populate (and overwrite) a Google Sheet. The idea is to have an auto-updating Google Sheet of our latest CRM data, so the reports in the other sheet tabs can also update daily based on the data in the first sheet.
As I understand it, with Integromat for example, while I can pull 3,000 records from the CRM API in one call, I'll need to post each item one by one into the Google Sheet. That ends up being 3,000 or more operations per update, which is not viable.
Would the same issue occur with n8n, or is there a way to dump a large number of records into a Google Sheet in one operation? Any advice on how to achieve this would be great. Thanks!
Not sure about the Google API, but n8n would have no issue with that number of records, as long as the amount of data stays within n8n's RAM limits.
I know of an example where 50k items were pushed from one system to another, which worked on n8n Cloud, though it was close to n8n Cloud's RAM limits.
So 3k records should be easily doable. Just keep in mind not to use too many nodes, as each node makes n8n use more RAM for the data that passes through it.
If you are on a self-hosted instance or Desktop, you should have more headroom to play with large data sets.
You should be OK. Looking at the code for the node, if you are doing an update we use the batch method, so it should send all the data in one hit.
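To illustrate why a batch update stays at one API call, here is a minimal sketch of turning an array of CRM records into the single 2D `values` array that the Google Sheets API's `spreadsheets.values.update` endpoint accepts. The record fields and column names are hypothetical examples, not taken from the thread:

```javascript
// Sketch: how ~3,000 CRM records can travel in a single Sheets API request.
// The column names and sample records below are made up for illustration.
function toSheetValues(records, columns) {
  // First row is the header; each record becomes one row in column order.
  const header = columns;
  const rows = records.map((rec) => columns.map((col) => rec[col] ?? ""));
  return [header, ...rows];
}

// The resulting 2D array is what the Sheets values.update endpoint takes in
// its "values" field, so all rows go out in one HTTP call, e.g.:
//   PUT .../v4/spreadsheets/{id}/values/Sheet1!A1?valueInputOption=RAW
//   body: { "values": toSheetValues(records, columns) }
const records = [
  { id: 1, name: "Ada", stage: "won" },
  { id: 2, name: "Grace", stage: "open" },
];
const values = toSheetValues(records, ["id", "name", "stage"]);
console.log(values.length); // header row + 2 data rows = 3
```

Whether n8n's node uses exactly this endpoint may vary by version, but the shape of the payload is the same idea: one request, many rows.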
Great, thanks to both of you, appreciate it. That's great news.
So just to clarify, in terms of costs and operations/workflow executions: if I pull 3k records from an API and then use the Google Sheets node to batch post/save the data in one go, that will end up being just one operation/execution each time I need to do a 3k-row update? That's awesome.
If you are using Cloud, an execution is counted as a workflow start, so if you start a workflow and it processes 3,000 records, it is still only one execution. If you were to use a trigger that launches the workflow once per record, it would be a very different story.
If you are planning to use Cloud, it would be worth signing up for a trial and giving it a bash, or you could use the Desktop version to build out the workflow just to see how it works, then export the workflow and import it into the Cloud version (or just copy/paste the nodes).
OK, that sounds perfect, thanks; will give it a try. Fortunately we have no need to update individual rows; we only need to regularly dump the entire contents of a CRM database (via its API) into a Google Sheet, so this sounds like it will work ideally for us.
Hi, I'm new here. My use case is a bit different from yours, but I'm also calling an API and iterating through rows, so I hope to get an answer here. Thanks in advance for any help.
I have a Google Sheets file of "codes", and an API that I query for info based on each code. I have 8,000 rows of data to update, and I'd like to do it quickly.
I also came from Make, where I would:
Search for all rows that have a code.
Iterate through each code, construct the API URL to query, and make an HTTP GET call to fetch the info I want.
Update each row with the pieces of info I want from the call's response.
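The three steps above can be sketched as plain JavaScript, roughly what an n8n Code node between a Sheets read and a Sheets update would do. The API base URL is a hypothetical placeholder, and the lookup is stubbed rather than a live HTTP call:

```javascript
// Sketch of the Make steps: filter rows with codes, build the query URL,
// fetch info per code, and merge it back into each row.
// API_BASE is a made-up example endpoint, not a real service.
const API_BASE = "https://api.example.com/items/";

// Step 1: keep only rows that actually have a code.
function rowsWithCodes(rows) {
  return rows.filter((row) => row.code && String(row.code).trim() !== "");
}

// Step 2: construct the API URL for a given code.
function buildUrl(code) {
  return API_BASE + encodeURIComponent(code);
}

// Step 3: fetch info for every code and merge it into the row.
// getInfo would be fetch()/an HTTP Request node in a real run; here it is
// injected so the loop's shape is clear without network access.
async function enrichRows(rows, getInfo) {
  const valid = rowsWithCodes(rows);
  return Promise.all(
    valid.map(async (row) => ({ ...row, ...(await getInfo(buildUrl(row.code))) }))
  );
}

// Example with a stubbed lookup instead of a live HTTP call:
const sheetRows = [{ code: "A1" }, { code: "" }, { code: "B2" }];
enrichRows(sheetRows, async (url) => ({ source: url })).then((out) =>
  console.log(out.length) // 2 rows kept, each enriched
);
```

For 8,000 rows the per-code HTTP calls are the slow part, so batching or limiting concurrency on those requests is where the speedup would come from; the final write back to the sheet can still be one batched update.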