I got the data from a Google Sheet, and I want to copy it into a new Google Docs document.
However, there are up to 60 rows of data. If I use HTTP Request nodes, that would mean 60 nodes.
Is there a better method to insert lots of data into a Google Docs table using only 2–5 nodes?
Thanks.
Not yet. But if I had more than 100 rows of data, it would mean creating 100 separate nodes, which doesn’t seem like the best approach.
For now, I’m using a Code node to aggregate all the data into a single JSON object, and then passing that to an HTTP Request node.
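For reference, here is a minimal sketch of what that aggregation step could look like. In an actual n8n Code node the incoming items come from `$input.all()`; here they are modeled as a plain array so the logic runs standalone. The field names (`rows`, `rowCount`) are illustrative assumptions, not an n8n convention.

```javascript
// Hypothetical sketch of an n8n Code node body: collapse every incoming
// item into ONE item whose json carries all the rows, so a single
// downstream HTTP Request node can send everything in one call.
function aggregateItems(items) {
  const rows = items.map((item) => item.json); // one json object per sheet row
  return [{ json: { rows, rowCount: rows.length } }];
}

// Example input, shaped like items delivered by a Google Sheets node.
const items = [
  { json: { name: "Alice", score: 91 } },
  { json: { name: "Bob", score: 85 } },
  { json: { name: "Carol", score: 78 } },
];
const out = aggregateItems(items);
console.log(out[0].json.rowCount); // 3
```

With this shape, the HTTP Request node only fires once, regardless of how many rows the sheet contains.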
Well, the normal use case is to use a single Google Sheets node to handle any number of rows (within the n8n server’s capacity, of course, but 100 rows is nearly nothing to worry about), even when the number of rows is not known beforehand.
Can you share your current workflow here (using the </> button)?
Maybe just a tiny tweak to the data structure you use will resolve the issue.
It’s hard to guess and suggest a solution without really knowing what you have at hand.
Thanks for your reply!
Here’s a quick overview of my workflow:
1. I read data from a Google Sheet.
2. I perform some data analysis/processing in n8n.
3. I then write the processed results into a table inside a Google Docs document.
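For step 3, a sketch of the single request body that could go into the HTTP Request node, targeting the Docs API `documents.batchUpdate` endpoint. The `insertTable` request shape follows the Docs API; the header row and the insertion index (`1` = start of the document body) are assumptions for illustration.

```javascript
// Hypothetical sketch: build one batchUpdate body that creates a table
// sized to the processed rows, so a single HTTP call suffices.
function buildBatchUpdateBody(rows) {
  const columns = Object.keys(rows[0]).length;
  return {
    requests: [
      {
        insertTable: {
          rows: rows.length + 1, // +1 for a header row
          columns,
          location: { index: 1 }, // assumed: insert at the top of the body
        },
      },
      // Populating the cells needs insertText requests whose indices
      // depend on the table structure after creation; typically you
      // fetch the document again (documents.get) to compute them.
    ],
  };
}

const rows = [
  { name: "Alice", score: 91 },
  { name: "Bob", score: 85 },
];
const body = buildBatchUpdateBody(rows);
console.log(body.requests.length); // 1
```

Note that filling the cells is the fiddly part: cell indices shift as text is inserted, so a second request (or reverse-order insertion) is usually needed.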