Hi, I would like to do a deep copy of a folder on Google Drive. Documents in this template folder can contain links that point to other documents within the same folder.
Once all documents and folders have been copied, I would like to update the relative links in the copied documents so they point to the new documents.
For example, consider the folder structure:
doc1.gdoc has a link that points to doc2.gdoc, and vice versa.
I want to deep-copy the template to a new directory. At the end of this process, I want to update the link inside the new doc1 so that it points to the new doc2 file.
I have already managed to deep-copy the folder, but I don't know how to wait for the copy (which uses recursion) to finish before starting the link replacement, which relies on a set built up progressively from the ids of all copied documents.
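To make the two phases concrete, here is a minimal sketch of the approach in plain Python. The `Node` class and `deep_copy` function are hypothetical stand-ins for the real Google Drive API calls; the point is that the recursion returns the old-id → new-id mapping only once every copy has completed, so the link-update phase naturally starts afterwards with the full table available.

```python
from dataclasses import dataclass, field
from itertools import count

@dataclass
class Node:
    """Hypothetical in-memory model of a Drive file or folder."""
    id: str
    name: str
    is_folder: bool = False
    children: list = field(default_factory=list)

_new_ids = count(1)  # stands in for the ids Drive assigns to copies

def deep_copy(node, id_map):
    """Recursively copy `node`, recording old id -> new id in `id_map`.
    Because the call only returns when the whole subtree is copied,
    anything run after it sees a complete mapping."""
    new = Node(id=f"copy-{next(_new_ids)}", name=node.name, is_folder=node.is_folder)
    id_map[node.id] = new.id
    for child in node.children:
        new.children.append(deep_copy(child, id_map))
    return new

template = Node("f0", "template", True, [Node("d1", "doc1"), Node("d2", "doc2")])
id_map = {}
copy_root = deep_copy(template, id_map)
# Phase 2 (link rewriting) would run here, with the complete id_map in hand.
print(id_map)  # {'f0': 'copy-1', 'd1': 'copy-2', 'd2': 'copy-3'}
```

In a node-based workflow the equivalent is to have the recursion hand its mapping back up (or append to a shared table) and trigger the replacement step only from the top-level call, rather than from inside the branches.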
Here is an extract of the workflow:
The first node is the initial command to indicate the source (template) and a destination folder.
It then lists the files and folders in the source and copies the structure; whenever it encounters a folder, it recurses to copy the subfolder.
The ‘build replacement items’ node waits and builds a set of replacement items:
source document id → replacement document id
This records the link mapping for each document. The problem is that I don’t know how to consolidate the whole replacement table before starting the link update.
The difficulty probably comes from the recursion: I cannot merge results from inside the recursive branches.
Some ideas that could help:
- how can I wait for all node executions to finish before starting a new one (and then use a code node to gather the output of all runs)?
- how can I control the order of node processing, e.g. by adding a node that executes only after all the others?
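For the final link-update step itself, once the replacement table is consolidated it can be a single pass over each copied document's text, substituting every old document id that appears inside a link URL. A minimal sketch, assuming the ids in `id_map` are illustrative placeholders (Drive links embed the id directly, e.g. `https://docs.google.com/document/d/<id>/edit`):

```python
import re

# Hypothetical consolidated replacement table: old document id -> new id.
id_map = {"d1-old-id": "d1-new-id", "d2-old-id": "d2-new-id"}

def rewrite_links(text, id_map):
    """Replace every occurrence of an old document id with its copy's id.
    One compiled alternation handles all ids in a single pass."""
    pattern = re.compile("|".join(re.escape(old) for old in id_map))
    return pattern.sub(lambda m: id_map[m.group(0)], text)

doc1_body = "See https://docs.google.com/document/d/d2-old-id/edit for details."
print(rewrite_links(doc1_body, id_map))
# -> See https://docs.google.com/document/d/d2-new-id/edit for details.
```

This only works once the table is complete, which is exactly why the replacement must be sequenced after the last recursive copy rather than interleaved with it.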