Just out of curiosity, have you tried accessing a file that you created yourself, rather than one that was shared with you? Just trying to check whether that's causing the issue. It's probably not it, but worth a shot.
Can you provide the full message returned by the Google Drive node?
Based on your description, it’s likely that the file you’re attempting to download is too large for this Google Drive node.
If you click the “From Google Drive” title in the “Error details” section below the error message, you will find more information. However, this data is encoded in a buffer, which you can decode using Buffer.from(data).toString('utf-8') in Node.
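For example, a minimal Node snippet for decoding such a buffer (the byte array below is a placeholder, not the actual error payload; substitute the `data` array from your error details):

const bytes = [123, 34, 111, 107, 34, 58, 116, 114, 117, 101, 125]; // placeholder byte array
const decoded = Buffer.from(bytes).toString('utf-8');
console.log(decoded); // prints {"ok":true}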
It conveys the same message as the picture above. The decoded message reads:
{
  "error": {
    "code": 403,
    "message": "This file is too large to be exported.",
    "errors": [
      {
        "message": "This file is too large to be exported.",
        "domain": "global",
        "reason": "exportSizeLimitExceeded"
      }
    ]
  }
}
which is quite self-explanatory.
To overcome this limitation, you can use the Google API directly through the HTTP nodes, as described in the workflow below.
You will also need to make the file publicly accessible.
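As a rough sketch of what the HTTP node does under the hood, assuming the file is a Google Doc shared publicly ("anyone with the link"): the document's direct export URL can be fetched without going through the Drive API's files.export endpoint, which is where the export size limit applies. `FILE_ID` and the output path are placeholders.

function buildExportUrl(fileId, format = 'pdf') {
  // Direct export URL for a publicly accessible Google Doc
  return `https://docs.google.com/document/d/${fileId}/export?format=${format}`;
}

async function downloadDoc(fileId, outPath) {
  const res = await fetch(buildExportUrl(fileId)); // fetch is built in since Node 18
  if (!res.ok) throw new Error(`Export failed: HTTP ${res.status}`);
  require('fs').writeFileSync(outPath, Buffer.from(await res.arrayBuffer()));
}

// downloadDoc('FILE_ID', 'document.pdf');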
Note: this probably does not affect Slides, but for Google Docs and Sheets, since the introduction of tabs in Google Docs, the tab name is added to the first page of the export. This is a known, already-reported bug.
Please mark this answer as the solution if it solves your problem.
Thanks for the great answers. Now, if someone knows how I can set up Docker so that it does not auto-store these files, and how to limit the RAM for the instance so that I can loop over these events without crashing n8n on my private Google Cloud VM, that would be great.
I am currently running it on the free tier with 1 GB of RAM, which makes files of 90 MB and upwards impossible to download without crashing things.