Does the YouTube node use too much Google API quota?


I unexpectedly ran into a YouTube API quota issue yesterday where I completely ran out of my 10,000 daily allowed operations. I’m using the ‘Get Many’ operation to fetch recent uploads to a channel:

Today I started testing why this happened, and I ran this for 5 channels total:


This resulted in 24 videos being found. The total quota used was 500, and I can’t figure out why it would be this many. According to the YouTube Data API (v3) Quota Calculator, all the actions that I think apply here, such as listing a channel or listing videos, cost only 1 quota unit each, so how do I end up at 500 for this operation?

To be clear, the quota use was zero before my test and I ran this only once.

I wonder if the node is doing something unexpected or unnecessary behind the scenes that causes it to burn through the quota this quickly?

It looks like your topic is missing some important information. Could you provide the following if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:
  • n8n version:
    • 1.15.2
  • Database (default: SQLite):
    • default
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
    • Not sure
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
    • docker
  • Operating system:
    • Ubuntu 20.04.6 LTS

Hello, @bartv,
The YouTube API doesn’t actually charge anything; it’s free. But you mentioned not knowing why you hit your limit. It really depends on how many API calls are being made and what information is coming back. Some calls cost 1 unit, others 500 units, and so on. (Pardon me while I look for an article I read earlier.)

I realise that, but you’re limited to a quota of 10,000 ‘points’ per day. My question is why such a seemingly small operation would cost 500 points if all the relevant actions seem to cost only 1 point each; see this link:

GitHub - pauljnorg/youtube-transcript-api: This is a Python API which allows you to get the transcript/subtitles for a given YouTube video. It also works for automatically generated subtitles and does not require an API key nor a headless browser, like other Selenium-based solutions do! This is my fork of an API written by a really talented gentleman. It uses Python, though. But reading through how to use it also helps you understand how to budget your Google API credits with YouTube.

Jonas Depois(sp?), I’m hoping I’m not butchering his name. His code is on both the GitHub and PyPI sites.

Youtube-transcript-api · PyPI. (n.d.). PyPI. Retrieved November 23, 2023 from youtube-transcript-api · PyPI

They probably don’t want it to run away with traffic.
Have you read that one workaround was to have an extra API key lined up in code?

I think this is all unrelated to my core question - I think the YouTube node is consuming WAY more points than it’s supposed to.

I agree. Can you tell which specific API call is costing that many units?

I’m dozing off fast. Have a good night. I’ll be back later today.

Bart I had one more idea for you,
If you don’t mind doing some rewriting in JavaScript, you could use the unofficial YouTube APIs, but here’s the catch: they use pure HTML scraping to get the information. The API looks really clean, but the scraping may go against YouTube’s terms of service.

Update: after digging a bit into the YouTube node’s code I realised it’s using the search.list endpoint, which costs 100 ‘points’ per call, so the total makes sense.
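The arithmetic then lines up exactly with what I saw; a quick sketch (per-call cost taken from Google’s quota calculator, one search.list call per channel assumed):

```python
# Quota cost of my test run, assuming one search.list call per channel.
# search.list costs 100 units per call (per the YouTube API quota calculator).
SEARCH_LIST_COST = 100
CHANNELS = 5  # the number of channels I tested with

total = CHANNELS * SEARCH_LIST_COST
print(total)  # 500, exactly the quota use I observed
```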

HOWEVER - it seems that using search.list is a wasteful approach for this, and I found another solution on Stack Overflow, which first queries the channel’s uploads playlist ID and then fetches all the playlistItems of that playlist. This solution only uses 2 quota points instead of 100.

It does come with some drawbacks, such as not being able to filter based on upload date etc, but that’s easy enough to handle.
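For anyone who finds this later, here’s a minimal sketch of that two-call pattern against the raw REST endpoints. The API key and channel ID are placeholders, and the function names (`uploads_playlist_id`, `recent_uploads`) are just names I picked; inside n8n you’d do the same thing with HTTP Request nodes:

```python
import json
import urllib.parse
import urllib.request

API = "https://www.googleapis.com/youtube/v3"


def _get(endpoint: str, params: dict) -> dict:
    """Issue a GET request to a YouTube Data API v3 endpoint."""
    url = f"{API}/{endpoint}?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def uploads_playlist_id(channel_response: dict) -> str:
    """Pull the 'uploads' playlist ID out of a channels.list response."""
    return (channel_response["items"][0]
            ["contentDetails"]["relatedPlaylists"]["uploads"])


def recent_uploads(channel_id: str, key: str, max_results: int = 25) -> list:
    # Step 1: channels.list with part=contentDetails -- 1 quota unit
    channel = _get("channels", {"part": "contentDetails",
                                "id": channel_id, "key": key})
    playlist_id = uploads_playlist_id(channel)

    # Step 2: playlistItems.list -- 1 quota unit per page (up to 50 items)
    items = _get("playlistItems", {"part": "snippet",
                                   "playlistId": playlist_id,
                                   "maxResults": max_results, "key": key})
    return [it["snippet"]["title"] for it in items["items"]]
```

The channels.list response nests the uploads playlist ID under `items[0].contentDetails.relatedPlaylists.uploads`, and that ID is all playlistItems.list needs, which is why the whole thing stays at 2 units per channel.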

PS: are workflow embeds supposed to have a black background? :thinking:


Good sleuthing! I would love to know how to share my workflows like you do, where they’re openable like that per node config. Mine all paste in as plain JSON text. How do you do that?

My workflows have this quadrille-paper look to them, with dots for the corners of squares instead of lines.

I can see the div holding the workflow just not how to replicate it.

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.