Extract video IDs from a YouTube channel (more than 50)

Hello everyone!
We want to extract the list of videos from a YouTube channel (not my channel), using the YouTube API. Unfortunately, we get a maximum of 50 results out of e.g. 250, and I don’t know how to extend this query so that all videos can be extracted.
(https://www.googleapis.com/youtube/v3/search?key=_my_API_key&channelId=_the_channel_ID_&part=snippet,id&order=date&maxResults=50)
Changing maxResults above 50 does nothing :wink:

Does anyone have an idea?

best from Berlin//Heino

We have self-hosted n8n 1.5.9

It looks like your topic is missing some important information. Could you provide the following if applicable?

  • n8n version:
  • Database (default: SQLite):
  • n8n EXECUTIONS_PROCESS setting (default: own, main):
  • Running n8n via (Docker, npm, n8n cloud, desktop app):
  • Operating system:

Hi @Heino

Are you using the HTTP Request Node or Youtube Node for this?

Can you check if the response contains a nextPageToken or similar?
You can then make use of the pagination option

Hi Ria!
I use the HTTP Request Node, and yes… the response contains a
“nextPageToken”. :wink: I’ll check your link.
thank you from Berlin


Good morning, Ria! :wink:
I am one step further.
https://www.googleapis.com/youtube/v3/search?key=_my_API_key&channelId=_the_channel_ID_&part=snippet,id&order=date&maxResults=50 plus “&pageToken=CGQQAA”

then I get this result:
"kind": "youtube#playlistListResponse",
"etag": "\"XpPGQX4Qk/R3A6jpxuE\"",
"nextPageToken": "CGQQAA",
"prevPageToken": "CDIQAQ",
"pageInfo": {
  "totalResults": 585,
  "resultsPerPage": 50
}

So I would have to loop the query, passing the received nextPageToken as the pageToken parameter, until no nextPageToken is returned?

best from Berlin. //Heino

Hi @Heino

You can use the Pagination option for this in the HTTP request node.
Here’s an example:
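If it helps, here is the same pagination logic as a standalone Python sketch (stdlib only; the function names `collect_video_ids` and `youtube_fetcher` are just my own illustration, and the API key and channel ID are placeholders). The loop keeps requesting pages and following `nextPageToken` until the API stops returning one:

```python
import json
import urllib.parse
import urllib.request

API_URL = "https://www.googleapis.com/youtube/v3/search"

def collect_video_ids(fetch_page):
    """Follow nextPageToken until the API stops returning one.

    fetch_page(page_token) must return one decoded JSON response page.
    """
    video_ids = []
    page_token = None
    while True:
        data = fetch_page(page_token)
        for item in data.get("items", []):
            # search.list can also return channels/playlists; keep videos only
            if item.get("id", {}).get("kind") == "youtube#video":
                video_ids.append(item["id"]["videoId"])
        page_token = data.get("nextPageToken")
        if not page_token:  # no token means this was the last page
            return video_ids

def youtube_fetcher(api_key, channel_id):
    """Build a fetch_page function bound to one API key and channel."""
    def fetch_page(page_token):
        params = {
            "key": api_key, "channelId": channel_id,
            "part": "snippet,id", "order": "date", "maxResults": 50,
        }
        if page_token:
            params["pageToken"] = page_token
        query = urllib.parse.urlencode(params)
        with urllib.request.urlopen(f"{API_URL}?{query}") as resp:
            return json.load(resp)
    return fetch_page
```

`ids = collect_video_ids(youtube_fetcher("MY_KEY", "CHANNEL_ID"))` would then return all video IDs on the channel, not just the first 50 — which is essentially what the HTTP Request node’s pagination option does for you.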


Unfortunately, the YouTube API quota is limited, and the limit is quickly reached with many videos (every page of the query counts against it).
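As a rough estimate of why each page matters (assuming the documented cost of 100 quota units per search.list call and the default daily quota of 10,000 units):

```python
import math

SEARCH_COST = 100     # quota units per search.list call (per API docs)
DAILY_QUOTA = 10_000  # default daily project quota
PAGE_SIZE = 50

videos = 585  # totalResults from the response above
pages = math.ceil(videos / PAGE_SIZE)
cost = pages * SEARCH_COST
print(pages, cost)  # 12 pages, 1200 units: ~12% of the daily quota for one full listing
```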

Actually, the goal is to save the subtitles of all YouTube videos embedded on a website and make them available later to a (local) AI.
So:

  1. find all subpages of the website
  2. find all youtube videos
  3. save all subtitles together with the video meta data in a database.
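Steps 1 and 2 above could be sketched roughly like this (stdlib only; `crawl`, `LinkParser`, and the regex are my own hypothetical helpers, and step 3 — actually fetching subtitles into a database — is not covered here):

```python
import re
import urllib.parse
import urllib.request
from html.parser import HTMLParser

# Matches watch URLs, youtu.be short links, and embed URLs (11-char video IDs).
YOUTUBE_ID_RE = re.compile(
    r"(?:youtube\.com/(?:watch\?v=|embed/)|youtu\.be/)([A-Za-z0-9_-]{11})"
)

class LinkParser(HTMLParser):
    """Collect href attributes from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_video_ids(html):
    """Step 2: pull all unique YouTube video IDs out of one page's HTML."""
    return sorted(set(YOUTUBE_ID_RE.findall(html)))

def crawl(start_url, max_pages=100):
    """Step 1: breadth-first crawl of one site, returning {url: [video IDs]}."""
    host = urllib.parse.urlparse(start_url).netloc
    seen, queue, found = {start_url}, [start_url], {}
    while queue and len(found) < max_pages:
        url = queue.pop(0)
        try:
            with urllib.request.urlopen(url) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load
        found[url] = extract_video_ids(html)
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urllib.parse.urljoin(url, link)
            # stay on the same site, visit each page once
            if urllib.parse.urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return found
```

Each discovered video ID (plus its metadata from the API) could then be written to the database together with the fetched subtitles.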

The route via the YouTube channel was supposed to be a shortcut, but unfortunately it was a dead end.

@Ria, many thanks for your effort!
