YouTube now recommends fewer conspiracies — and less Canadian political content

YouTube's recommendation algorithm appears to be actively steering users away from political content and toward more viral or popular videos, an in-depth CBC analysis of more than 20,000 recommendations shows.

Go looking for videos about Canadian politics on YouTube and you'll find an endless supply of content. Searches for "Andrew Scheer" or "New Democratic Party" yield thousands of videos.

But when it comes to YouTube's suggestions — curated videos automatically picked by artificial intelligence and tailored to suit a user's preferences — it's a different story.

While the recommendation algorithm — which has been accused of pushing conspiracy theories and hate speech in Canada and elsewhere — seems to have cleaned up its act, Canadian political content is scarce in YouTube's recommendations.

CBC used custom software to track which videos are recommended to users after they've watched a video found by searching for some 40 keywords, 22 of which are related to the Canadian federal election (see methodology below). Such recommendations appear to the right of the video being watched, under the "Up next" label, and often play automatically once a video is finished.

Only 2.1 per cent of the recommendations attached to videos found through searches related to Canadian politics were in any way related to the topic. Almost half, or 46.1 per cent, had nothing to do with politics at all.

For example, the most recommended video linked to the 22 search terms related to Canadian politics — such as "Canada election" — was a TED Talk by journalist Jon Ronson about psychopaths. It was recommended 300 times.

Even in the wake of Liberal Leader Justin Trudeau's blackface scandal, only a tiny minority of the videos YouTube's algorithm recommended following searches for Canadian political material were related to Canadian politics or the scandal itself. In a test conducted in the days after the story broke, only six of the 936 recommended videos in CBC's analysis mentioned the blackface controversy — despite thousands of videos about the topic having been uploaded to YouTube worldwide.

But when you search for videos unrelated to politics — like recipes, pop music or video games — most of the videos recommended by YouTube are directly related to those topics.

A history of troubling content

YouTube has been accused in the past of promoting videos pushing conspiracy theories or extremist views. A former Google engineer told Radio-Canada these videos kept viewers engaged with the website longer and encouraged them to watch more videos, earning YouTube more money through ads.

The social media company announced in January that it would change its recommendation system to reduce harmful content and prioritize what it calls authoritative sources. Since then, tests by CBC show that the recommendation algorithm has changed, and extremist content is less likely to be recommended.

Karen Yeung, a professor of law and computer science at the University of Birmingham in the U.K., said it appears YouTube tweaked its recommendation process to avoid accusations of fostering a fragmented political debate during the Canadian election.

But she cautioned it's impossible to be certain that YouTube is diverting viewers away from political content. YouTube does not publicly discuss the details of its algorithms.

And because so many people get their news from YouTube, the lack of relevant recommendations is a form of censorship, Yeung added.

"In doing that, they're actually also impoverishing the debate by not allowing you to access crucial information," she said.

Yeung said YouTube should be more transparent about which videos it recommends to users by, for example, adding a label that tells viewers why a video was brought to their attention.

Anatoliy Gruzd, director of research at Ryerson University's Social Media Lab and an associate professor at the Ted Rogers School of Management, said the results of CBC's test were surprising.

"Many Canadians turn to YouTube for their news and updates," he said, adding that "if somebody started a search with Canadian politics, you would expect that they would be persistent in their recommendations as well."

Gruzd said that YouTube's algorithm is always being tweaked, making it hard to determine why the company is making certain choices — but the company's primary goal is to keep viewers on the platform for as long as possible to expose them to online advertising.

He suggested YouTube might have a shortage of Canadian political content, while its supply of American political content is vast. He said the fact that people are more likely to interact with American content — especially anything related to U.S. President Donald Trump — could explain why non-Canadian political videos seem to be recommended more.

Non-political videos get pertinent recommendations

Guillaume Chaslot is a former Google engineer who spent some of his time there working on YouTube's recommendation algorithm. He said he left Google in 2013 because he was troubled by what he saw as YouTube's unwillingness to prevent its algorithm from exposing users to harmful content. Since then, he's been tracking the platform's video recommendations and calling for more transparency in how social media platforms' algorithms function.

Google has been critical of Chaslot's work analyzing the company's algorithm.

Chaslot said he's seen the effect detected by CBC's analysis while examining videos about the protests in Hong Kong. There, he said, people watching videos about the protests are being steered toward videos that have nothing to do with the current political situation.

"The algorithm's behaviour is really weird on many subjects. It's hard to know for sure what's going on, given that we have so little data on what's going on," Chaslot said. "Globally, we'd need YouTube's own statistics to know for sure. That is data we don't have."

To see if this effect on political videos also extends to YouTube searches on other topics, CBC analyzed recommendations related to categories unrelated to Canadian politics: foreign leaders and international news, conspiracies and relatively uncontroversial topics such as recipes, video games and sports.

The results show that more pertinent videos were recommended for these control group search terms than for political search terms.

These results seem out of step with similar experiments conducted in the past year.

For example, a Radio-Canada investigation of YouTube recommendations linked to the 2017 Quebec mosque shooting showed that conspiracy videos — both those related to the shooting and videos about other conspiracy theories — fared well in recommendations.

A CBC dive into recommendations about the Ontario provincial election last year found that a Canadian conspiracy channel pushing videos about the election had outperformed all other information sources.

A YouTube spokesperson, who would not speak on the record, said that the number of views "harmful" videos have received through algorithmic recommendations has dropped by more than 50 per cent since YouTube announced in January a plan to reduce such content in recommendations. The spokesperson did not say whether the change in recommendations observed between last year's tests and this year's was a result of that strategy.

The spokesperson also said that YouTube "strongly disagrees" with the methodology CBC used to harvest this data, which was based on a program written by Chaslot, adding that Chaslot has "limited insight into our recommendation system" given that he left Google before the system was changed.

To conduct the analysis, CBC simulated a user browsing in incognito mode and accessed YouTube like a first-time user, meaning there would be no watch history associated with the account. While the results don't reflect every user's experience, they do suggest which kinds of videos are being recommended most often.

The YouTube spokesperson objected to this methodology, saying it only reflects what a new account might see and does not take into account other factors that affect recommendations.

"The main reason for this is due to personalization: YouTube recommendations are personalized, while certain research (including Mr. Chaslot's research) which scrapes data logged-out, tend to be run simulating users without any watch history," said the spokesperson.

Chaslot said it's impossible to know for certain if YouTube is pushing irrelevant content in recommendations in order to flush out harmful videos, since so little is known about how its algorithm works.

He added that YouTube's recommendation algorithm is one of the company's most closely guarded secrets, since it drives some 70 per cent of the views on the site.

Chaslot said he thinks it's in YouTube's interest to show users recommendations that have nothing to do with what they're searching for. "The more time you spend on YouTube, the more they can show you ads," he said. "So, if you're looking for videos about a subject that won't keep you on the site, YouTube will want to push you towards topics that will make you spend more time on the site."

He added that YouTube is very different from Google, which is optimized to find users the best results as quickly as possible. "Google's algorithm works with you to help you find what you're looking for. YouTube's algorithm works against you. You have to fight against the algorithm."

Methodology

To see which videos YouTube recommends, CBC came up with two lists of search terms, one about Canadian political topics and another with general topics:

Canadian search terms:

Justin Trudeau, Andrew Scheer, Maxime Bernier, Elizabeth May, Jagmeet Singh, Yves-François Blanchet, Liberal Party of Canada, Conservative Party of Canada, People's Party of Canada, Green Party of Canada, New Democratic Party, Bloc Québécois, Canada election, Canada pipelines, Canada carbon tax, Canada immigration, Canada economy, Canada LGBT, Canada refugees, Canada climate change, Canada election ads, Canada federal election.

General search terms:

Toronto Raptors, Donald Trump, Jair Bolsonaro, Emmanuel Macron, Boris Johnson, Brexit, mass shooting, QAnon, vaccines, crisis actor, pasta recipe, Fortnite, Minecraft, yoga class, Taylor Swift, astrophysics, bike repair, Marvel.

CBC used a modified Python script written by former Google engineer Guillaume Chaslot to fetch data on the videos recommended for those search terms. The searches were done on Sept. 9 and 10, 2019. Another search was done on Sept. 23, after the eruption of the blackface scandal involving Justin Trudeau.

The script automates a YouTube search for a given term and takes the first five videos returned. It then collects the first three recommended videos for each of those, then the first three recommendations for each of those, and so on, five layers deep. A sketch of this crawl appears below.
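The crawl amounts to a small tree traversal. Here is a minimal sketch of that logic in Python; search_youtube() and get_recommendations() are hypothetical helpers standing in for the real script's page scraping, not actual YouTube API calls:

```python
from collections import Counter

BRANCHING = 3   # follow only the first three "Up next" suggestions per video
MAX_DEPTH = 5   # and only five layers of recommendations deep

def crawl(term, search_youtube, get_recommendations):
    """Tally every video reached from one search term's recommendation tree."""
    counts = Counter()
    frontier = search_youtube(term)[:5]  # start from the first five search results
    for _ in range(MAX_DEPTH):
        next_frontier = []
        for video_id in frontier:
            for rec in get_recommendations(video_id)[:BRANCHING]:
                counts[rec] += 1          # count each time a video is suggested
                next_frontier.append(rec)
        frontier = next_frontier
    return counts
```

Videos that recur across many branches and many terms, like the Jon Ronson TED Talk mentioned above, rise to the top of these counts.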

The script behaves like someone accessing YouTube for the first time in incognito mode. As such, YouTube has no other information — such as past viewing habits — to customize suggestions. This is not how most people use the website, so these results don't reflect every person's experience.
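In practice, simulating a first-time user mostly means making every request with no stored cookies or account state. A minimal illustration of the idea, assuming plain HTTP fetches rather than the real script's scraping code:

```python
import requests

# Each fetch runs in a brand-new session, so no cookies or history carry
# over between queries, approximating a first-time visitor in incognito mode.
def fetch_as_new_user(url: str) -> str:
    with requests.Session() as session:  # fresh session: empty cookie jar
        response = session.get(url, timeout=10)
        response.raise_for_status()
        return response.text
```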

CBC reporters found similar results when manually clicking through recommended videos, however.

CBC reporters went through the videos and categorized them as directly pertinent, somewhat pertinent or irrelevant. For example, on a search for "Toronto Raptors," videos about basketball are directly pertinent, videos about other sports are somewhat pertinent and videos about U.S. politics are irrelevant. This was only done for videos that were recommended three or more times for each search term; a sketch of that filtering step follows.
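Assuming the labelled data is stored as one (search_term, video_id, label) row per recommendation occurrence (an assumption of this sketch, not a description of CBC's actual pipeline), the filtering and percentage step could look like:

```python
from collections import Counter

LABELS = ("directly pertinent", "somewhat pertinent", "irrelevant")

def tally(rows, min_count=3):
    """Summarize labels, keeping only videos recommended min_count+ times per term.

    rows: one (search_term, video_id, label) tuple per recommendation occurrence.
    """
    freq = Counter((term, vid) for term, vid, _ in rows)
    kept = [label for term, vid, label in rows if freq[(term, vid)] >= min_count]
    total = len(kept)
    counts = Counter(kept)
    return {label: round(100 * counts[label] / total, 1) for label in LABELS} if total else {}
```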

The resulting data can be accessed here.