As the second most popular social media platform in the world, YouTube frequently attracts criticism. In particular, critics argue that its algorithmic recommendations facilitate radicalization and extremism by sending users down “rabbit holes” of harmful content.
According to a new study published in Science Advances, however, exposure to alternative and extremist video channels on YouTube is not driven by recommendations. Instead, most consumption of these channels on the platform can be attributed to a small group of users with high levels of gender and racial resentment who subscribe to these channels and follow links to their videos.
The study authors caution that these findings do not exonerate the platform. “YouTube’s algorithms may not be recommending alternative and extremist content to nonsubscribers very often, but they are nonetheless hosting it for free and funneling it to subscribers in ways that are of great concern,” says co-author Brendan Nyhan, the James O. Freedman Presidential Professor at Dartmouth.
“The problem of potentially harmful content on YouTube is real,” Nyhan adds. “The challenge is understanding the nature of the problem so we can think about how best to address it.”
In 2019, YouTube announced that changes to its algorithms had reduced watch time of harmful content by 50%, with a 70% decline in watch time by nonsubscribers. These claims had not been independently verified, so the research team set out to determine who watches this type of content and to evaluate the recommendations offered by YouTube’s algorithm.
The research team analyzed more than 1,100 participants’ web browsing data. Participants were recruited from a general population sample of 2,000 people; a group of 1,000 people who had previously expressed high levels of racism and hostile sexism in another survey; and a sample of 1,000 people with high levels of self-reported YouTube use.
Participants who opted in gave informed consent to anonymized tracking of their web browsing in Chrome or Firefox from July to December 2020, with security protocols in place.
Given the challenges of trying to characterize the content of every single video viewed, the researchers focused on the type of YouTube channels people watched. They compiled lists of channels that had been identified as alternative or extreme by journalists and academics and then examined how often a participant visited videos from those channels.
Alternative channels included men’s rights, anti–social justice, and “intellectual dark web” content; extreme channels included white supremacist, alt-right, and other extremist material.
The results showed that exposure to alternative and extremist channels was quite rare among the study groups. Only 15% of people who opted to provide daily browser activity data watched a video from an alternative channel, and only 6% watched a video from an extremist channel.
A majority of viewers of the potentially harmful channels were subscribers to the type of channel in question: 61% subscribers for alternative channels and 55% for extremist channels. Almost all subscribed either to the channel in question or another one like it: 93% for alternative channels and 85% for extremist channels.
Viewing time data showed that a tiny fraction of people accounted for most of the time participants spent watching potentially harmful channels. Specifically, 1.7% of participants accounted for 80% of the time spent on alternative channels, while only 0.6% of participants accounted for 80% of the time spent on extremist channels.
The researchers also found that people who scored high in hostile sexism and racial resentment were more likely to visit videos from alternative and extremist channels.
“What really stands out is the correlation between content subscribers’ prior levels of hostile sexism and more time spent watching videos from alternative and extremist channels,” says Nyhan. “We interpret that relationship as suggesting that people are seeking this content out.”
By contrast, the researchers found that recommendations of alternative and extremist channel videos were very rare and that “rabbit hole”-type events were observed only a handful of times during the study period.
The researchers explain that their findings do not speak to what was happening on YouTube prior to the changes made to the website’s algorithm in 2019; recommendations and viewing patterns during that period may have differed substantially from what the researchers observed in 2020.
Prior to publication in the peer-reviewed journal Science Advances, this work was initially published in an Anti-Defamation League report. It gained national attention when U.S. Rep. Anna Eshoo, D-Calif., cited the report in March 2021 during a virtual joint hearing of the House Energy and Commerce Committee, “Disinformation Nation: Social Media’s Role in Promoting Extremism and Misinformation.” The CEOs of Facebook, Google (which owns YouTube), and Twitter (now X) testified at the hearing, which was widely covered by the media.
Co-authors of the research paper include: Annie Chen at the CUNY Institute for State & Local Governance; Jason Reifler at the University of Exeter, a long-time collaborator with Nyhan; Ronald Robertson at Stanford University; and Christo Wilson at Northeastern University.