Radical bubbles on YouTube? Revisiting algorithmic extremism with personalised recommendations

Authors

  • Mark Ledwich
  • Anna Zaitsev, University of California, Berkeley, School of Information
  • Anton Laukemper

DOI:

https://doi.org/10.5210/fm.v27i12.12552

Keywords:

YouTube, recommendation algorithm, radicalisation, personalisation

Abstract

Radicalisation via algorithmic recommendations on social media is an ongoing concern. Our prior study, Ledwich and Zaitsev (2020), investigated the flow of recommendations presented to anonymous control users with no prior watch history. This study extends our work on the behaviour of the YouTube recommendation algorithm by introducing personalised recommendations via personas: bots with content preferences and watch histories. We extended our prior dataset to several thousand YouTube channels, identified and classified with a machine learning algorithm. Each persona was first shown content corresponding to its preferences; a set of YouTube content was then shown to each persona. The study reveals that YouTube generates moderate filter bubbles for most personas. However, the filter bubble effect is weak for personas who engaged with niche content, such as Conspiracy and QAnon channels. Surprisingly, all political personas, excluding the mainstream media persona, are recommended fewer videos from the mainstream media content category than an anonymous viewer with no personalisation. The study also shows that personalisation has a larger influence on the home page than on the videos recommended in the Up Next feed.
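
To make the persona set-up concrete, the following is a minimal sketch of a persona-style recommendation audit in Python. It is illustrative only: `get_recommendations`, the category list, and the weighting scheme are hypothetical stand-ins, not the YouTube API, the real recommendation algorithm, or the authors' actual data-collection harness. Each persona is warmed up on its preferred category, and the share of recommendations falling inside that category is then compared against an anonymous baseline with no watch history.

```python
import random
from dataclasses import dataclass, field

# Hypothetical content categories, loosely echoing those named in the abstract.
CATEGORIES = ["Mainstream media", "Partisan left", "Partisan right",
              "Conspiracy", "QAnon"]

@dataclass
class Persona:
    """A bot viewer with a content preference and an accumulated watch history."""
    name: str
    preferred_category: str
    watch_history: list = field(default_factory=list)

def get_recommendations(history, n=20):
    """Stand-in recommender: biases results toward categories already in the
    watch history. An empty history models the anonymous control viewer."""
    if not history:
        return random.choices(CATEGORIES, k=n)
    weights = [1 + 3 * history.count(c) / len(history) for c in CATEGORIES]
    return random.choices(CATEGORIES, weights=weights, k=n)

def bubble_share(persona, warmup_videos=50, n_recs=200):
    """Warm the persona up on its preferred content, then measure the
    fraction of recommendations that stay inside its preferred category."""
    persona.watch_history.extend([persona.preferred_category] * warmup_videos)
    recs = get_recommendations(persona.watch_history, n=n_recs)
    return sum(r == persona.preferred_category for r in recs) / n_recs

if __name__ == "__main__":
    # Anonymous baseline: no personalisation at all.
    baseline = get_recommendations([], n=200)
    base_share = {c: baseline.count(c) / len(baseline) for c in CATEGORIES}
    for cat in CATEGORIES:
        p = Persona(name=f"{cat} persona", preferred_category=cat)
        print(f"{cat:18s} persona share: {bubble_share(p):.2f} "
              f"(anonymous baseline: {base_share[cat]:.2f})")
```

In this toy model, the gap between a persona's share and the anonymous baseline plays the role of the filter-bubble effect the study measures; the real study derives it from recommendations collected on the home page and the Up Next feed.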

Published

2022-12-13

How to Cite

Ledwich, M., Zaitsev, A., & Laukemper, A. (2022). Radical bubbles on YouTube? Revisiting algorithmic extremism with personalised recommendations. First Monday, 27(12). https://doi.org/10.5210/fm.v27i12.12552