radicalization

What is the relationship between algorithmic personalization and online radicalization?

From YouTube functioning as “the great radicalizer” (Tufekci, 2018) to the spread of misinformation on Facebook and Twitter (Allcott, Gentzkow, & Yu, 2019), social media algorithms that personalize user content pose a significant concern for American national security. What users click, like, share, read, watch, and comment on serves as input to platforms’ proprietary algorithms, which determine the content those users see next. With these inputs, algorithmic personalization can intensify “reinforcing spirals” (Slater, 2007), in which a user’s media choices shape their interests, which in turn shape their future media choices, and so on, producing a self-reinforcing feedback loop that can lead toward extremism. Studying the interplay between the technological features of personalization algorithms and the psychological attributes and interpretive strategies of users can help us understand how individuals become radicalized online. It is therefore vitally important to identify the technological, psychological, and cultural factors that lead to the radicalization of vulnerable populations on social media through the algorithmic personalization of content.

We define radicalization broadly to capture not only forms of violent religious extremism but also the adoption of extreme political, social, and cultural beliefs that deviate greatly from moderate or “mainstream” views. Our conceptualization is consistent with previous scholarship that treats radicalization as a relative concept consisting of a set of diverse processes (Borum, 2011; Sedgwick, 2010). Similarly, we define extremist content as content that differs significantly from a person’s current or previous views, or from perceived “mainstream” views on an issue. Under these definitions, users can view extreme content or be radicalized on a number of different issues and topics. Our previous work (Le et al., 2019) has shown that algorithmic personalization is higher for populations with extreme ideological beliefs, and there is mounting anecdotal evidence that individuals who feel disenfranchised are particularly vulnerable to becoming radicalized by extremist content on social media (Mak, 2018; Roose, 2019; Turkewitz & Roose, 2018). Therefore, there is an urgent need to identify factors that make one vulnerable to online radicalization and to develop and validate methodologies that can predict the likelihood of becoming radicalized online.

Our research design is innovative, in part, because it takes a mixed-methods approach to studying the relationship between algorithmic personalization and online radicalization. Our research team employs qualitative, quantitative, and computational methodologies, including a longitudinal survey, behavioral data tracking, in-depth interviews, and sock-puppet auditing, to discover the factors that lead to radicalization.

This research is funded by a 3-year grant from the Minerva Research Initiative.


research output

Below is a list of publications from our research on radicalization.

  1. Paying Attention to the Algorithm Behind the Curtain: Bringing Transparency to YouTube’s Demonetization Algorithms
     Arun Dunna, Katherine Keith, Ethan Zuckerman, Narseo Vallina-Rodriguez, Brendan O’Connor, and Rishab Nithyanand
     ACM Conference on Computer Supported Cooperative Work (CSCW), 2022
  2. Pathways to Radical Misogyny: How Participation, Interaction, and Perception in Online Communities Increase Radical Behavior
     Hussam Habib, Padmini Srinivasan, and Rishab Nithyanand
     ACM Conference on Computer Supported Cooperative Work (CSCW), 2022
  3. Exploring the Magnitude and Effects of Media Influence on Reddit
     Hussam Habib and Rishab Nithyanand
     International AAAI Conference on Web and Social Media (ICWSM), 2022
  4. Are Proactive Interventions for Reddit Communities Feasible?
     Hussam Habib, Maaz Bin Musa, Fareed Zaffar, and Rishab Nithyanand
     International AAAI Conference on Web and Social Media (ICWSM), 2022
  5. Relationships Among Vaccination Attitudes, Social Media Use, and Activist vs. Radical Behavior
     Ryan Stoldt, Andrew High, Ashley Peterson, Kathryn Biddle, Raven Maragh-Lloyd, Rishab Nithyanand, Brian Ekdale, Timothy Havens, Hussam Habib, and John Thiede
     72nd Annual International Communication Association Conference (ICA), 2022