Alt-right pipeline

Graphic of interactions between mostly right-wing personalities on YouTube from January 2017 to April 2018. Each line indicates a shared appearance in a YouTube video, allowing audiences of one personality to discover another.[1]

The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics. It posits that this interaction takes place due to the interconnected nature of political commentators and online communities, allowing members of one audience or community to discover more extreme groups.[1][2] This process is most commonly associated with, and has been documented on, the video platform YouTube, and it is largely driven by the way recommendation algorithms on social media platforms operate: by repeatedly suggesting content similar to what users already engage with, they can quickly lead users down rabbit holes.[2][3][4]
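
A minimal, purely illustrative sketch of this feedback loop is given below. It is not YouTube's actual recommender; the catalogue, the one-dimensional "extremeness" axis, the engagement bias, and every parameter value are assumptions chosen only to show how repeatedly recommending similar content and updating on the resulting clicks can drift a user's interest profile toward more extreme material.

```python
import random

random.seed(1)

# Hypothetical catalogue: each item is a position on a 0.0-1.0 axis,
# where higher values stand in for more extreme content.
catalog = [i / 1000 for i in range(1001)]

def recommend(profile, k=20):
    """Return the k catalogue items closest to the user's current interest profile."""
    return sorted(catalog, key=lambda item: abs(item - profile))[:k]

profile = 0.15  # the user starts on mildly provocative content
for step in range(1, 101):
    candidates = recommend(profile)
    # Engagement bias (assumption): the more provocative candidate wins the click most of the time.
    clicked = max(candidates) if random.random() < 0.7 else random.choice(candidates)
    # The profile tracks what was just watched, closing the feedback loop.
    profile = 0.5 * profile + 0.5 * clicked
    if step % 25 == 0:
        print(f"after {step} recommendations, interest profile is at {profile:.2f}")
```

Under these toy assumptions the printed profile climbs steadily away from its starting point, even though each individual recommendation is only a small step from the user's current interests.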

Many political movements have been associated with the pipeline concept. The intellectual dark web,[2] libertarianism,[5] the men's rights movement,[6] and the alt-lite movement[2] have all been identified as possibly introducing audiences to alt-right ideas. Audiences that seek out and are willing to accept extreme content in this fashion typically consist of young men, commonly those who experience significant loneliness and seek belonging or meaning.[7] In their search for community and belonging, such users often turn to message boards rife with hard-right social commentary, such as 4chan and 8chan, which are well documented as playing an important role in the radicalization process.[8]

The alt-right pipeline may be a contributing factor to domestic terrorism.[9][10] Many social media platforms have acknowledged this path of radicalization and have taken measures to prevent it, including the removal of extremist figures and rules against hate speech and misinformation.[3][7] Left-wing movements, such as BreadTube, also oppose the alt-right pipeline and "seek to create a 'leftist pipeline' as a counterforce to the alt-right pipeline."[11]

The effect of YouTube's algorithmic bias in radicalizing users has been replicated by one study,[2][12][13][14] although two other studies found little or no evidence of a radicalization process.[3][15][16]

  1. ^ a b Lewis, Rebecca (September 2018). Alternative Influence: Broadcasting the Reactionary Right on YouTube (Report). Data & Society Research Institute.
  2. ^ a b c d e Ribeiro, Manoel Horta; Ottoni, Raphael; West, Robert; Almeida, Virgílio A. F.; Meira, Wagner (27 January 2020). "Auditing radicalization pathways on YouTube". Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. pp. 131–141. doi:10.1145/3351095.3372879. ISBN 9781450369367. S2CID 201316434.
  3. ^ a b c Ledwich, Mark; Zaitsev, Anna (26 February 2020). "Algorithmic extremism: Examining YouTube's rabbit hole of radicalization". First Monday. arXiv:1912.11211. doi:10.5210/fm.v25i3.10419. ISSN 1396-0466. S2CID 209460683. Archived from the original on 28 October 2022. Retrieved 28 October 2022.
  4. ^ "Mozilla Investigation: YouTube Algorithm Recommends Videos that Violate the Platform's Very Own Policies". Mozilla Foundation. 7 July 2021. Archived from the original on 25 March 2023. Retrieved 25 March 2023.
  5. ^ Hermansson, Patrik; Lawrence, David; Mulhall, Joe; Murdoch, Simon (2020). The International Alt-Right: Fascism for the 21st Century?. Routledge.
  6. ^ Mamié, Robin; Horta Ribeiro, Manoel; West, Robert (2021). "Are Anti-Feminist Communities Gateways to the Far Right? Evidence from Reddit and YouTube". WebSci '21: Proceedings of the 13th ACM Web Science Conference.
  7. ^ a b Roose, Kevin (8 June 2019). "The Making of a YouTube Radical". The New York Times.
  8. ^ Hughes, Terwyn (26 January 2021). "Canada's alt-right pipeline". The Pigeon. Archived from the original on 25 March 2023. Retrieved 25 March 2023.
  9. ^ Piazza, James A. (2 January 2022). "Fake news: the effects of social media disinformation on domestic terrorism". Dynamics of Asymmetric Conflict. 15 (1): 55–77. doi:10.1080/17467586.2021.1895263. ISSN 1746-7586. S2CID 233679934. Archived from the original on 25 July 2023. Retrieved 4 November 2022.
  10. ^ Munn, Luke (June 2019). "Alt-right pipeline: Individual journeys to extremism online". First Monday. 24 (6).
  11. ^ Cotter, Kelley (2022). "Practical knowledge of algorithms: The case of BreadTube". New Media & Society.
  12. ^ Lomas, Natasha (28 January 2020). "Study of YouTube comments finds evidence of radicalization effect". TechCrunch. Retrieved 17 July 2021.
  13. ^ Newton, Casey (28 August 2019). "YouTube may push users to more radical views over time, a new paper argues". The Verge. Archived from the original on 27 July 2023. Retrieved 17 July 2021.
  14. ^ Ribeiro, Manoel Horta; Ottoni, Raphael; West, Robert; Almeida, Virgílio A. F.; Meira, Wagner (22 August 2019). "Auditing Radicalization Pathways on YouTube". arXiv:1908.08313 [cs.CY].
  15. ^ Hosseinmardi, Homa; Ghasemian, Amir; Clauset, Aaron; Mobius, Markus; Rothschild, David M.; Watts, Duncan J. (2 August 2021). "Examining the consumption of radical content on YouTube". Proceedings of the National Academy of Sciences. 118 (32). arXiv:2011.12843. Bibcode:2021PNAS..11801967H. doi:10.1073/pnas.2101967118. PMC 8364190. PMID 34341121.
  16. ^ Chen, Annie Y.; Nyhan, Brendan; Reifler, Jason; Robertson, Ronald E.; Wilson, Christo (22 April 2022). "Subscriptions and external links help drive resentful users to alternative and extremist YouTube videos". arXiv:2204.10921 [cs.SI].
