YouTube radical content recommendations: a meta-analysis
DOI: https://doi.org/10.26577/HJ.2023.v67.i1.07
Abstract
YouTube's recommendation algorithms have been the subject of controversy and debate, as there have been documented incidents of the recommendation system surfacing hazardous, most often extremist, content to users. The purpose of this study is to test the hypothesis that YouTube recommendations can contain and promote radical content.
The scientific and practical significance of this work stems from YouTube's enormous popularity: knowledge about how its recommendation system works can improve users' information literacy and thus help avert the potential harm caused by exposure to malicious information. This study also seeks to popularize the principle that all information on the Internet requires thorough verification. By improving users' information literacy, this research contributes globally to the prevention of terrorist attacks, acts of self-harm, suicide, pedophilia, etc.
The sources for the present investigation, each describing at least one type of radical content, were published within the last five years. Relevant records were extracted from authoritative scientific and bibliometric databases, namely Google Scholar, Scopus, Web of Science, and PubMed. A total of 22 studies were pooled according to the eligibility and exclusion criteria, and a meta-analysis was performed: in 13 studies YouTube recommendations contained and promoted hazardous content, 7 studies reported ambiguous results, and in only 2 studies did the authors find that the recommendations neither contained nor promoted radical content.
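To make the pooled counts concrete, here is a minimal sketch of the tallying arithmetic behind the figures above. The outcome labels, the wilson_ci helper, and the choice of a Wilson 95% confidence interval are illustrative assumptions introduced here, not part of the study's published protocol.

```python
from math import sqrt

# Outcome counts reported in this meta-analysis (22 pooled studies).
counts = {
    "promoted hazardous content": 13,
    "ambiguous results": 7,
    "no radical content found": 2,
}
n = sum(counts.values())  # 22

def wilson_ci(successes: int, total: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a proportion (illustrative choice of method)."""
    p = successes / total
    denom = 1 + z ** 2 / total
    centre = (p + z ** 2 / (2 * total)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / total + z ** 2 / (4 * total ** 2))
    return centre - half, centre + half

for outcome, k in counts.items():
    low, high = wilson_ci(k, n)
    print(f"{outcome}: {k}/{n} = {k / n:.1%} (95% CI {low:.1%}-{high:.1%})")
```

Running the sketch reproduces the headline share: 13/22, or roughly 59%, of the pooled studies found that recommendations contained and promoted hazardous content.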
Thus, this research found that YouTube recommendations can contain and disseminate prohibited hazardous content. The authors therefore strongly recommend adjusting the recommendation algorithms to protect users from prohibited information, including the promotion of extremism and violence.
Keywords: YouTube, recommendations, recommendation system, hazardous content, pseudoscientific content, radical content, extremism, pedophilia, meta-analysis.
References
1. Abdrashev, R.M. (2016). Protivodejstvie internet-propagande jekstremizma v Respublike Kazahstan [Counteraction to extremism propaganda on the Internet in the Republic of Kazakhstan]. Bulletin of the Siberian Law Institute of the Ministry of Internal Affairs of Russia, 1 (22): 58-62.
2. Abul-Fottouh, D., Song, M.Y., & Gruzd, A. (2020). Examining algorithmic biases in YouTube’s recommendations of vaccine videos. International Journal of Medical Informatics, 140: 104175. https://doi.org/10.1016/j.ijmedinf.2020.104175
3. Alfano, M., Fard, A.E., Carter, J.A., Clutton, P., & Klein, C. (2021). Technologically scaffolded atypical cognition: The case of YouTube's recommender system. Synthese, 199 (1): 835-858.
4. Allcott, H., Braghieri, L., Eichmeyer, S., & Gentzkow, M. (2020). The welfare effects of social media. American Economic Review, 110 (3): 629-676. https://doi.org/10.1257/aer.20190658
5. AVAAZ (2020). Why is YouTube broadcasting climate misinformation to millions? [Report]. Avaaz. https://secure.avaaz.org/campaign/en/youtube_climate_misinformation/
6. Chen, A., Nyhan, B., Reifler, J., Robertson, R., & Wilson, C. (2022). Exposure to Alternative & Extremist Content on YouTube [Report]. Anti-Defamation League. https://www.adl.org/resources/reports/exposure-to-alternative-extremist-content-on-youtube
7. Courtois, C., & Timmermans, E. (2018). Cracking the Tinder Code: An Experience Sampling Approach to the Dynamics and Impact of Platform Governing Algorithms. Journal of Computer-Mediated Communication, 23 (1): 1-16. https://doi.org/10.1093/jcmc/zmx001
8. Faddoul, M., Chaslot, G., & Farid, H. (2020). A longitudinal analysis of YouTube's promotion of conspiracy videos. arXiv preprint arXiv:2003.03318. https://doi.org/10.48550/arXiv.2003.03318
9. Fano, A.N., et al. (2022). Evaluation of YouTube as a Source of Information Regarding Syndactyly. Pediatrics, 149 (1): 779.
10. Green, S.J. (2019). 'God told me he was a lizard': Seattle man accused of killing his brother with a sword. The Seattle Times. https://www.seattletimes.com/seattle-news/crime/god-told-me-he-was-a-lizard-seattle-man-accused-of-killing-his-brother-with-a-sword/
11. Hosseinmardi, H. et al. (2020). Evaluating the scale, growth, and origins of right-wing echo chambers on YouTube. arXiv preprint arXiv:2011.12843. https://doi.org/10.48550/arXiv.2011.12843
12. Hussein, E., Juneja, P., & Mitra, T. (2020). Measuring Misinformation in Video Search Platforms: An Audit Study on YouTube. Proceedings of the ACM on Human-Computer Interaction, 4 (CSCW1): 1-27. https://doi.org/10.1145/3392854
13. Kaakinen, M., Oksanen, A., & Räsänen, P. (2018). Did the risk of exposure to online hate increase after the November 2015 Paris attacks? A group relations approach. Computers in Human Behavior, 78: 90-97. https://doi.org/10.1016/j.chb.2017.09.022
14. Kaiser, J., & Rauchfleisch, A. (2020). Birds of a feather get recommended together: Algorithmic homophily in YouTube's channel recommendations in the United States and Germany. Social Media + Society, 6 (4): 2056305120969914. https://doi.org/10.1177/2056305120969914
15. Kaiser, J., & Rauchfleisch, A. (2019). The implications of venturing down the rabbit hole. Internet Policy Review, 8 (2): 1-22.
16. Kaiser, J., Rauchfleisch, A., & Cordova, Y. (2021). Comparative Approaches to Mis/Disinformation | Fighting Zika with Honey: An Analysis of YouTube's Video Recommendations on Brazilian YouTube. International Journal of Communication, 15: 1244-1262.
17. Kassymbekova, N.M., & Shyngyssova, N.T. (2022). Rol' social'nyh setej v formirovanii obshhestvennogo mnenija [The role of social networks in public opinion shaping]. Herald of Journalism, 3 (65): 69. https://doi.org/10.26577/HJ.2022.v65.i3.07
18. Ledwich, M., & Zaitsev, A. (2019). Algorithmic extremism: Examining YouTube's rabbit hole of radicalization. arXiv preprint arXiv:1912.11211. https://doi.org/10.48550/arXiv.1912.11211
19. Ledwich, M., Zaitsev, A., & Laukemper, A. (2022). Radical bubbles on YouTube? Revisiting algorithmic extremism with personalised recommendations. First Monday, 27 (12). https://doi.org/10.5210/fm.v27i12.12552
20. Müller, K., & Schwarz, C. (2021). Fanning the Flames of Hate: Social Media and Hate Crime. Journal of the European Economic Association, 19 (4): 2131–2167. https://doi.org/10.1093/jeea/jvaa045
21. Munger, K., & Phillips, J. (2022). Right-wing YouTube: A supply and demand perspective. The International Journal of Press/Politics, 27 (1): 186-219. https://doi.org/10.1177/1940161220964767
22. Nickles, M.A., Rustad, A.M., Ogbuefi, N., McKenney, J.E., & Stout, M. (2022). What's being recommended to patients on social media? A cross-sectional analysis of acne treatments on YouTube. Journal of the American Academy of Dermatology, 86 (4): 920-923. https://doi.org/10.1016/j.jaad.2021.03.053
23. Nienierza, A., Reinemann, C., Fawzi, N., Riesmeyer, C., & Neumann, K. (2021). Too dark to see? Explaining adolescents’ contact with online extremism and their ability to recognize it. Information, Communication & Society, 24 (9): 1229-1246. https://doi.org/10.1080/1369118X.2019.1697339
24. Papadamou, K. et al. (2020). Disturbed YouTube for kids: Characterizing and detecting inappropriate videos targeting young children. Proceedings of the International AAAI Conference on Web and Social Media, 14: 522-533. https://doi.org/10.1609/icwsm.v14i1.7320
25. Papadamou, K. et al. (2021). “How over is it?” Understanding the Incel Community on YouTube. Proceedings of the ACM on Human-Computer Interaction, 5 (CSCW2): 1-25. https://doi.org/10.1145/3479556
26. Papadamou, K. et al. (2022). “It is just a flu”: Assessing the Effect of Watch History on YouTube’s Pseudoscientific Video Recommendations. Proceedings of the International AAAI Conference on Web and Social Media, 16: 723-734. https://doi.org/10.1609/icwsm.v16i1.19329
27. Ribeiro, M.H., Ottoni, R., West, R., Almeida, V.A.F., & Meira, W. (2020). Auditing Radicalization Pathways on YouTube. Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency: 131-141. https://doi.org/10.1145/3351095.3372879
28. Röchert, D., Weitzel, M., & Ross, B. (2020). The homogeneity of right-wing populist and radical content in YouTube recommendations. International Conference on Social Media and Society: 245-254. https://doi.org/10.1145/3400806.3400835
29. Roose, K. (2019). The making of a YouTube radical. The New York Times, 8. https://rhet104.commacafe.org/wp-content/uploads/2021/05/Making-of-a-YouTube-Radical.pdf
30. Schaub, M., & Morisi, D. (2020). Voter mobilisation in the echo chamber: Broadband internet and the rise of populism in Europe. European Journal of Political Research, 59 (4): 752-773. https://doi.org/10.1111/1475-6765.12373
31. Schmitt, J.B., Rieger, D., Rutkowski, O., & Ernst, J. (2018). Counter-messages as prevention or promotion of extremism?! The potential role of YouTube recommendation algorithms. Journal of Communication, 68 (4): 780-808. https://doi.org/10.1093/joc/jqy029
32. Spinelli, L., & Crovella, M. (2020). How YouTube leads privacy-seeking users away from reliable information. Adjunct Publication of the 28th ACM Conference on User Modeling, Adaptation and Personalization: 244-251. https://doi.org/10.1145/3386392.3399566
33. Stöcker, C., & Preuss, M. (2020). Riding the wave of misclassification: how we end up with extreme YouTube content. International Conference on Human-Computer Interaction: 359-375. https://doi.org/10.1007/978-3-030-49570-1_25
34. Williams, M.L., Burnap, P., Javed, A., Liu, H., & Ozalp, S. (2020). Hate in the machine: Anti-Black and anti-Muslim social media posts as predictors of offline racially and religiously aggravated crime. The British Journal of Criminology, 60 (1): 93-117. https://doi.org/10.1093/bjc/azz049
35. Yessenbekova, U., Aldabergenova, Z., Mamankul, A., Smailova, B., & Tolegenova, S. (2022). Äleumettik medianyñ jastar belsendiligine äseri [The impact of social networks on youth activity]. Herald of Journalism, 63 (1): 53. https://doi.org/10.26577/HJ.2022.v63.i1.06