
How to prevent radicalization at sites like YouTube? Changing recommendations

A study by researchers from UPF and Eurecat on how to prevent the recommendation of extreme content receives the award for best publication at The Web Conference, the main international conference on the World Wide Web and one of the leading venues for data analysis.

23.05.2022


Platform recommendation algorithms tend to recommend increasingly radical content. UPF and Eurecat researchers have created a system that prevents the recommendation route from leading to ever more extreme content. The study was published in the proceedings of The Web Conference 2022 and was awarded the prize for best conference publication.

“The tendency of algorithms to recommend increasingly radical and extreme content was already known; what we propose here is a fair way to prevent such content from being ‘discovered’ so quickly”, explains Francesco Fabbri, a UPF doctoral researcher at the Eurecat Technology Centre of Catalonia. His work is supervised by the Web Science and Social Computing research group at the Department of Information and Communication Technologies (DTIC), directed by the researcher Carlos Castillo, and by Francesco Bonchi of the Big Data and Data Science group at Eurecat. The publication also involved researchers from the University of Helsinki (Finland) and East China University.

The motivation for the study is a result published in 2020, in which researchers from the Polytechnic School of Lausanne (Switzerland), the University of Minas Gerais (Brazil) and Harvard University (USA) studied a database of over 330,000 YouTube videos and found that the recommendations made by the platform’s algorithm tend to lead visitors towards increasingly radical content.

“Search systems exaggerate the differences that already exist; they do not invent new biases, but if left uncontrolled there is a risk of getting trapped in a loop of radical content”, Castillo explains. “Such content would not be selected if the user could choose from a menu of different options”.

“In this paper, we show how representing sequences of recommendations as a probabilistic network of content may help characterize and mitigate the bias perpetuated by the algorithm”, Fabbri clarifies. “What we show is how to prevent such radicalization without censoring the content”.
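To make the idea of a probabilistic network of content concrete, here is a minimal illustrative sketch (our own, not the paper's code or data): content items become the nodes of a Markov chain whose transition probabilities stand in for the platform's recommendations, and a random walk estimates how quickly a viewer who starts on a neutral item reaches a radical one. The catalogue, the labels and the probabilities are all invented for the example.

```python
# Illustrative sketch only (invented catalogue and numbers, not the paper's data):
# content items are nodes of a Markov chain; transition[i, j] is the probability
# that item j is recommended and followed after item i.
import numpy as np

transition = np.array([
    [0.0, 0.6, 0.3, 0.1, 0.0],
    [0.2, 0.0, 0.4, 0.3, 0.1],
    [0.1, 0.2, 0.0, 0.4, 0.3],
    [0.0, 0.1, 0.1, 0.0, 0.8],
    [0.0, 0.0, 0.1, 0.9, 0.0],
])
radical = {3, 4}  # hypothetical labels for the two most extreme items

def prob_reach_radical(P, radical, start, steps):
    """Probability that a random walk from `start` hits a radical item within
    `steps` recommendation hops; radical items are treated as absorbing states."""
    Q = P.copy()
    for r in radical:
        Q[r, :] = 0.0
        Q[r, r] = 1.0            # once reached, the walk stays there
    dist = np.zeros(len(P))
    dist[start] = 1.0
    for _ in range(steps):
        dist = dist @ Q
    return dist[sorted(radical)].sum()

# Starting from a neutral item, radical content is "discovered" very quickly.
print(prob_reach_radical(transition, radical, start=0, steps=5))
```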

Hence, the researchers followed the path of content recommended by the algorithm and made targeted interventions to divert that path by a single node at a time.

These small changes allowed the researchers to reach an “end of the road” that does not contain the most radical content. “What we are proposing is that the platform should not take you to such content any differently from how it would take you to any other content”.
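The intervention itself can be sketched in the same toy setting (again, a hedged illustration rather than the algorithm published in the paper): among the recommendation edges leaving non-radical items, find the single edge whose redirection to a harmless alternative lowers the expected exposure the most, and apply only that small, local change.

```python
# Hedged sketch of the "rewiring" idea (not the paper's algorithm): redirect a single
# recommendation edge so the path no longer funnels towards the radical item, and
# check that the expected exposure drops. No content is deleted, only re-linked.
import numpy as np

def exposure(P, radical, start, steps):
    """Probability of hitting a radical item within `steps` hops (radical = absorbing)."""
    Q = P.copy()
    for r in radical:
        Q[r, :] = 0.0
        Q[r, r] = 1.0
    dist = np.zeros(len(P))
    dist[start] = 1.0
    for _ in range(steps):
        dist = dist @ Q
    return dist[sorted(radical)].sum()

def rewire_one_edge(P, radical, start, steps):
    """Try moving the probability mass of each single edge (safe item -> radical item)
    onto a safe alternative, and keep the one change that lowers exposure the most."""
    best_score, best_P = exposure(P, radical, start, steps), P
    safe = [i for i in range(len(P)) if i not in radical]
    for i in safe:
        for j in radical:
            if P[i, j] == 0.0:
                continue
            for k in safe:
                if k == i:
                    continue
                Q = P.copy()
                Q[i, k] += Q[i, j]   # one recommendation slot is redirected
                Q[i, j] = 0.0
                score = exposure(Q, radical, start, steps)
                if score < best_score:
                    best_score, best_P = score, Q
    return best_score, best_P

# Tiny example: item 2 is the radical "sink" that recommendation paths tend to end in.
P = np.array([[0.0, 0.6, 0.4],
              [0.3, 0.0, 0.7],
              [0.0, 0.0, 1.0]])
before = exposure(P, {2}, start=0, steps=4)
after, _ = rewire_one_edge(P, {2}, start=0, steps=4)
print(f"exposure before: {before:.2f}, after rewiring one edge: {after:.2f}")
```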

And why does this happen? Why does the algorithm behave this way? “It is very difficult for an algorithm to know whether content will or will not be liked”, Fabbri explains. “No recommendation system is perfect, and recommendations are based on a series of weak signals previously left by users, such as the time a user takes to follow a recommendation. The cumulative effect of recommendations takes care of the rest.” For example, heavy use by extremist users pushes the algorithm in that direction: because the actions of this kind of user are concentrated on this content, the algorithm sees that people move in these circles and stay longer, “thinks” that the content is more suitable, and recommends it.
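The feedback loop Fabbri describes can be imitated with a toy simulation (entirely hypothetical numbers, not the study's model): a recommender that ranks items by accumulated engagement, fed by a small but very active group that watches one kind of content much longer, drifts towards that content on its own.

```python
# Toy simulation of the feedback loop (hypothetical numbers, not the study's model):
# a recommender that ranks items by accumulated engagement drifts towards whatever a
# small, highly engaged group of users watches the longest.
import random

items = ["news", "hobby", "extreme"]
engagement = {item: 1.0 for item in items}   # prior "weak signal" counts, all equal

def recommend():
    """Pick an item with probability proportional to its accumulated engagement."""
    weights = [engagement[i] for i in items]
    return random.choices(items, weights=weights)[0]

random.seed(0)
for _ in range(5000):
    item = recommend()
    watch_time = 1.0
    # A small extremist cluster watches "extreme" content much longer,
    # feeding a far stronger signal back to the recommender.
    if item == "extreme" and random.random() < 0.2:
        watch_time = 5.0
    engagement[item] += watch_time

total = sum(engagement.values())
print({i: round(engagement[i] / total, 2) for i in items})
# The share of "extreme" recommendations ends up well above its initial third.
```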

“Hence, our proposal to ‘rewire’ recommendations might be highly useful to avoid user radicalization and not lead people towards such extremist content”, Fabbri concludes. The idea is not to censor this content, but rather to prevent the algorithms from giving it unwarranted, exaggerated visibility, which is what tends to happen.

This publication was awarded the prize for best conference article, which is all the more meritorious given that it is a paper led by a student. “In addition, the conference is highly selective”, Castillo explains, “as over 1,800 papers were submitted this year, of which just over 300 were accepted”.

SDG - Sustainable Development Goals:

09. Industry, innovation and infrastructure
12. Responsible consumption and production

News published by:

Communication Office