Would you like to donate your CV to science? It will be used to investigate and correct the biases of personnel selection applications


The curricula will be used by the FINDHR (Fairness and Intersectional Non-Discrimination in Human Recommendation) project, driven by an international consortium of universities, companies and associations, led by UPF.

21.07.2023


A research team led by UPF has just launched a call to collect resumes from citizens, which are needed to conduct research into how to promote fair and non-discriminatory algorithms in the technological systems used in personnel selection processes. The research is part of the European project FINDHR (Fairness and Intersectional Non-Discrimination in Human Recommendation), promoted by an international consortium of universities, companies and associations, and coordinated by the Web Science and Social Computing (WSSC) research group at the UPF Department of Information and Communication Technologies (DTIC).

Resumes can be donated anonymously

This three-year research project was launched at the end of 2022 in the light of the increasing use by companies of automated personnel search and selection systems, which are based on algorithms and artificial intelligence (AI). The implementation of these automated systems may lead to discriminatory treatment on grounds of origin, ethnicity, gender identity, sexual orientation or other social factors.

To detect and correct these biases, the research team needs CVs in different languages, such as Catalan or Spanish, in addition to English. CVs can be donated anonymously, that is, including professional data but omitting the name or any other information that could identify the donor. These real CVs will serve as a basis for preparing fictitious ones, which will be used to develop methods to prevent bias in personnel recommendation applications. The research team aims to collect at least 1,000 resumes by autumn; so far, 100 donations have been made.

Carlos Castillo, principal investigator of FINDHR (UPF): “There is so much need for quantitative information on discrimination that some organizations have been asking for these data to be collected ethically for years. These data will open new lines of investigation that have been closed until now”

Regarding the need for data collection in this field, the principal investigator of the research and director of the WSSC at UPF, Carlos Castillo, explains: “There is so much need for quantitative information on discrimination that some organizations have been asking for these data to be collected ethically for years. These data will open new lines of investigation that have been closed until now”.

Through this project, it will be possible to study the technical, legal and ethical problems involved in the use of AI technologies in personnel search and selection processes. Solutions will thus be proposed to manage and minimize the risks of discriminatory applications used to obtain recommendations about people, especially in the field of personnel selection. Such applications may also serve other purposes: university admissions; the prioritization of subsidies, scholarships or other public grants; or online systems that recommend service providers, medical professionals, language tutors and freelancers.

Carlos Castillo clarifies that, beyond correcting the technology's bias, the research will also take into account all of the factors involved in hiring: "We wish to build less biased artificial intelligence systems, but this is not just a technological challenge; it is also a legal and ethical one, linked to a context in which AI applications are increasingly used in personnel selection".

An interdisciplinary project

The FINDHR international consortium is made up of partners with expertise in the following areas: legislation and data protection, cross-cultural digital ethics, digital services auditing, and technological regulation. NGOs dedicated to combating discrimination against women and the most vulnerable groups of society, as well as workers' representatives in Europe, are also collaborating. In addition to UPF, the members of the consortium are: the University of Amsterdam, the University of Pisa, the Max Planck Society on Privacy and Security in Germany, AlgorithmWatch Switzerland, ÈTIQUES Research and Consulting, Adevinta Spain (InfoJobs), Randstad Netherlands BV, Radboud University, Erasmus University Rotterdam, the European Trade Union Confederation, Women in Development in Europe (WIDE+) and the PRAKSIS Association. The project is funded by the EU's Horizon Europe programme and supported by the Swiss State Secretariat for Education, Research and Innovation.

Personnel selection applications, classified as high-risk by the new European AI law

To develop its research, FINDHR's international consortium has taken as a reference the new European AI law, initially approved by the European Parliament last June. The regulation distinguishes between uses of AI tools according to their level of risk to society (unacceptable, high, limited or minimal) with regard to security, health or fundamental rights. Under this regulation, systems that make inferences to categorize, classify or recommend people are classified as high-risk applications. In addition, analysing the risks of AI discrimination requires processing sensitive data that are protected by the EU's General Data Protection Regulation (GDPR). The FINDHR project will take this regulation into account, making a specific analysis of the tensions between data protection and anti-discrimination regulations in Europe.

To contribute to the CV donation campaign, you can access:
https://findhr.eu/datadonation/


SDG - Sustainable Development Goals:

08. Decent work and economic growth


News published by:

Communication Office