3. Kaleidoscope

Artificial intelligence, an opportunity to mitigate gender biases

Emília Gómez

Emília Gómez, lecturer in the Department of Information and Communication Technologies at UPF

Artificial intelligence (AI) systems can be understood as machines capable of observing their environment and taking actions to achieve a certain goal (Craglia et al., 2018). My research has focused on machine-learning (ML) methods, a subfield of AI in which I have developed systems that observe big musical data (and human annotation of those data) to find patterns or perform classifications.

Today, such methods are applied in various contexts that we use every day, such as Internet search engines, music recommendation systems or mobility and navigation apps. They are also used, and will be increasingly used in future, in professional contexts, such as medical diagnoses or judicial decisions.

As part of the European Commission's Joint Research Centre's HUMAINT project, which I lead and on which several researchers from the Department of Information and Communication Technologies (DTIC) at UPF collaborate, we study the impact that AI systems have, and will have, on human behaviour, mainly on our cognitive and socio-emotional capabilities. One of our first conclusions was that the teams developing AI systems need to be diverse if these technologies are to be meaningful for everyone.

According to Reuters (2017), the percentage of women in technical roles at the main AI companies is only around 20%. The main problem this poses is that when developers create their systems, they incorporate – often unconsciously – their own biases and preferences in the various stages of development (Tolan, 2018). As a result, AI systems tend to reflect the perspectives of their predominantly male developers.

Let’s look at some real-life practical examples of gender biases in AI systems:

  1. Several studies have found that voice and speech recognition systems perform worse for women than for men (Tatman, 2016; Times, 2011; Roger & Pendharkar, 2003; Nicol et al., 2002).

  2. Facial recognition systems have also been found to return more errors with female faces (Buolamwini & Gebru, 2018).

  3. Recruitment tools based on text mining can also inherit gender biases from the data used to train them.

  4. Internet search engines, which virtually everyone uses, can yield gender biases as well. If we type "work" or "go shopping" into image search engines, we usually find more photos of men for the former and more of women for the latter. That is a reflection of the societal stereotypes present in the data and annotations.

  5. Gender bias is also very important in sensitive applications such as those related to health or criminal justice (Tolan et al., 2018b). Those fields have traditionally had manuals and very robust methodologies to help people make these important and sensitive decisions in a fair way. Mechanisms thus need to be established to enable AI systems to do this too, in keeping with engineering best practices.
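To see how a text-mining recruitment tool can pick up such biases without ever being shown gender explicitly, consider the following minimal sketch. The CVs, word-counting approach and scoring rule are all invented for illustration; real systems use far more sophisticated models, but the mechanism is the same: if past hiring decisions were biased, words associated with women's CVs end up predicting rejection.

```python
# Hypothetical sketch: a naive CV scorer that learns from biased
# historical hiring data. All data below is invented for illustration.
from collections import Counter

# Training data: CVs of past hires (historically mostly men)
# and past rejections. Note the word "women's" appears only in
# the rejected pile, purely as an artefact of past bias.
past_hires = [
    "captain chess club software engineer",
    "football team lead software engineer",
    "chess club software engineer",
]
rejected = [
    "captain women's chess club software engineer",
    "women's football team software engineer",
]

hired_counts = Counter(w for cv in past_hires for w in cv.split())
rejected_counts = Counter(w for cv in rejected for w in cv.split())

def score(cv):
    """Words seen in past hires raise the score; words seen in
    rejected CVs lower it. Gender is never an explicit feature."""
    return sum(hired_counts[w] - rejected_counts[w] for w in cv.split())

# Two CVs identical except for one word:
cv_a = "captain chess club software engineer"
cv_b = "captain women's chess club software engineer"
print(score(cv_a), score(cv_b))  # prints: 4 2 — the second CV scores lower
```

The model penalises the second CV solely because the word "women's" correlates with rejection in the biased training data, which is essentially the failure mode reported in real text-mining recruitment tools.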

Steps to improve gender diversity

Clearly, we need to improve gender diversity in the development of AI. We should focus on three ways of doing so.

First, we have to monitor diversity trends and assess the impact of policies and initiatives in this regard. In this context, the HUMAINT project and the DTIC at UPF are collaborating on the divinAI initiative to measure how diverse the main AI conferences are. As part of this initiative, the HUMAINT project and the DTIC will be holding a hackfest on the Poblenou campus on 1 June.

Second, we have to give more visibility to the women already working in research, and in technology in particular, in order to increase the impact of their work. One example of such efforts is the Wisibilízalas contest, targeted at children, teachers and parents, which aims to raise awareness of the importance of women in technological fields and to provide good role models for future generations.

Third, we need mentoring programmes such as the one run by the Women in Music Information Retrieval Group, which I participate in, to ensure that women in this field do not abandon their research careers at the point of moving into more senior positions, as too often happens today.

We all have conscious and unconscious biases when we talk to or about women in the field of technology. Artificial intelligence has the potential to overcome this problem, but it can also inherit and perpetuate biases. We need more women in AI to make sure that systems are developed with women and for women’s welfare.