March 6, 2023

How gender discrimination is reproduced in algorithms

Illustration: Pablo Blasberg.

An algorithm that learned to associate images of household appliances with women, and another that, when asked about prominent figures or suitable candidates for certain professions, only offers profiles of men: two examples that help us think about how gender-based discrimination is reproduced in the social and historical data used to train artificial intelligence (AI), whose impacts on the lives of women and LGBTIQ+ people were addressed by specialists in dialogue with Télam.

That AI is increasingly present in everyday life is nothing new; what is new is its ability to respond to increasingly complex demands, such as generating academic texts, images or artistic productions that are almost indistinguishable from what a human being could produce.

This is achieved through machine learning techniques, with which the AI is trained to emulate behaviors and to generate predictions or recommendations from large amounts of data, with varying degrees of autonomy.

The problem is that if the participation of women and LGBTIQ+ people in certain areas of life never existed, or is not recognized, and there are therefore no data that account for it, the AI will most likely reproduce those disparities by default.

This occurs because “data are often thought of as something objective, forgetting that they are the product of subjective and asymmetrical social relations,” explained Victoria Dumas, director of the Data Science and Artificial Intelligence team at the Sadosky Foundation.

In dialogue with Télam, she stressed the need for “training and awareness of biases (gender, class, race) for the people in charge of designing and programming AI algorithms,” since they will only be able to develop systems that do not discriminate “if they are able to identify and master their own unconscious biases.”

It is not surprising, then, that when a chatbot is asked for the names of prominent figures in philosophy, for example, the response is a list made up only of men, given that “statistically, women philosophers are underrepresented in the training data,” she noted.
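The mechanism is easy to see in miniature. The sketch below (with invented mention counts, purely for illustration) shows how a toy system that names philosophers in proportion to how often they appear in its training corpus will almost never surface the underrepresented ones:

```python
from collections import Counter
import random

# Hypothetical mention counts in a training corpus -- invented numbers,
# purely to illustrate statistical underrepresentation.
corpus_mentions = Counter({
    "Kant": 900, "Hegel": 850, "Aristotle": 800,    # heavily cited men
    "Hannah Arendt": 60, "Simone de Beauvoir": 40,  # underrepresented women
})

# A toy "chatbot" that samples answers in proportion to corpus frequency:
names, weights = zip(*corpus_mentions.items())
answers = random.choices(names, weights=weights, k=10)
print(answers)  # mostly men, mirroring the corpus rather than the field
```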

This is particularly problematic given the growing use of these “smart” responses in different settings, including education and work.

Some time ago, Amazon had to withdraw recruitment software that discriminated against women applying for its technical positions, since it had been trained on “the profiles of applicants for (those) jobs in recent years, the majority of whom were men.”
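The same dynamic can be reproduced with a few lines of code. The following is a minimal sketch on synthetic data, not Amazon's actual system: a standard classifier trained on “historical” hiring decisions that favored men ends up scoring two equally skilled candidates differently by gender:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
gender = rng.integers(0, 2, n)   # 0 = woman, 1 = man (toy encoding)
skill = rng.normal(0, 1, n)      # the only legitimate signal

# "Historical" labels: past recruiters favored men, so the hired/not-hired
# outcome depends on gender as well as skill -- the bias is in the data.
hired = ((skill + 1.5 * gender + rng.normal(0, 0.5, n)) > 1.2).astype(int)

model = LogisticRegression().fit(np.column_stack([gender, skill]), hired)

# Two candidates with identical skill who differ only in gender:
p_woman = model.predict_proba([[0, 1.0]])[0, 1]
p_man = model.predict_proba([[1, 1.0]])[0, 1]
print(f"P(hire | woman, skill=1.0) = {p_woman:.2f}")
print(f"P(hire | man,   skill=1.0) = {p_man:.2f}")
```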

“AI models know how we are and how we were as a society, but they don’t know where we want to go, and this is precisely where we need to take action,” the expert pointed out.

Photo: 123RF.

For her part, Ivana Feldfeber, co-founder and director of the DataGénero Data Observatory, said that one of the “big concerns” in this field is that more and more governments “implement AI algorithms that affect our lives, and we know nothing about it.”

Operating largely unnoticed, these systems are used “to make decisions that directly affect people,” for example to determine who receives loans, medical care or social assistance, or even to weigh a legal sentence or the risk of recidivism, she said.

The “Not My AI” project (“No mi IA” in Spanish) developed a map of “problematic projects” that use AI in Latin America; Argentina registered at least three, two of them related to video surveillance.

There is also the Technological Platform for Social Intervention, a program that the government of Salta and the tech giant Microsoft promoted in 2018, which aspired to predict adolescent pregnancy through artificial intelligence.

“Beyond how stigmatizing the proposal is, collecting data only on the girls and not on the people who get them pregnant, the question is what is done afterwards with data that touch on such a complex problem,” Feldfeber objected, emphasizing that all such projects seek is “to mark” social groups through AI.

Despite these complexities, the specialists highlighted the opportunities that AI, approached from a gender perspective, can offer the struggles of women and LGBTIQ+ people.

Examples of this are chatbots designed to receive complaints of gender violence and thus speed up court processing times, or Themis, a proofreader that warns about sexist wording and proposes inclusive alternatives.
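Themis’s actual lexicon and rules are not detailed here, but the general idea of such a proofreader can be sketched in a few lines; the flagged terms and suggested alternatives below are hypothetical stand-ins, not its real word list:

```python
import re

# Hypothetical term-to-alternative lexicon, for illustration only.
ALTERNATIVES = {
    "chairman": "chairperson",
    "mankind": "humankind",
    "policeman": "police officer",
}

def review(text: str) -> list[str]:
    """Return a warning, with a suggested alternative, for each flagged term."""
    warnings = []
    for term, suggestion in ALTERNATIVES.items():
        for match in re.finditer(rf"\b{re.escape(term)}\b", text, re.IGNORECASE):
            warnings.append(
                f"'{match.group()}' at position {match.start()}: "
                f"consider '{suggestion}'"
            )
    return warnings

print(review("The chairman spoke for all mankind."))
# ["'chairman' at position 4: consider 'chairperson'",
#  "'mankind' at position 27: consider 'humankind'"]
```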

So is AymurAI, an AI-based program that aims to help judiciaries collect and make available anonymized data on gender-based violence, developed by DataGénero together with Mexican and Swedish colleagues.

Consulted by Télam, Yasmin Quiroga, a lawyer and member of the observatory, explained that the initiative arose from the lack of unified data on gender violence in the country and, in turn, from the low level of trust in the Argentine judiciary, which the latest Latinobarómetro survey put at 16%.

In cases of gender violence, this translates into “low levels of reporting, mistrust in the justice system and less access” to it, she said.

Within that framework, AymurAI was conceived as a way to speed up the collection of data on gender violence by automating its registration, in a way that allows “understanding gender violence and supporting the development of public policies,” its creators explained.
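AymurAI’s real pipeline is not described in detail here, but a minimal sketch can illustrate the kind of step being automated: pulling structured fields out of a ruling and redacting personal names before the record is made public. The field names and sample text below are hypothetical:

```python
import re

def anonymize(ruling: str, names: list[str]) -> str:
    """Replace every known personal name with a placeholder."""
    for name in names:
        ruling = re.sub(re.escape(name), "[REDACTED]", ruling)
    return ruling

def extract_record(ruling: str) -> dict:
    """Pull out the fields a public gender-violence dataset might need."""
    record = {}
    for field, pattern in {
        "date": r"Date:\s*([\d/]+)",
        "offense": r"Offense:\s*(.+)",
    }.items():
        match = re.search(pattern, ruling)
        record[field] = match.group(1).strip() if match else None
    return record

ruling = "Date: 06/03/2023\nOffense: threats\nThe accused, Juan Perez, ..."
print(extract_record(ruling))        # {'date': '06/03/2023', 'offense': 'threats'}
print(anonymize(ruling, ["Juan Perez"]))
```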

“Data have a transformative capacity because they allow us to know people’s reality. They are concrete evidence, even if they are not objective,” Quiroga said.

“These data already exist and are in great demand by society; it’s a matter of starting to work with them and publishing them,” said the creators of AymurAI, which is already up and running and available “for those who want to implement it.”

And they concluded: “We do not expect these tools to solve the problem of gender violence, but we do believe that they have a lot to contribute.”


