March 6, 2023, 12:32 PM
Since it launched at the end of 2022, the ChatGPT artificial intelligence program has not ceased to be a topic of conversation, both among those who admire this technological advance and those who fear its repercussions.
Much of the debate has centered on the possible uses of this intelligent chatbot, which is capable of answering almost any question a user asks and producing texts that look as if they were written by a human.
Will students use it to do their homework? Will leaders use it to write their speeches? Could it even have written this article you are reading?
Beyond the great concern about whether this artificial intelligence (AI) program will put out of work millions of people whose jobs involve tasks the machine can perform in a matter of seconds, another controversy has to do with copyright.
ChatGPT uses information obtained mainly from the internet. But in general it does not cite its sources, which has led to accusations of plagiarism and, already, to legal complaints.
But behind the noise this innovation has generated, and the progress it represents for technologies that use AI, lies another, much less well-known controversy.
It has to do with the hundreds of thousands of workers, many of them low-income, without whom AI systems like ChatGPT would not exist.
We’re not talking about the programmers who design the algorithms, who often work in places like Silicon Valley and earn good salaries.
We are talking about the “hidden workforce,” as it has been called by the non-profit Partnership on AI (PAI), which brings together academic, civil society, media and industry organizations involved with AI.
Who makes up this hidden workforce? People subcontracted by large technology companies, generally in poor countries of the Southern Hemisphere, to “train” AI systems.
Taggers
These men and women perform a task that is tedious, and potentially harmful to mental health, as we will see later, but essential for programs like ChatGPT to work.
It consists of labeling millions of pieces of data and images to teach the AI how to act.
Take, for example, the chatbot that is causing such a sensation.
When you ask ChatGPT a question, the program uses some 175 billion “parameters,” or variables, to decide what to answer.
As already mentioned, this AI system uses information obtained from the internet as its main source. But how does it distinguish between contents? Thanks to these references, which are “taught” to it by human beings.
“There is nothing smart about artificial intelligence. It has to learn as it is trained,” Enrique García, co-founder and manager of DignifAI, an American company based in Colombia that is dedicated to hiring these data labelers, explains to BBC Mundo.
These professionals, better known as “data labelers,” identify information such as text, images and videos, and tell the program what is what, so that the machine can understand each item and learn in which context to use it.
In the tech industry, this type of task is called “data enrichment.”
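To make the idea concrete, here is a minimal sketch, in Python, of what labeled data might look like once annotators have done their work. The field names and labels are purely illustrative, not any company's real schema: the point is simply that each raw example gets a human-assigned tag that a model can later learn from.

```python
# Illustrative sketch of "data enrichment" output: human annotators attach
# a label to each raw example so a model can learn from the pairing.
# Field names ("text", "label") and label values are hypothetical.

labeled_examples = [
    {"text": "The cat sat on the mat.", "label": "safe"},
    {"text": "[violent content removed]", "label": "toxic"},
    {"text": "What a lovely morning!", "label": "safe"},
]

def label_counts(examples):
    """Count how many examples carry each label."""
    counts = {}
    for ex in examples:
        counts[ex["label"]] = counts.get(ex["label"], 0) + 1
    return counts

print(label_counts(labeled_examples))  # {'safe': 2, 'toxic': 1}
```

At scale, a dataset like this contains millions of such entries, each one produced by a person making a judgment call, which is precisely the work described in this article.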
But ironically, despite being essential work for AI development, data enrichers make up the poorest link in the production chain of large technology companies.
A fact that the Partnership on AI itself has acknowledged.
“Despite the critical role these data enrichment professionals play, a growing body of research reveals the precarious working conditions faced by these workers,” the organization said.
“This could be the result of efforts to hide AI’s reliance on this large workforce while celebrating the efficiency gains achieved by the technology. Out of sight is also out of mind,” denounced the coalition, which also includes OpenAI, the company that created ChatGPT.
Less than $2 an hour
A TIME magazine investigation revealed that many of the data taggers outsourced by OpenAI to train its ChatGPT received wages of between $1.32 and $2 an hour.
According to a report by journalist Billy Perrigo, the technology company, which counts Microsoft among its main investors, contracted out the data enrichment work through an outsourcing firm called Sama, based in San Francisco, which hired workers in Kenya for that project.
In a statement, an OpenAI spokesperson said that Sama was responsible for managing the salaries and working conditions of the taggers hired to work on the ChatGPT program.
“Our mission is to ensure that AI benefits all of humanity, and we work hard to build safe and useful AI systems that limit bias and harmful content,” the spokesperson said.
Sama, which also hires taggers in other low-income countries such as Uganda and India for clients like Google and Meta (owner of Facebook), presents itself as an “ethical AI” company and claims to have lifted more than 50,000 people out of poverty.
However, Martha Dark, director of the British activist organization Foxglove, whose goal is to “stand up to tech giants and governments, for a future where technology is used to benefit everyone, not just the rich and powerful,” told BBC Mundo that big technology companies use outsourcing to pay these workers far less than they deserve.
“All of these companies are multi-billion dollar companies and it’s frankly inappropriate that they’re paying $2 an hour to the people who make these platforms possible,” she said.
But for Enrique García, from DignifAI, the controversy over salaries “is a matter of perspective.”
In Europe and the United States that may be understood as not enough to live on, he observes, but in other countries it can represent a good salary.
“A lot of people criticize our industry over the issue of pay, but at DignifAI our wage floor is $2.30 an hour, and that represents 1.8 times the minimum wage in Colombia,” he points out.
“If the project is more complex and requires expert annotators, such as architects or doctors, the pay can go up to $25 an hour,” he says.
Although he acknowledges that there are some companies that pay below the minimum wage, he considers it unfair to focus only on this sector.
“There are outsourcing dynamics in many industries, not only this one, so it’s also not fair to label us as ‘digital sweatshops,’” he says.
Social impact
García also points out that several companies in the sector, like his own, aim to have a social impact, with the objective of “increasing people’s productivity and dignity.”
In fact, the motto of DignifAI, which is supported by various aid agencies, is: “Outsourcing dignity through artificial intelligence.”
The company is based in Cúcuta, on the border between Colombia and Venezuela, and its mission is to give work to Venezuelan migrants who cross into the neighboring country, as well as to internally displaced Colombians.
“Many of them, before working with us, were earning 4 or 5 dollars a day. For this vulnerable population, which has no options in the labor market, earning 1.8 times the Colombian minimum wage is quite attractive,” he says.
Ingrid, a 42-year-old Venezuelan who arrived in Colombia at the end of 2018, can attest to that.
The education graduate, who preferred not to give her last name, told BBC Mundo that she has not been able to teach, as she did in her country of origin, because she has not yet been able to validate her degree, and said that working as a labeler for DignifAI has allowed her to earn a living and also to train in another profession.
“I work four hours a day and have been able to spend the remaining time taking a graphic design course,” she said of her next professional project.
Although she no longer works as a labeler, because she was promoted to the position of project supervisor, she has no hesitation in recommending the job.
“It is more profitable, less exhausting and better paid than being a waitress, a part-time assistant or doing physical work,” she observes, noting that most of her colleagues are housewives, street vendors or students.
Mental health
Beyond pay, another controversial issue surrounding data taggers is the effect of the job on their mental health.
It is not the tediousness of the task that worries some experts, although that is another criticism made of this work, but rather the toxic material to which some annotators are exposed.
Part of these trainers’ job is to teach the AI program what information is not suitable for publication.
To do this, some of them, though not Ingrid, must delve into the darkest corners of the internet and catalog the vast wealth of violent, sinister and perverse material that resides there, in order to teach the machine to ignore the rotten side of the great network of networks.
But according to Foxglove’s Martha Dark, getting this vital job done “can cause post-traumatic stress disorder and other mental health problems in many workers.”
Her organization represents a former Sama employee who worked as a Facebook moderator in Kenya and in 2022 sued both Sama and Meta, owner of the social network, for the psychological damage he suffered, a case that is still before the courts in Nairobi.
“These jobs come at a mental health cost for those who do them, and these workers should receive adequate psychiatric care as well as a fairer salary,” Dark told BBC Mundo.
According to the activist, the big technology companies have plenty of financial resources to provide this type of assistance, but they don’t do it because “they put profits above the safety of their workers.”
Enrique García acknowledges that large companies could invest more in hiring taggers, but he affirms that putting too many demands on them could make them decide to look for annotators elsewhere.
“It may be that Big Tech could pay more, but we are very grateful for the opportunities,” he says.
“Either we get defensive about the stinginess of the client or we accept the opportunities that are out there, which pay above the legal minimum,” he says.
“At least we’re bringing income generation opportunities here where, without this alternative, there wouldn’t be any.”