A woman's place is in the home – at least according to AI...
Far from being impartial, not-so-smart algorithms reinforce gender, racial and class prejudice picked up from humans
A balding man in jeans and glasses is standing at the kitchen stove, stirring something with a wooden spoon. Artificial intelligence draws on its vast wealth of knowledge to label this image "kitchen, stove, wooden spoon, woman" – the inference being that if someone's cooking, it's a woman.
Researchers at the University of Virginia have released a report confirming a frequent criticism – that far from avoiding human prejudices, artificial intelligence compounds them.
The investigation focused on two image banks commonly used to train computers to process images. In 33% of the photos of people cooking, the cooks were men. Yet after being trained on those images, the software labeled 84% of the cooks as women and only 16% as men.
"We know that technology fed on big data can amplify prejudice because the prejudice is implicit in the data," according to the report. The research shows that if gender discrimination exists in the original data, predictive technology will identify and amplify it.
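To make those figures concrete, here is a minimal, purely illustrative Python sketch (not the researchers' actual code; the label lists are hypothetical stand-ins for the study's percentages) of how such bias amplification can be measured, by comparing the gender split in a training set with the split in a model's predictions.

```python
# Illustrative sketch only: quantifying how a model can amplify a gender
# skew already present in its training data. The label lists below are
# hypothetical stand-ins for the percentages reported in the study.

def woman_share(labels):
    """Return the fraction of 'woman' labels in a list of gender labels."""
    return sum(1 for g in labels if g == "woman") / len(labels)

# Training images of people cooking: 67% labeled as women, 33% as men.
training_labels = ["woman"] * 67 + ["man"] * 33
# The trained model's predictions on similar images: 84% women, 16% men.
predicted_labels = ["woman"] * 84 + ["man"] * 16

train = woman_share(training_labels)
pred = woman_share(predicted_labels)

print(f"Training data: {train:.0%} women")                    # 67% women
print(f"Model output:  {pred:.0%} women")                     # 84% women
print(f"Amplification: +{(pred - train) * 100:.0f} points")   # +17 points
```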
"Technology fed on big data can at times amplify prejudice because it is implicit in the data" – University of Virginia researchers
Take the case of Tay, the AI chatbot designed by Microsoft that was plugged into Twitter so it would learn from other users. In less than 24 hours it was withdrawn after it began to mimic the worst of human bias in its tweets, denying the Holocaust, abusing other Twitter users and defending Donald Trump's wall.
There are many such examples of learning machines merely amplifying our prejudices, despite the promise that AI would eliminate human bias from the decision-making process.
Google, for example, labeled black people as gorillas, and Google Maps searches for "nigger house" pointed to the White House during Obama's presidency. Similarly, black Flickr users were classified as chimpanzees, while Apple's intelligent personal assistant, the all-knowing Siri, is struck dumb when the phone's owner mentions rape.
The list goes on. Nikon's camera software warns the photographer that someone has blinked when it is in fact looking at an Asian face. HP webcams can identify and track white faces but not brown or black ones. And the first beauty contest judged by a computer picked just one dark-skinned winner against 43 white ones. In America, Amazon has failed to target African American neighborhoods in its top promotional campaigns, and Facebook allows its advertisers to exclude ethnic minorities while letting them target people identified as anti-Semitic and young people flagged as vulnerable and depressed.
Google began to label black people as gorillas and Flickr classified them as chimpanzees
"Promising efficiency and impartiality, algorithms distort education, increase debt, trigger mass imprisonment, hit the poor in every way possible and undermine democracy," says Cathy O'Neil, the Harvard PhD and data scientist who wrote Weapons of Math Destruction, which explores the disastrous impact algorithms have had on society. "Going to university, asking for a loan, being imprisoned or looking for and finding work – all these areas of our lives are increasingly controlled by secret programs that deal out arbitrary punishment," she says.
As O'Neil points out, algorithmic bias can be far more dangerous and far-reaching than human bias. Journalists from the investigative journalism organization ProPublica confirmed this some months ago when they discovered that a risk-assessment program used by the US justice system to assess defendants was notably racist. African American defendants were twice as likely as their white counterparts to be labeled as potential reoffenders, and were consequently treated more severely by the penal system. Meanwhile, white defendants who did go on to reoffend were more likely to have been labeled low risk. Citizens, and of course prisoners, have no idea that their future is being decided by a flawed program that is as racist as the most bigoted judge. The only difference might lie in the sangfroid with which the racism is delivered.
Research carried out at Carnegie Mellon University found that Google is less likely to show adverts for well-paid jobs to women. The programs used in the personnel departments of some companies are drawn to "white" names rather than names associated with ethnic minorities. The police authorities in a number of cities use programs to detect crime hotspots, leading to over-policing in these areas and a greater number of arrests, thereby perpetuating a negative cycle. And, of course, insurance policies are more expensive and less generous when it comes to payouts for residents of predominantly black neighborhoods. "The result is that we criminalize poverty, believing that our tools are not only scientific but impartial," says O'Neil.
"Promising efficiency and impartiality, algorithms hit the poor in every way possible and undermine democracy" – Cathy O'Neil, Harvard data scientist
As O'Neil points out in her book, while the algorithm dilemma is at times due to data selection and the underlying social prejudice that the software picks up, the biggest problem is economic in nature. "When they are designing systems to find clients or manipulate people in debt, increased profits appear to demonstrate that they are on the right path – that the software is doing its job. The problem is that the profits end up acting as a substitute for the truth," says O'Neil, describing what she calls a dangerous and recurrent mix-up.
Facebook allows its algorithms to select and sell adverts targeting "people who hate Jews" and "vulnerable adolescents" because that's where the money is; if it pays, it can't be wrong.
These problems have all been flagged up by journalists, researchers and institutions. But how can we be aware of the effects these algorithms are having on our daily lives? Most women will never realize that a job advert hasn't come their way. And most communities will remain oblivious to the fact that they are over-policed because of biased software. According to O'Neil, Facebook and Google, for example, recognize the problem and even explain it, but they become less cooperative when it comes to monitoring these biases. Each company prefers to keep its algorithms secret, as if they were in possession of the tech equivalent of the Coca-Cola recipe.
"The software is doing its job. The problem is that the profits end up acting as a substitute for the truth" – Cathy O'Neil, Harvard data scientist
If an algorithm is to be used in the justice system, it should be transparent, accessible, debatable and amendable, like the law itself. So say a growing number of specialist organizations, such as the Algorithmic Justice League or AI Now, which insist that the problem with intelligent machinery is rampant social prejudice, not a Terminator-style apocalypse.
And it's not about to go away any time soon. Just days ago, algorithms claiming to identify homosexuals through facial recognition sparked fresh controversy. In America, half the population has its face registered on facial recognition databases. Meanwhile, it transpires that the network giants can work out our sexual orientation even if we don't use their networks. "We can't depend on the free market to correct these mistakes," says O'Neil.
English version by Heather Galloway.