Artificial intelligence programs designed to assess candidates for university places and bank loans may be inherently racist and sexist, according to technology experts.
Researchers believe that machine-learning algorithms, far from making the world more equitable, are now mimicking society’s inequality by discriminating against women and ethnic minorities.
“This is beginning to come up a lot in areas like shortlisting people for jobs, insurance, loans – all those things,” Noel Sharkey, Co-Director of the Foundation for Responsible Robotics, told the BBC’s Today program.
Sharkey cited research from Boston University in which an AI was trained on Google News articles. When asked during a word association game, “man is to computer programmer as woman is to x”, the computer replied “homemaker.”
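For readers curious how such a word-association query actually works, the sketch below is illustrative only and is not the researchers’ code. It assumes the publicly released Google News word2vec vectors have been downloaded locally and uses the gensim library; the analogy is answered by simple vector arithmetic over word embeddings.

```python
# Illustrative sketch of an embedding analogy query, assuming the publicly
# released GoogleNews word2vec vectors are available locally.
from gensim.models import KeyedVectors

# Load the pre-trained vectors (large file; path is a placeholder).
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# "man is to computer_programmer as woman is to ?"
# Computed as: computer_programmer - man + woman, then nearest neighbours.
result = vectors.most_similar(
    positive=["woman", "computer_programmer"], negative=["man"], topn=3
)
print(result)  # biased embeddings rank words such as "homemaker" highly
```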
“We have a problem,” said Sharkey. “We need more women coming into this field to solve it.”
In the US, women account for 20 percent of engineering school graduates but just 11 percent of practicing engineers. In the UK, nine percent of the engineering workforce is female.
Health data expert Maxine Mackintosh said “imperfect” datasets built on historical information skew the answers AI systems give. She pointed to London’s St George’s Hospital, where screening of medical school applications revealed that the hospital’s AI was discriminating against women as well as black and minority ethnic applicants, despite recent research showing that women are better doctors than men.
“These big data are really a social mirror – they reflect the biases and inequalities we have in society,” she told the BBC. “If you want to take steps towards changing that you can’t just use historical information.”
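Mackintosh’s point can be seen in a toy example. The sketch below is purely illustrative and hypothetical, not the St George’s system: it generates synthetic “historical” decisions in which one group was penalised, trains a standard model on them, and shows that the model reproduces the penalty for equally qualified candidates.

```python
# Hypothetical illustration: a model trained on biased historical decisions
# inherits the bias. All data and numbers here are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

qualification = rng.normal(size=n)       # genuine merit signal
protected = rng.integers(0, 2, size=n)   # 0 = group A, 1 = group B

# Historical decisions: merit mattered, but group B was penalised.
historical_accept = (
    qualification - 1.5 * protected + rng.normal(scale=0.5, size=n)
) > 0

X = np.column_stack([qualification, protected])
model = LogisticRegression().fit(X, historical_accept)

# Equally qualified candidates from each group: the fitted model gives
# group B a lower predicted acceptance probability, mirroring the past.
same_merit = np.array([[0.0, 0], [0.0, 1]])
print(model.predict_proba(same_merit)[:, 1])
```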
The issue of AI’s struggles with racial diversity has been in the spotlight recently.
Last year, Microsoft was forced to take its machine-learning Twitter chatbot ‘Tay’ offline after it took just 24 hours to become a Nazi sympathizer.
Earlier this week, video surfaced online of a black man failing to get soap from an automatic dispenser in Lagos, Nigeria.
In the footage, Lagos resident Chukwuemeka Afigbo films a white man getting soap from the dispenser. When a dark-skinned man tries to do the same, the machine fails to respond. Eventually, a white paper towel is passed under the device, which immediately dispenses soap.