Biased Algorithms: Can AI be Evil?


While killer robots may exist only in science fiction, a growing number of people inside and outside the world of technology are concerned that AI can do harm through embedded and latent prejudices. For example, AI systems have produced embarrassing mistakes related to facial recognition based on inadvertent training with a narrow input set and may also embed bias in the hiring decisions they help make.

by Alan Earls

The most famous example occurred in 2015, when Google was forced to apologize after its new photo app labeled two black people as “gorillas.” It turned out the algorithm had been trained on a database of facial images with too little diversity. Racist tags have also been a problem in Google Maps: for a while, searches for “nigger house” globally, and for “nigger king” in Washington, DC, returned results for the White House under former US president Barack Obama.

It is not just Google that has run into problems with biased algorithms. Flickr’s auto-tagging system came under scrutiny after it labeled images of black people with tags such as “ape” and “animal.” The system also tagged pictures of concentration camps with “sport” or “jungle gym.”

“More and more companies are using AI. The software can manage large amounts of data and react independently to inputs,” says Edna Kropp, digital engagement specialist at LivePerson, a provider of conversational commerce solutions, based in Berlin. People are only necessary to train the AI, she notes, and this is exactly where the problem lies. Even though artificial intelligence is an emotionless appliance, it is only as unbiased as the data provided by the trainer. In fact, she says, “The machines are usually programmed by white men.”

People are only needed to train AI – and that’s the problem!
Edna Kropp, Digital engagement specialist, LivePerson


In response to this concern, the EqualAI (Equal Artificial Intelligence) initiative, founded by LivePerson CEO Robert LoCascio, has been gathering support. The organization is pursuing a four-pronged program: encouraging more women and people from a range of ethnicities to learn to code, enabling them to pursue degrees in technology, working with companies to eliminate bias in human- and AI-centric hiring and promotion, and identifying and eliminating bias embedded in new and existing AI systems.

Kropp says current estimates are that around 80 percent of software developers and programmers are male. “The data with which these people train artificial intelligence represent their world,” she says. For example, a programmer with mainly white friends will show the AI photos of white people for facial recognition. As a result, the AI will have only less diverse image material to fall back on and will distinguish the faces of non-white people less successfully.
The problems do not stop with racial bias within the AI, Kropp notes: by mirroring the world in which the dominant group of white male programmers lives, AI systems can develop many other blind spots.

Biased Algorithms: AI Needs Better Data

“The solution to this problem is obvious: AI needs better data, more data, and, above all, more diverse data,” says Kropp. This will only happen when people from different social and cultural contexts program such machines. Unfortunately, too few people with these backgrounds have so far chosen a career in software development. “The EqualAI initiative is working to ensure that more women and people from minorities are trained in the technology,” says Kropp.
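One practical first step toward spotting the kind of imbalance Kropp describes is to measure a model’s accuracy per demographic group rather than only in aggregate, so that strong overall numbers cannot hide poor performance on an underrepresented group. The sketch below is illustrative only: the function name and the tiny hand-made dataset are hypothetical, not taken from any system mentioned in the article.

```python
from collections import defaultdict

def accuracy_by_group(y_true, y_pred, groups):
    """Compute classification accuracy separately for each group."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for truth, pred, group in zip(y_true, y_pred, groups):
        total[group] += 1
        if truth == pred:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical predictions from a recognition model: group "A" is
# well represented in training data, group "B" is not.
y_true = [1, 1, 0, 1, 0, 1, 1, 0]
y_pred = [1, 1, 0, 1, 1, 0, 1, 1]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(accuracy_by_group(y_true, y_pred, groups))  # → {'A': 1.0, 'B': 0.25}
```

The aggregate accuracy here is 62.5 percent, which sounds mediocre but survivable; the per-group breakdown reveals that the model is perfect for group A and nearly useless for group B, which is exactly the failure mode a diversity audit is meant to surface.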
On a similar note, Stephane Rion, senior deep-learning scientist for Teradata in France, says that a key aspect of AI implementations within financial organizations is transparency: “More and more banks and financial institutions are focusing their efforts not only on developing the most performant predictive models to catch fraud or agree on a loan but also on understanding why a model made a specific decision.” In the area of deep learning, neural networks can have a large number of neurons or parameters that affect the final decision; being able to understand this is vitally important for a bank when it comes to meeting regulations or even running an audit, he explains.
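The transparency Rion describes is hardest for deep networks, but the underlying idea can be shown with the simplest explainable model: a linear scorer, where a decision decomposes exactly into per-feature contributions. The weights, feature names, and applicant values below are hypothetical, a minimal sketch of the attribution idea rather than any bank’s actual scoring model.

```python
def explain_decision(weights, features, bias=0.0):
    """Break a linear credit score into per-feature contributions.

    For a linear model the decomposition is exact: the score is the
    bias plus the sum of weight * feature value for each feature.
    """
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    score = bias + sum(contributions.values())
    return score, contributions

# Hypothetical loan-scoring weights and one applicant's (scaled) features:
weights = {"income": 2.0, "debt_ratio": -3.0, "years_employed": 0.5}
applicant = {"income": 0.8, "debt_ratio": 0.9, "years_employed": 4.0}
score, why = explain_decision(weights, applicant)
print(round(score, 6))  # → 0.9
print(why)              # → {'income': 1.6, 'debt_ratio': -2.7, 'years_employed': 2.0}
```

An auditor reading the breakdown can see that the applicant’s debt ratio pulled the score down by 2.7 while employment history pushed it up by 2.0, the kind of per-decision account that regulators increasingly expect and that modern attribution methods try to approximate for deep networks.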
