Aug 25, 2017 6:21AM

New Research Shows Artificial Intelligence Is Sexist AF

Damn
Now that we're fanging toward the future at an unbearable rate, things are getting pretty real with artificial intelligence. And, according to new research, those who feel personally attacked by the white male now have to watch out for computers too.
 
A computer science professor at the University of Virginia, Vicente Ordóñez, started noticing some disturbing patterns in the guesswork of image-recognition software. "It would see a picture of a kitchen and more often than not associate it with women, not men," he told Wired. So he and his colleagues started testing two large collections of photos used to train this kind of software.
 
The two collections they chose are major ones: ImSitu, created by the University of Washington, and COCO, first coordinated by Microsoft and now cosponsored by Facebook and MightyAI.
 
Apparently, images of shopping and washing are linked to women, while things like coaching and shooting are linked to men. That comes down to the labelling, which is done by real humans. Even worse, machine-learning software that's trained on this data isn't just mirroring these biases… it's amplifying them. In the research, for example, images of cooking were over 33 percent more likely to involve women than men, and a model trained on them amplified that disparity to 68 percent at test time.
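
For the curious, here's a minimal sketch of what "amplification" actually means. This is not the researchers' code, and every annotation below is made up for illustration: the idea is simply that if the model's predictions are even more skewed than the human labels it learned from, the bias has been amplified.

# Toy sketch of measuring bias amplification: compare how often an
# activity co-occurs with a gender in the training labels versus in
# a model's predictions. All data here is invented for illustration.
from collections import Counter

def woman_ratio(annotations, activity):
    # Fraction of images for `activity` whose agent is labelled "woman"
    counts = Counter(gender for act, gender in annotations if act == activity)
    total = counts["woman"] + counts["man"]
    return counts["woman"] / total if total else 0.0

# Hypothetical human-labelled training data: (activity, agent gender)
training_labels = [
    ("cooking", "woman"), ("cooking", "woman"), ("cooking", "man"),
    ("coaching", "man"), ("coaching", "man"), ("coaching", "woman"),
]

# Hypothetical predictions from a model trained on that data
model_predictions = [
    ("cooking", "woman"), ("cooking", "woman"), ("cooking", "woman"),
    ("coaching", "man"), ("coaching", "man"), ("coaching", "man"),
]

for activity in ("cooking", "coaching"):
    train = woman_ratio(training_labels, activity)
    pred = woman_ratio(model_predictions, activity)
    amplified = abs(pred - 0.5) > abs(train - 0.5)
    print(f"{activity}: training {train:.0%} women, "
          f"predictions {pred:.0%} women, amplified={amplified}")

Run that and "cooking" goes from 67 percent women in the labels to 100 percent in the predictions: the model turned a lean into a rule.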
 
As AI-based systems become more common and take on more complex tasks, these biases could do serious harm. The Wired article uses the example of "a future robot that when unsure of what someone is doing in the kitchen offers a man a beer and a woman help washing dishes", which is funny, but also pretty hectic.
 
It isn't the first time AI has been a bit of a dick, either. In 2015, Google's photo service tagged people of colour as gorillas. And in 2016, researchers at Boston University and Microsoft asked software trained on Google News text to complete the statement: "Man is to computer programmer as woman is to X." It replied: "homemaker." Rude.
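
If you're wondering how software "completes" an analogy at all: word embeddings turn words into lists of numbers, and analogies become arithmetic. Here's a minimal sketch with invented three-dimensional vectors (real embeddings, like the Google News word2vec model those researchers used, have hundreds of dimensions learned from billions of words).

# Toy sketch of the word-vector arithmetic behind the analogy.
# These 3-dimensional vectors are invented purely to show the
# mechanics; real models learn theirs from huge text corpora.
import numpy as np

toy_vectors = {
    "man":        np.array([1.0, 0.2, 0.1]),
    "woman":      np.array([0.1, 0.2, 1.0]),
    "programmer": np.array([1.1, 0.9, 0.2]),
    "homemaker":  np.array([0.2, 0.9, 1.1]),
    "kitchen":    np.array([0.3, 0.1, 0.9]),
}

def cosine(a, b):
    # Similarity between two vectors, ignoring their length
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# "man is to programmer as woman is to X":
# X should sit near programmer - man + woman in vector space
target = toy_vectors["programmer"] - toy_vectors["man"] + toy_vectors["woman"]

best = max(
    (w for w in toy_vectors if w not in ("man", "woman", "programmer")),
    key=lambda w: cosine(toy_vectors[w], target),
)
print(best)  # prints "homemaker" -- the bias is baked into the vectors

The software isn't being edgy; it's just doing sums over numbers that were learned from biased text.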
 
While researchers are trying to even out the biases, some are asking when exactly we should "change reality to make our systems perform in an aspirational way". Princeton researcher Aylin Caliskan is wary of over-correcting, saying that scrubbing the data puts us at risk of losing essential information, and that "datasets need to reflect the real statistics in the world".
 
It's a mad, mad world.
 
Photo: Tumblr

Hayley Morgan