The AI did not like women

Reuters reports: Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.

Garbage in garbage out, ya know. Misogyny in misogyny out. Underestimation of women in underestimation of women out.

The trouble is, they used resumes from the past ten years to train the computer models, and most of those came from – you’ll never guess – men.

Top U.S. tech companies have yet to close the gender gap in hiring, a disparity most pronounced among technical staff such as software developers where men far outnumber women. Amazon’s experimental recruiting engine followed the same pattern, learning to penalize resumes including the word “women’s” until the company discovered the problem.

Did they name the recruiting engine “Damore”?

Jordan Weissmann at Slate looks at the implications:

All of this is a remarkably clear-cut illustration of why many tech experts are worried that, rather than remove human biases from important decisions, artificial intelligence will simply automate them. An investigation by ProPublica, for instance, found that algorithms judges use in criminal sentencing may dole out harsher penalties to black defendants than white ones. Google Translate famously introduced gender biases into its translations. The issue is that these programs learn to spot patterns and make decisions by analyzing massive data sets, which themselves are often a reflection of social discrimination. Programmers can try to tweak the A.I. to avoid those undesirable results, but they may not think to, or be successful even if they try.
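The mechanism Weissmann describes is simple enough to sketch in a few lines. Here is a minimal, entirely hypothetical toy: a word-scoring model trained on invented "historical" resumes where, mirroring a male-dominated hiring record, the word "womens" happens never to co-occur with a hire. The statistic dutifully learns to penalize the word — no one programmed misogyny in, the data did it. (All names and data here are made up for illustration; this is not Amazon's system.)

```python
from collections import Counter
import math

# Hypothetical historical data: resumes as token lists, with hire labels.
# The labels encode a biased past: resumes mentioning "womens" were
# never marked as hires in this toy history.
history = [
    (["python", "lead"], 1),
    (["java", "captain"], 1),
    (["python", "womens", "chess"], 0),
    (["java", "womens", "club"], 0),
    (["python", "chess"], 1),
    (["java", "club"], 0),
]

def token_scores(data, smoothing=1.0):
    """Per-token log-odds of appearing in a hired resume,
    with Laplace smoothing so unseen counts don't divide by zero."""
    hired, rejected = Counter(), Counter()
    for tokens, label in data:
        (hired if label else rejected).update(set(tokens))
    vocab = set(hired) | set(rejected)
    return {
        t: math.log((hired[t] + smoothing) / (rejected[t] + smoothing))
        for t in vocab
    }

scores = token_scores(history)
# "womens" only ever appears in rejected resumes, so its learned
# score is negative: the model now penalizes resumes containing it.
```

The point of the sketch is Weissmann's point: the penalty falls out of the training data, and a programmer would have to notice the pattern and deliberately intervene to remove it.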

I feel as if feminism has been patiently explaining this since before Gutenberg, only to be sneered at and called politically correct (or a cunt). Biases against women are everywhere; they’re baked in; it doesn’t work to try to operate as if all that ended in 1971.

H/t Screechy Monkey
