Item Amazon scraps secret AI recruiting tool that showed bias against women (Thanks to Mark Charters for the tip.)
Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.
The team had been building computer programs since 2014 to review job applicants’ resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters…
“Everyone wanted this holy grail,” one of the people said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”
But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.
That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.
We’ve seen before that “algorithms” are called “racist.” Read it.
Feed the algorithm, the curve-fitting AI, measures such as purple hair dye use, tampon purchases, video game purchases, and so forth, and it will, for the painfully obvious reasons, pick out men from women. Not with perfection, of course, but it will be pretty good.
Then, since, as everybody knows but many don’t like knowing, men at the extremes are better at analytic tasks than non-men, an algorithm built to maximize a candidate’s coding ability, fed not sex itself but measures highly predictive of sex, will pick out more men than women. The algorithm will be “biased” (toward reality).
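To make that concrete, here is a toy sketch. Every feature name, correlation, and effect size below is invented for illustration; this is not Amazon’s model, just a plain regression screener that is never shown sex at all, yet selects a male-skewed pool because its inputs correlate with sex.

# Hypothetical illustration: a screener never shown sex still selects more men,
# because its inputs correlate with sex. All numbers are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 10_000
sex = rng.integers(0, 2, n)                   # 1 = male, 0 = female (synthetic)

# Proxy features: correlated with sex but never labeled as such.
video_games = sex + rng.normal(0, 1.0, n)     # more common among the men here
hair_dye    = (1 - sex) + rng.normal(0, 1.0, n)

# Assumed ground truth for the sketch: coding skill rides on the proxies, plus noise.
coding = 0.8 * video_games - 0.2 * hair_dye + rng.normal(0, 1.0, n)

X = np.column_stack([video_games, hair_dye])
model = LinearRegression().fit(X, coding)     # sex is NOT an input

top = np.argsort(model.predict(X))[-500:]     # “spit out the top five,” at scale
print(f"male share of applicants: {sex.mean():.2f}")
print(f"male share of selected:   {sex[top].mean():.2f}")   # comes out well above 0.5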
There are only two ways to avoid the algorithm suggesting more men than women: (1) feed the algorithm only measures that are in no way predictive of sex, but, since men (at the extremes) are better than non-men at coding, the algorithm will then do a lousy job predicting coding success; or (2) instruct the algorithm to spit out Equality, which will also force the algorithm to do a rotten job. Both costs are easy to price out, as in the sketch below.
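In the same invented toy-data spirit (again, every number is made up): dropping the sex-predictive feature wrecks the fit, and forcing equal picks per group lowers the average quality of the selected pool.

# Hypothetical illustration of the two fixes and what each one costs.
# All data is synthetic; the effect sizes are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 10_000
sex = rng.integers(0, 2, n)                    # 1 = male (synthetic)
proxy = sex + rng.normal(0, 1.0, n)            # sex-predictive feature
neutral = rng.normal(0, 1.0, n)                # feature unrelated to sex
coding = 1.0 * proxy + 0.3 * neutral + rng.normal(0, 1.0, n)

full = LinearRegression().fit(np.column_stack([proxy, neutral]), coding)
bare = LinearRegression().fit(neutral.reshape(-1, 1), coding)

# Fix (1): use only sex-blind features -> much worse prediction.
print("R^2 with proxy:   ", full.score(np.column_stack([proxy, neutral]), coding))
print("R^2 without proxy:", bare.score(neutral.reshape(-1, 1), coding))

# Fix (2): force Equality -- take the top 250 from each sex instead of
# the global top 500 -> the selected pool's average quality drops.
pred = full.predict(np.column_stack([proxy, neutral]))
global_top = np.argsort(pred)[-500:]
per_group = np.concatenate(
    [np.flatnonzero(sex == g)[np.argsort(pred[sex == g])[-250:]] for g in (0, 1)])
print("mean coding score, merit pick:   ", coding[global_top].mean())
print("mean coding score, Equality pick:", coding[per_group].mean())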
Equality is defined as the hope, in the absence of all evidence, that men and women are equal. But if men and women were equal, we would not even know to say “men” and “women.”
Bias is defined as a politically unacceptable result.
Any algorithm designed to accurately predict (!) high-performing programmers will obviously show a preference for men, because men massively outnumber women at the high-performance end of computer science, just as they do in all technical subjects.
An accurate algorithm will preferentially select male pole vaulters and sprinters over females, as well. And for identical reasons.
We’ve been down this road before in Griggs v Duke Power. No selection tool is perfect. All will yield hits, misses, false alarms, and correct rejections. All will trade off hits and false alarms as the selection criterion is shifted back and forth.
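That hit/false-alarm tradeoff is just a threshold sweep, which a tiny sketch with invented screener scores can show (the score distributions below are assumptions, not data from any real selection tool):

# Hypothetical sketch of the hit / false-alarm tradeoff: slide the selection
# cutoff and count the four signal-detection outcomes. All numbers invented.
import numpy as np

rng = np.random.default_rng(2)
good = rng.normal(1.0, 1.0, 1000)   # screener scores of truly strong hires
poor = rng.normal(0.0, 1.0, 1000)   # screener scores of weak hires

for cutoff in (-0.5, 0.5, 1.5):
    hits = (good >= cutoff).mean()            # strong candidates accepted
    false_alarms = (poor >= cutoff).mean()    # weak candidates accepted
    misses = 1 - hits                         # strong candidates rejected
    correct_rejections = 1 - false_alarms     # weak candidates rejected
    print(f"cutoff {cutoff:+.1f}: hits {hits:.2f}, FA {false_alarms:.2f}, "
          f"misses {misses:.2f}, CR {correct_rejections:.2f}")

Raising the cutoff lowers both hits and false alarms; lowering it raises both. There is no setting that improves one without worsening the other.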
Amazon had the good sense to dump the tool before they ended up in SCOTUS for the crime of trying to hire the best people instead of the “right” people.
Ironically, Harvard is in Federal court right now for trying to admit the “right” people instead of the best people.
Fun to watch. Looking forward to the correctly weighted employment lottery by Woke and Neuter, LLC: DiversaPick.
Does Amazon hire equal numbers of women on the loading docks, tossing 150 lb packages onto the trucks all day? Just curious. Equal numbers to work the night shift? Again, just curious.
Good to know Amazon is NOT interested in quality. If you want that, go elsewhere.
One other thought: if years of TV portraying techs like Penelope on Criminal Minds, Cable on Bull (she was offed, however…), and Abby on NCIS, plus a plethora of nearly 100% female techs and female coroners, have failed, then maybe, just maybe, men and women ARE different and women DO NOT want the jobs politics is now enslaving them in. Just saying.
How funny! Nonetheless, I don’t know that I would want a darn machine sorting applications for me. I think I would prefer having actual people look at and sort a company’s job applications instead of some computer.
Years ago the San Francisco Bay Area Rapid Transit System had a phone app for reporting crimes. They dropped the app when 60% of the perps were reported as black. Blacks were about 10% of the riders. Can’t have those racist apps.
The Reuters article contained this gem: “…that was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory, the people said.”
I guess their people know, just know, that the algorithm programmers, despite their best efforts, could not avoid writing an algorithm that would adjust itself to find male applicants no matter what the resumes looked like.
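There is something to that worry, and the earlier toy setup shows why: delete the one offending feature, refit, and the scores still track sex through whatever correlated features remain. A hypothetical sketch (again, all numbers invented, not a description of what Amazon actually did):

# Hypothetical sketch: scrub the known sex proxy, refit, and check whether
# the new scores still correlate with sex through the remaining features.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 10_000
sex = rng.integers(0, 2, n)
proxy_a = sex + rng.normal(0, 1.0, n)    # the proxy the engineers noticed
proxy_b = sex + rng.normal(0, 2.0, n)    # a weaker proxy nobody flagged
coding = proxy_a + 0.5 * proxy_b + rng.normal(0, 1.0, n)

scrubbed = LinearRegression().fit(proxy_b.reshape(-1, 1), coding)  # proxy_a removed
scores = scrubbed.predict(proxy_b.reshape(-1, 1))
print("score/sex correlation after scrubbing:",
      np.corrcoef(scores, sex)[0, 1])     # still well above zero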
While not well correlated to the thread, this cartoon does review Bayes’ theorem in an interesting way:
https://imgs.xkcd.com/comics/modified_bayes_theorem.png
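For anyone who doesn’t follow the link, the comic riffs on the ordinary theorem,

\[ P(H \mid X) = \frac{P(X \mid H)\,P(H)}{P(X)}, \]

and, if memory serves, its “modified” version adds a term $P(C)$, the probability that you are applying Bayesian statistics correctly:

\[ P(H \mid X) = P(H)\left(1 + P(C)\left(\frac{P(X \mid H)}{P(X)} - 1\right)\right). \]

With $P(C)=1$ this collapses to the ordinary theorem; with $P(C)=0$ the evidence changes nothing and you keep your prior.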
Females are better than males at bearing and nurturing babies and children. Babies and children are our future. Glorify the mothers and bestow on them the wealth. Commit the useless men to rot in cubicles typing algorithm code.
Pity I can’t give Uncle Mike a tick.
Anyhow, I guess that an “algorithm” that specifies the empathetic requirements of mothers, primary school teachers, nurses, and a host of other indispensable “professions” like them would be “discriminatory,” because women would win hands down, with almost no competition from the hated white male.
But I really don’t give a ***. I’m really, really glad that my Mum and my wife are women.
I try to avoid ticks. Some of them carry Lyme disease. If I did have one though, I’d want to give it away.