Recruitment

Amazon scraps AI recruiting tool that showed bias against women

Amazon has reportedly scrapped an internal AI recruiting tool that was penalizing women candidates.

According to the reports, the AI-powered recruitment tool was created by a team at Amazon's Edinburgh office in 2014. The system was built to automate CV sorting and surface the most promising candidates. However, it quickly taught itself to prefer male candidates over female ones. It penalized resumes that included the word "women's," as in "women's chess club captain," and it downgraded graduates of two all-women's colleges.

Reportedly, Amazon tweaked the software to make it neutral to these particular terms, but there was no guarantee that the system would not devise other ways of sorting candidates that could prove discriminatory.

Bias has always been a challenge in the workplace, and the prospect of machines helping to solve it is intriguing. At the same time, researchers have become increasingly vocal about the danger of technology reinforcing prejudice. In fact, there have been instances where technology has played an active role in perpetuating workplace biases.

For example, Beauty.AI, a machine-learning startup, held the world's first AI-driven beauty contest. It was an initiative by the Russia and Hong Kong-based Youth Laboratories and was supported by Microsoft and Nvidia. It was reported that more than 7,000 people submitted their pictures to have their attractiveness evaluated based on factors such as symmetry and wrinkles. The results, however, proved disappointing: of the 44 winners, the majority were white, a few were Asian, and only one was dark-skinned.

One thing we all need to keep in mind is that AI is taught to identify patterns, which means it can pick up and learn any human bias present in the data it is trained on. Hence, it still requires human oversight to ensure it isn't replicating existing biases or introducing new ones.
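To see how this happens mechanically, consider a hypothetical toy screener (not Amazon's actual system) that scores resume words by how often they appear in historically "hired" versus "rejected" resumes. All data below is invented for illustration. If past hiring skewed male, a gendered term like "women's" inherits a negative weight even though gender was never an explicit feature:

```python
from collections import Counter

# Invented historical data reflecting a biased hiring record (illustrative only).
hired = [
    "software engineer chess club captain",
    "software engineer robotics team",
    "engineer hackathon winner",
]
rejected = [
    "software engineer women's chess club captain",
    "engineer women's coding society",
    "software engineer debate team",
]

def word_weights(hired_docs, rejected_docs):
    """Weight each word by its frequency among hired minus rejected resumes."""
    h = Counter(w for d in hired_docs for w in d.split())
    r = Counter(w for d in rejected_docs for w in d.split())
    return {w: h[w] / len(hired_docs) - r[w] / len(rejected_docs)
            for w in set(h) | set(r)}

def score(resume, weights):
    """Sum the learned weights of the words in a resume."""
    return sum(weights.get(w, 0.0) for w in resume.split())

weights = word_weights(hired, rejected)

# The screener has learned to penalize "women's" purely from biased labels.
print(weights["women's"])  # negative
print(score("engineer women's chess club captain", weights)
      < score("engineer chess club captain", weights))  # True: the gendered term lowers the score
```

The point of the sketch is that no one programmed the penalty; it emerged entirely from skewed historical outcomes, which is why removing a few flagged terms (as Amazon reportedly tried) cannot guarantee the model won't find proxy signals instead.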

As for Amazon, the company has managed to salvage some of what it learned from the failed experiment. Reports suggest that a new team in Edinburgh has been instructed to give automated employment screening another try, this time with a focus on diversity.
