“You’re hired!”
When Amazon launched its AI recruiting tool back in 2014, the company hoped it would help its HR team revolutionize hiring practices and reach hiring decisions more efficiently.
But sadly, that wasn’t meant to be. The tool quickly developed a clear gender bias, automatically limiting the number of female candidates advanced to the next stage, all because the resumes in its training data came overwhelmingly from men.
With AI tools entering every facet of professional life, this can become a major problem for HR departments who are often required to lead the charge against any potential bias.
A Gartner, Inc. survey of 179 HR leaders, conducted on January 31, 2024, found that 38% were piloting, planning to implement, or had already implemented generative AI (GenAI), up from 19% in June 2023. With adoption accelerating, it’s vital that HR professionals understand what led to Amazon’s HR breakdown so they can avoid repeating those mistakes.
The company created this AI tool with the objective of automating the entire resume screening process, to efficiently identify the best talent across the globe. Therefore, it trained the AI tool on resumes submitted to Amazon over a ten-year period, with a focus on those of successful candidates.
The team built approximately 500 computer models, which crawled top-performing resumes from previous years to identify around 50,000 key terms and attributes. These were treated as prerequisites for specific job positions, and the tool used them as criteria to rate candidates on a scale of 1 to 5 – similar to how products are rated on Amazon.
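To make the mechanism concrete, here is a minimal sketch of how keyword-weighted scoring of this kind can encode bias. Amazon’s actual models, terms, and weights are not public; the term names and numbers below are invented for illustration, apart from the widely reported detail that resumes containing the word “women’s” were penalized.

```python
# Hypothetical keyword-based resume scorer (all weights invented).
# Illustrates how term weights learned from historical, male-dominated
# hiring data can penalize otherwise identical resumes.

TERM_WEIGHTS = {
    "executed": 0.8,   # verbs over-represented in past hires (assumed)
    "python": 0.9,     # technical skill term (assumed)
    "women's": -0.6,   # reportedly penalized by Amazon's tool
}

def score_resume(text: str) -> float:
    """Sum weights of matched terms, centered at 3 and clamped to a 1-5 scale."""
    raw = sum(w for term, w in TERM_WEIGHTS.items() if term in text.lower())
    return max(1.0, min(5.0, 3.0 + raw))

flagged = score_resume("Executed Python projects; women's chess club captain")
neutral = score_resume("Executed Python projects; chess club captain")
# The only difference is the word "women's", yet the scores diverge.
```

Nothing in the scoring function mentions gender explicitly; the bias lives entirely in the learned weights, which is why adjusting such a model after the fact is so difficult.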
However, the shine swiftly faded away when the company realized that the tool had imitated both the strengths and weaknesses of a manual screening process.
Reuters was the first to report on the failure of Amazon’s AI recruiting tool: by 2015, it had become evident that the AI was not rating candidates in a gender-neutral manner.
Amazon attempted to adjust the algorithms to be neutral but ultimately decided that the tool could not be reliably unbiased and scrapped the project.
Amazon stated that the tool “was never used by Amazon recruiters to evaluate candidates.” However, it did not deny that its recruiters looked at the recommendations the tool generated, and it reportedly now uses a “watered down version”.
While Amazon’s AI failure serves as a cautionary tale, companies across the globe have already started integrating AI into their HR tools. You’ll find a bevy of enterprise tools changing how we approach HR; you just need to find the right one for you.
Don’t agree with AI in HR? Share your thoughts with us!