
In 2019, 98% of the companies on Fortune magazine's Fortune 500 list used an Applicant Tracking System (ATS) in their hiring process, and not only for reasons of cost or efficiency, but precisely to eliminate discrimination.

Algorithms and Artificial Intelligence should, in fact, help overcome bias in the personnel selection phase, ensuring impartiality. But AI can have prejudices too, ending up replicating human limitations and consequently damaging the companies that unwittingly trust its algorithms. The question arises: how can software be biased? Consider first how the prejudices of a human being can surface.

For example, when the recruiter meets a potential candidate, their perception of the candidate's voice or appearance can subtly influence the evaluation. This kind of bias can also emerge from simply reading the information written on a CV. People with names and surnames that sound "whiter" have, for example, a 75% higher chance of getting an interview than those with Asian-sounding names and a 50% higher chance than those with Black-sounding names; moreover, male names have a 40% higher chance than female ones.

This data comes from a study conducted by researchers from the University of Toronto and Stanford University in 2016. One would therefore assume that an effective AI is designed to overcome this type of discrimination. Unfortunately, not even Amazon managed to build one, as news reports revealed some time ago: the recruitment software used by the online retail giant selected more male engineers than female ones, given the same set of skills. Women's representation in engineering has been increasing (the Study Centre of the National Council of Engineers found that 16% of engineering graduates were women in 2000, peaking at 28% in 2017). Despite this, the AI likely applied reasoning based on past trends: if previous engineers were almost exclusively men, then, it inferred, this should continue to be the case.
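This failure mode is easy to reproduce. The following toy sketch (an illustration only, not Amazon's actual system, whose internals are not public) shows how a naive model that scores candidates by the frequency of similar past hires simply reproduces the historical imbalance in its training data, even when skills are identical:

```python
from collections import Counter

# Hypothetical training data mirroring the 2000 graduate figures cited
# above: 84 past hires recorded as "male", 16 as "female".
past_hires = ["male"] * 84 + ["female"] * 16

hire_counts = Counter(past_hires)
total = len(past_hires)

def score(candidate_gender: str) -> float:
    """Score a candidate purely by how often past hires shared the same
    gender attribute -- identical skills are assumed for both candidates."""
    return hire_counts[candidate_gender] / total

print(score("male"))    # 0.84
print(score("female"))  # 0.16
```

Two candidates with the same skills receive very different scores, purely because the model has learned yesterday's hiring pattern as if it were a merit signal.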

So how do you improve the process?

One of the solutions is "Blind Recruiting": to reduce bias, CVs are anonymized, eliminating any reference to name, surname, sex, age, and nationality.
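In its simplest form, anonymization can be done by redacting the labelled personal fields from a CV before it reaches the reviewer. A minimal sketch, assuming CVs arrive as plain text with header fields (the field names below are assumptions, not a real ATS schema):

```python
import re

# Personal fields to blank out, per the Blind Recruiting approach:
# name, surname, sex, age, and nationality (assumed labels).
SENSITIVE_FIELDS = ("Name", "Surname", "Sex", "Age", "Nationality")

def anonymize(cv_text: str) -> str:
    """Redact the values of labelled personal fields, line by line,
    leaving the rest of the CV (skills, experience) untouched."""
    pattern = re.compile(
        r"^(%s):.*$" % "|".join(SENSITIVE_FIELDS),
        flags=re.MULTILINE | re.IGNORECASE,
    )
    return pattern.sub(r"\1: [REDACTED]", cv_text)

cv = "Name: Jane Doe\nAge: 34\nSkills: Python, SQL"
print(anonymize(cv))
# Name: [REDACTED]
# Age: [REDACTED]
# Skills: Python, SQL
```

Real CVs are rarely this regular, so production systems typically rely on named-entity recognition rather than fixed labels, but the principle is the same: strip identity, keep skills.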

Deckx, in addition to anonymizing CVs, generates standardized profile cards that highlight each candidate's skills, in order to offer a selection process based solely on skills, not on the person.
