Artificial intelligence in job applications: algorithms are not objective

Artificial intelligence could also change the application process of the future. (Image: Rawpixel.com/Shutterstock)

In theory, a job application should be judged on qualifications alone. It should not matter whether it was submitted by a man or a woman, whether the applicant has a migration background, or whether they come from West or East Germany. Attractiveness, age, and the parents' occupation should play just as little a role as religion or height.

But the reality looks different. Tall West German men named Thomas or Michael, mostly from academic households, dominate the boards of DAX companies as well as the management levels of many other companies and institutions. This is known as the "Thomas Principle": managers tend to promote people who are similar to themselves.

Artificial intelligence is supposed to prevent discrimination in hiring

People who do not fit this pattern are disadvantaged in the application process as well, as numerous studies have shown. Young people with a migration background find it harder to secure an apprenticeship, and applicants with Turkish- or Arabic-sounding names, as well as mothers with children, are less likely to be invited to job interviews. A few years ago, a labor court dealt with the case of a Berlin woman who was rejected by a company because of her East German origins.

Artificial intelligence (AI) is supposed to finally put an end to this discrimination. Many AI enthusiasts argue that computers are completely neutral, so personal sympathies and antipathies would no longer play a role. Companies also expect substantial savings in time and costs if software takes over, for example, the pre-selection of candidates based on the application texts received, and even conducts initial interviews.

Numerous young companies want to support HR departments in this. The Bonn startup Candidate Select, Case for short, uses artificial intelligence to make university degrees comparable: its algorithm relates an applicant's grade to those of fellow students. Depending on the context, Case argues, a 1.3 might be merely an average grade, while a 2.0 might even be the best one.
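Case's actual model is proprietary; the following Python sketch merely illustrates the underlying idea of ranking a grade within its cohort (in the German system, 1.0 is the best grade). The function, cohorts, and numbers are invented for illustration.

```python
def percentile_rank(grade: float, cohort: list[float]) -> float:
    """Share of the cohort that scored worse than the given grade
    (i.e. higher, since 1.0 is the best grade in the German system)."""
    worse = sum(1 for g in cohort if g > grade)
    return worse / len(cohort)

# A 1.3 in a cohort with heavy grade inflation beats only ~29% of peers ...
inflated_cohort = [1.0, 1.0, 1.0, 1.3, 1.3, 1.7, 2.0]
print(percentile_rank(1.3, inflated_cohort))  # ~0.29

# ... while a 2.0 in a strictly graded cohort beats ~83% and is the best result.
strict_cohort = [2.0, 2.3, 2.7, 3.0, 3.3, 3.7]
print(percentile_rank(2.0, strict_cohort))    # ~0.83
```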

The Heilbronn-based company 100 Words analyzes applications from a psycholinguistic perspective and assigns personality traits and competencies to their authors. The Munich startup Retorio examines applicant videos using an algorithm. And the Swedish recruiting agency The Next Generation offers the robot Tengai for job interviews.

Despite many pilot projects, there has been no resounding success yet. A major problem is that the software is usually trained on existing data, and past discrimination is reflected in that data.

Amazon's application software, for example, disadvantaged women for one simple reason: the tech company had previously hired mostly men, so the AI preferred men as well. It had adopted the "Thomas Principle".
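A deliberately oversimplified Python sketch of this feedback loop (not Amazon's actual system; the data and the scoring rule are invented): a model that scores candidates by how similar they look to past hires simply reproduces the bias in those past decisions.

```python
from collections import Counter

# Invented training data: (CV keyword, was hired). Because past hiring
# favored men, male-coded keywords correlate with the "hired" label.
history = [
    ("men's chess club", True), ("men's chess club", True),
    ("football team", True), ("football team", True),
    ("women's chess club", False), ("women's chess club", False),
    ("netball team", False), ("football team", False),
]

hired = Counter(word for word, label in history if label)
rejected = Counter(word for word, label in history if not label)

def score(word: str) -> float:
    """Naive 'hireability' score: share of past hires among CVs
    containing this keyword, with add-one smoothing."""
    return (hired[word] + 1) / (hired[word] + rejected[word] + 2)

# The model penalizes "women's ..." purely because past decisions did.
print(score("men's chess club"))    # 0.75
print(score("women's chess club"))  # 0.25
```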

Algorithms learn racism and sexism

In addition, the programmers themselves are often male, well-educated, and white. There are many examples of racism and sexism in algorithms: automatic image recognition works significantly worse for women and Black people, and voice assistants such as Siri, Echo, and Alexa recognize commands from white speakers better than from African Americans.

Surveys show that there is great skepticism in Germany about the use of artificial intelligence in the application process. According to "Spiegel", the Hanover company Precire, which analyzes applicants' communication, for example in video interviews, and whose customers include Vodafone, RWE, and Frankfurt Airport, is on the verge of shutting down.

Discrimination also begins very early in life, sometimes to a shocking extent. In the United States, Black babies are two to three times more likely than white babies to die in their first year of life. This is not only due to the poorer living conditions of African Americans: when Black babies are treated by white doctors, they are three times more likely to die than white babies. When treated by Black doctors, however, the death rate drops by 39 to 58 percent, US scientists from various institutions found in a large study in 2020.

Discrimination is not always as clearly visible as in numbers like these. Individual companies can hardly change structural racism and sexism, or the major role that parents' income, origin, and education play in their children's opportunities. But becoming aware of the complexity of the issue and of one's own prejudices, and seeking support to address them, is a start.

cm
