LinkedIn’s job search AI was biased. The company’s solution? More AI.

More companies are using AI to recruit and hire new employees, and AI can play a role in almost every step of the hiring process. The Covid-19 pandemic has fueled new demand for these technologies. Both Curious Thing and HireVue, companies specializing in AI-powered interviewing, have reported an increase in business during the pandemic.

However, most job searches begin with a simple search. Job seekers turn to platforms like LinkedIn, Monster, or ZipRecruiter, where they can upload their resumes, browse job openings, and apply for vacancies.

The purpose of these websites is to match qualified candidates with available positions. To organize all these openings and candidates, many platforms use AI-based recommendation algorithms. Algorithms, sometimes called match engines, process job seeker and employer information to build a list of recommendations for each.

“You usually hear the anecdote that a recruiter spends six seconds looking at your resume, right?” said Derek Kan, vice president of product management at Monster. “When we take a look at the recommendation engine that we created, you can reduce that time to milliseconds.”

Most match engines are optimized to generate applications, says John Jersin, the former vice president of product management at LinkedIn. These systems base their recommendations on three categories of data: information that the user provides directly to the platform; data assigned to the user based on other people with similar skills, experiences, and interests; and behavioral data, such as how often a user replies to messages or interacts with job postings.
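The article does not disclose how any platform actually weights these signals, but the three-category structure can be illustrated with a minimal sketch. Everything here is hypothetical: the field names, the weights, and the linear scoring are illustrative assumptions, not any company's real model.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    # Category 1: information the user provides directly (profile data)
    skill_overlap: float    # fraction of a job's listed skills on the profile, 0..1
    # Category 2: data inferred from similar users (collaborative signals)
    peer_similarity: float  # similarity to users with comparable skills/experience, 0..1
    # Category 3: behavioral data (platform activity)
    response_rate: float    # how often the user replies to messages, 0..1

def match_score(c: Candidate, weights=(0.5, 0.3, 0.2)) -> float:
    """Combine the three signal categories into a single ranking score.
    The weights are arbitrary placeholders for illustration."""
    w_profile, w_peer, w_behavior = weights
    return (w_profile * c.skill_overlap
            + w_peer * c.peer_similarity
            + w_behavior * c.response_rate)

candidates = {
    "A": Candidate(skill_overlap=0.9, peer_similarity=0.4, response_rate=0.2),
    "B": Candidate(skill_overlap=0.6, peer_similarity=0.8, response_rate=0.9),
}
ranked = sorted(candidates, key=lambda k: match_score(candidates[k]), reverse=True)
```

Note that candidate B outranks A here despite a weaker profile match, purely on inferred and behavioral signals; that is exactly the mechanism by which behavioral differences between groups can shift rankings.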

In the case of LinkedIn, these algorithms exclude a person’s name, age, gender, and race, because including those characteristics can introduce bias into automated processes. But Jersin’s team found that even so, the service’s algorithms could still detect patterns of behavior exhibited by groups with a particular gender identity.

For example, while men are more likely to apply for jobs that require work experience beyond their qualifications, women tend to apply only for jobs whose requirements they fully meet. The algorithm picks up on this variation in behavior and adjusts its recommendations in a way that inadvertently disadvantages women.

“You could recommend, for example, more senior jobs to one group of people than another, even if they’re qualified at the same level,” Jersin explains. “Those people might not get exposed to the same opportunities. And that’s really the impact we’re talking about here.”

Men also list more skills on their resumes at a lower degree of proficiency than women do, and they often engage more aggressively with recruiters on the platform.

To address these issues, Jersin and his LinkedIn team built a new AI designed to produce more representative results, rolling it out in 2018. It was essentially a separate algorithm designed to counteract recommendations skewed toward a particular group. The new AI ensures that before returning the matches curated by the original engine, the recommendation system includes a representative distribution of users across gender.
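LinkedIn has not published the details of this re-ranking step, but the idea described above, re-ordering an already-scored list so that each prefix reflects a target distribution across groups, can be sketched with a simple greedy algorithm. The function, the group labels, and the target shares below are all illustrative assumptions, not LinkedIn's actual implementation.

```python
from collections import deque

def rerank_representative(ranked, groups, targets):
    """Greedily re-rank so every prefix of the list roughly matches
    the target share per group, while preserving within-group order.

    ranked:  candidate ids, best first (output of the original match engine)
    groups:  candidate id -> group label
    targets: group label -> desired share of the list (sums to 1.0)
    """
    # One queue per group, preserving the engine's original ordering.
    queues = {g: deque() for g in targets}
    for cand in ranked:
        queues[groups[cand]].append(cand)

    out, counts = [], {g: 0 for g in targets}
    while any(queues.values()):
        k = len(out) + 1
        # Pick the non-empty group currently furthest below its target
        # share for a prefix of length k.
        def deficit(g):
            return targets[g] * k - counts[g]
        g = max((g for g in targets if queues[g]), key=deficit)
        out.append(queues[g].popleft())
        counts[g] += 1
    return out

# Toy example: the original engine ranked all "m" candidates ahead of all "f".
ranked = ["m1", "m2", "m3", "m4", "f1", "f2", "f3", "f4"]
groups = {c: c[0] for c in ranked}
fair = rerank_representative(ranked, groups, {"m": 0.5, "f": 0.5})
# With equal 50/50 targets, the output interleaves the two groups.
```

The key property is that fairness is enforced at every prefix, not just over the full list, since recruiters rarely scroll past the first few results.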

Kan says Monster, which lists 5 to 6 million jobs at any given time, also incorporates behavioral data into its recommendations but doesn’t correct for bias the way LinkedIn does. Instead, the marketing team focuses on signing up diverse users to the service, and the company then relies on employers to report back and tell Monster whether or not it passed along a representative set of candidates.

Irina Novoselsky, CEO of CareerBuilder, says she is focused on using the data the service collects to teach employers how to eliminate bias from their job postings. For example, “when a candidate reads a job description with the word ‘rockstar’, there are significantly fewer women applying,” she says.
