Apply for a job at Hilton International and three different computer systems have to approve your application before a human being will look at it. That’s the process Sarah Smart, vice president of global recruitment, outlines as she explains how the hotel chain uses artificial intelligence to weed out the thousands of people applying for work in customer care:
First, an applicant tracking system searches people’s resumes for keywords matching the job description. Next, a chatbot asks them a series of yes or no questions to make sure they meet requirements, such as “Do you have internet access?” for work-from-home positions. The ones who say yes get interviewed. But not by a person: This interview is with a predictive AI application called HireVue.
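The first two screening stages Smart describes, keyword matching against the job description and a yes/no requirements gate, can be sketched roughly as follows. Every keyword, requirement, and threshold below is invented for illustration; none of it comes from Hilton’s or HireVue’s actual systems.

```python
# Hypothetical sketch of the first two screening stages: keyword matching
# against the job description, then chatbot-style yes/no requirement checks.

JOB_KEYWORDS = {"customer", "service", "support", "hospitality"}

def keyword_screen(resume_text: str, min_hits: int = 2) -> bool:
    """Stage 1: pass if the resume shares enough keywords with the posting."""
    words = set(resume_text.lower().split())
    return len(words & JOB_KEYWORDS) >= min_hits

def requirements_screen(answers: dict) -> bool:
    """Stage 2: yes/no gate; every listed requirement must be met."""
    required = ["has_internet_access", "authorized_to_work"]
    return all(answers.get(q, False) for q in required)

resume = "Five years of customer service and hospitality support experience"
answers = {"has_internet_access": True, "authorized_to_work": True}

# Only applicants who clear both gates advance to the video interview.
advances_to_video_interview = keyword_screen(resume) and requirements_screen(answers)
```

The point of the sketch is how mechanical these early gates are: a resume that phrases relevant experience without the expected keywords never reaches a human.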
Founded in 2004, the Salt Lake City company has raised more than $90 million in venture capital investment. “It’s a one-way interview,” says Smart. “The candidate will receive anywhere from five to seven questions that they get a chance to answer.” Hilton and HireVue develop these questions together, but they’re all designed to determine whether you’ll be a friendly worker, an empathetic one, and a successful one in the role. Applicants record their answers inside HireVue’s video platform, then the algorithm gets to work. It breaks down how many prepositions you use, and whether or not you smile. Chief technology officer Loren Larsen says the tool can examine around 25,000 different data points per video, breaking down your words, your voice, and your face.
“We have to start with what success looks like in the job,” Larsen says, explaining how each candidate’s analysis connects to individual work capability. Take customer service, he continues. “What are the competencies or skills or abilities or traits someone has to have to do well in that job? And then once you identify those, then you start to figure out, ‘Okay, let’s suppose friendliness is a trait.'” And when people are friendly, they smile, so that’s why the system studies candidates’ faces.
But should that smile–and how many times you show it–be what determines whether you get the job?
Hilton says yes. As Smart explains, “It’s a pass/fail.” Since Hilton became a HireVue client in 2014, 43,000 job seekers have interviewed with the algorithm. Two-thirds–roughly 28,667 people–had their applications rejected without being seen by a single person. (As hers is a corporate role, Smart herself did not go through the system.)
But according to Larsen, that’s not how the technology is really supposed to be used. As a company, he explains, “We’re never saying pass/fail. What we do return is a score that is essentially like your SAT score”–a percentile ranking of how each individual measures against others who applied at around the same time. Because so much data goes into compiling each analysis, no single smile or frown should keep you from getting work; it would just raise or lower your score by a few points. And only 10% to 30% of that score–depending on the employer–comes from facial expressions. The rest is based on the language you use.
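The percentile-style ranking Larsen describes, “essentially like your SAT score,” can be illustrated with a few lines of code. The cohort numbers below are made up; HireVue’s actual scoring model is proprietary.

```python
# Illustrative only: converting a candidate's raw interview score into a
# percentile rank relative to a cohort who applied around the same time,
# the kind of SAT-style score Larsen describes. All numbers are invented.

def percentile_rank(score: float, cohort: list) -> float:
    """Percent of the cohort scoring strictly below this candidate."""
    below = sum(1 for s in cohort if s < score)
    return 100.0 * below / len(cohort)

cohort_scores = [52, 61, 64, 70, 73, 75, 80, 84, 88, 95]
print(percentile_rank(75, cohort_scores))  # prints 50.0: beats half the cohort
```

Framed this way, the score is relative rather than absolute: the same interview performance can rank higher or lower depending on who else applied that week.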
Both Smart and Larsen contend that the AI isn’t considering anything a human recruiter wouldn’t, with Smart calling the analyses “more sophisticated than that natural sort of gut instinct feeling” that people get when they interview someone.
“If I walked in and you were interviewing me for a job,” Larsen says, “you’d be paying attention to the same things,” such as, “Are you friendly? Are you a good representation for our brand?” At least with an algorithm, he contends, recruiters work with a larger data set.
Communications director Cynthia Siemens says most employers don’t analyze all 25,000 data points HireVue measures. Hilton doesn’t look at the same behavioral indicators as Unilever, for example, another HireVue client. “Which data points are included in the finished product depends upon what our testing proves to be relevant to success in a specific job role,” she says, with five industrial-organizational (IO) psychologists overseeing development.
Currently, candidates are not given copies of their results, a feature Larsen hopes to add “in a couple months.”
To the 28,000-plus Hilton applicants rejected without human involvement, these reports may be helpful, especially if they don’t have the same background as existing employees. Asian immigrants, for example, use facial expressions differently than white Americans, which could skew results in the system. Also, HireVue’s language analysis begins with speech-to-text conversion, a technology notorious for misunderstanding Southerners.
Larsen says one of HireVue’s principal goals has always been to remove bias: by providing a larger behavioral data set than any human recruiter could amass alone, the company aims to take personal prejudice out of the equation. This, he notes, is part of why HireVue recommends clients not use results to make hard cuts like Hilton does.
“We have to be very, very careful about adverse impacts,” he explains, noting that if a client does eliminate every job applicant who scores below the 90th percentile, for example, “It’s very possible that they would not have enough, let’s say, women or African Americans or Latinos in the pool to have a balanced slate in only that top 10%.”
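The adverse-impact check Larsen alludes to is commonly formalized in U.S. hiring as the EEOC’s “four-fifths rule”: a group’s selection rate should be at least 80% of the highest group’s rate. The sketch below shows the arithmetic with invented numbers; it is not HireVue’s or Hilton’s actual audit.

```python
# A sketch of a four-fifths-rule adverse-impact check: flag any group whose
# selection rate falls below 80% of the best-performing group's rate.
# All applicant and selection counts below are invented for illustration.

def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

def four_fifths_check(rates: dict) -> dict:
    """Map each group to True (passes) or False (possible adverse impact)."""
    best = max(rates.values())
    return {group: rate / best >= 0.8 for group, rate in rates.items()}

rates = {
    "group_a": selection_rate(30, 100),  # 30% of group A selected
    "group_b": selection_rate(18, 100),  # 18% of group B selected
}
print(four_fifths_check(rates))  # group_b fails: 0.18 / 0.30 = 0.6 < 0.8
```

This is why a hard cut at a high percentile is risky: even a modest score gap between groups can push one group’s selection rate below the threshold.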
Smart says, “We’ve developed our profile based on looking at successful candidates inside of Hilton,” which she adds is “incredibly diverse.” As per the company’s 2018 diversity press kit, 69% of its U.S.-based employees are ethnic minorities. In its use of HireVue, Smart says Hilton has “avoided any potential adverse impact that could come from a selection tool that may be biased one way or another.”
For those 43,000 people who have gone through the system, Smart says one positive side effect is that they all found out whether they got the job more quickly: “From a business perspective, HireVue has allowed us to significantly reduce our days to fill, which–when you talk about filling large, large classes of representatives that are working with our guests on a daily basis–is a huge business opportunity for us.” That faster turnaround isn’t a benefit only for the people the computer rejects.