Love them or hate them, online ads are a fact of life in most job hunts. Big data watches your progress as you click from site to site and serves up advertisements based on what the algorithm perceives to be your preferences. Personalized yet machine-controlled, what could be more accurate? Yet it seems even the software is biased.
We already know that researchers uncovered a hidden gender bias in Google’s image searches. That study found that when people looked for images to represent careers and jobs, the results disproportionately underrepresented women. For example, while more than half of the book authors in the U.S. are women, Google image search shows women only about 25% of the time. Images of telemarketers tipped the other way, overrepresenting women at 64%.
Now researchers at Carnegie Mellon University analyzing Google’s job advertisements found a similar bias. By creating 1,000 simulated users—split evenly between male and female—and having them visit 100 top employment sites, the researchers found that male users were shown high-paying job ads about 1,800 times, compared to female users who saw those ads about 300 times. In contrast, the female profiles were more likely to receive ads from a generic job-posting service and an auto dealer.
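The researchers’ approach can be approximated with a simple statistical check: given per-profile counts of how often a high-paying ad was shown, a permutation test asks whether the gap between the male and female groups could plausibly arise by chance. Below is a minimal sketch in Python; the impression counts are invented for illustration, not the study’s data.

```python
import random

def permutation_test(group_a, group_b, trials=10_000, seed=0):
    """Two-sample permutation test on the difference of group means.

    Repeatedly shuffles the pooled observations into two random groups
    and counts how often the shuffled gap is at least as large as the
    observed gap. The returned fraction approximates a p-value.
    """
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n = len(group_a)
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        left, right = pooled[:n], pooled[n:]
        diff = abs(sum(left) / len(left) - sum(right) / len(right))
        if diff >= observed:
            hits += 1
    return hits / trials

# Synthetic counts: how many times each simulated profile saw a
# high-paying job ad (invented numbers, for illustration only).
male_views = [4, 3, 5, 4, 2, 3, 4, 5, 3, 4]
female_views = [1, 0, 1, 0, 1, 1, 0, 1, 0, 1]

p = permutation_test(male_views, female_views)
```

A small p-value here would mean the male/female gap in ad impressions is very unlikely under random assignment, which is the kind of evidence the CMU team needed before calling the disparity systematic.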
The researchers pointed out that they have no evidence that Google is doing anything illegal or violating its own policies. But on the employer end, those with automated recruitment tools might be unwittingly courting trouble: the laws the EEOC enforces specifically prohibit discrimination based on gender or sexual orientation. Automation might also disqualify the candidate best suited for the position.
Jim DelGaldo, a business systems analyst at iCIMS, a talent acquisition solutions provider, says that biases in talent acquisition software can arise for a variety of reasons and can affect important business metrics such as application volume, rates of hire, and cost per hire.
Although applicant tracking systems are designed to help companies reduce the risk of hiring bias around age, race, sex, and similar characteristics, sometimes they fall short. DelGaldo says machine-made bias has to be rooted out during the software development stage, by making sure the data sample is large enough to be fairly analyzed against company and industry benchmarks.
“In order to truly ‘compare apples to apples,’ software developers have to apply some elements of data science to normalize the information,” he says. Normalization is the key to consistency: it takes different terms or titles that mean the same thing, such as variants of a role or skill, matches them into a single cluster of data, and eliminates inconsistencies, DelGaldo explains.
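In its simplest form, the normalization DelGaldo describes is a lookup that folds synonymous job titles into one canonical cluster so they can be counted and compared consistently. The sketch below uses an invented synonym map and titles; it is an illustration of the idea, not iCIMS’s actual data or method.

```python
# Hypothetical clusters: each canonical title maps to the raw variants
# that should be treated as the same role.
CANONICAL_TITLES = {
    "software engineer": {"software engineer", "software developer",
                          "programmer", "swe"},
    "recruiter": {"recruiter", "talent acquisition specialist",
                  "recruitment consultant"},
}

def normalize_title(raw: str) -> str:
    """Map a raw job title onto its canonical cluster, if one is known."""
    cleaned = raw.strip().lower()
    for canonical, variants in CANONICAL_TITLES.items():
        if cleaned in variants:
            return canonical
    return cleaned  # pass through titles we have no cluster for

print(normalize_title("Software Developer"))  # software engineer
```

With titles normalized this way, “programmer” and “software developer” land in the same bucket, so benchmarks computed over the cluster are not skewed by inconsistent labeling.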
Beyond that, the most common biases arise when job boards such as Indeed, CareerBuilder, and Monster are the source of the majority of a company’s candidates, according to iCIMS’s research. “Though they may provide a huge pool of potential applicants, they don’t always give you the full story at first glance,” he says.
DelGaldo advises recruiters to do comparative testing to ensure that the quality of the candidates is proportional to the quantity. “By testing against other sourcing avenues such as industry-specific job boards, social media, and employee referrals, recruiters can ensure an accurate depiction of a candidate and increase the likelihood that he or she will be a good fit based on the source of hire,” he maintains.
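The comparative testing DelGaldo recommends boils down to measuring quality per unit of quantity for each sourcing channel. A rough sketch, using hypothetical numbers rather than any real recruiting data, ranks channels by hire rate (hires per applicant):

```python
# Hypothetical sourcing data: (applicants, hires) per channel.
sources = {
    "general job board": (5000, 10),
    "industry-specific board": (400, 8),
    "employee referrals": (120, 9),
}

def hire_rate(applicants: int, hires: int) -> float:
    """Hires per applicant; a crude quality-vs-quantity signal."""
    return hires / applicants if applicants else 0.0

# Rank channels from highest to lowest hire rate.
ranked = sorted(sources.items(),
                key=lambda kv: hire_rate(*kv[1]), reverse=True)

for name, (apps, hires) in ranked:
    print(f"{name}: {hire_rate(apps, hires):.2%} hire rate")
```

In this made-up example the general job board supplies the most applicants but the lowest hire rate, which is exactly the mismatch between quantity and quality that comparative testing is meant to surface.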
But the potential problem needs to be tackled from the applicant’s side, too. DelGaldo recommends that recruiters take a page from the Carnegie Mellon University researchers’ playbook and test the system from a candidate’s perspective on a regular basis.
“By testing the process as a potential job seeker would, recruiters can make sure the candidate experience accurately reflects the company’s employment brand, is mobile-optimized, and is generally easy to complete,” he says. They should also review all screening questions, such as those requiring a certain number of years of experience, to make sure the wording doesn’t eliminate qualified candidates, he suggests.
DelGaldo believes the best way for recruiters to eliminate bias is to take a proactive and consistent approach. In addition to periodic, objective reviews of the candidate experience, DelGaldo says consistently benchmarking against past internal and industry metrics will help recruiters recognize patterns and anomalies over time. “Willingness to ask the right questions and diversify sourcing methods as needed will help recruiters to maintain a neutral mind-set when making hiring decisions.”