Tech workers are among the most in demand this year, commanding the highest salaries and the most career opportunities. And while there is a plethora of job openings, not every potential candidate has the coding chops combined with work experience and a solid business background. But Stephanie Lampkin certainly did.
Like many accomplished tech workers, Lampkin started coding as a kid. From her first steps into programming at age 13, Lampkin worked her way through an engineering degree from Stanford and an MBA from MIT’s Sloan School of Management. Yet after spending five years at Microsoft, the 31-year-old applied for a data analytics role at the company and was told she’d be a better fit for a sales or marketing position.
Why? Lampkin believes she was overlooked because, as an African American woman, she didn’t fit the typical profile. There had to be a better way for qualified candidates to make it past recruiting gatekeepers operating on the premise that a "good fit" meant hiring someone who looked like everyone else already at a particular company. So she built an app that helps eliminate unconscious bias from the hiring process. It took two years of development, but Blendoor finally launched in public beta on March 11.
Rather than relegate her experience to the anecdotal, Lampkin points to quantifiable evidence that unconscious bias is alive and well, despite tech companies' efforts to diversify their talent pools.
First, she notes that scientists estimate the brain receives about 11 million bits of information per second, but that the conscious mind can only process about 200 bits. The brain copes by connecting new information to preconceived concepts. So when a job applicant is an unknown quantity, recruiting and hiring become ripe environments for decisions based on what, or who, evaluators can relate to best.
Furthermore, Lampkin says this has been borne out by a 2003 study from the National Bureau of Economic Research that found that "job applicants with white-sounding names needed to send about 10 resumes to get one callback," while "those with African American-sounding names needed to send around 15 resumes to get one callback." In other words, per the title of the study: "Are Emily and Greg more employable than Lakisha and Jamal?"
Another, more recent study, conducted in 2014 by Stanford and the Paris School of Economics, revealed similar findings when applicants with foreign-sounding names applied for jobs in the same pool as applicants without them.
This persists in spite of anti-discrimination laws, employer measures to increase diversity, and a spate of other research supporting the idea that diversity is good for business.
Unlike Jopwell, another bias-busting recruiting platform that is openly creating a talent pipeline of underrepresented minority workers, Blendoor is a mobile app that hides a candidate’s name, age, employment history, criminal background, and even their photo, so employers can focus on qualifications.
It works like Tinder in that candidates can swipe right or left and get matched to employers just by virtue of their qualifications. Once the match is made, then the hidden information is revealed. There are already companies signed up to tap into a pool of women, veterans, and underrepresented minorities. It’s free for job seekers to upload a profile, but companies are paying $400 per job listing on the app. "The price drops when companies add 5+ jobs," Lampkin tells Fast Company. The company is not releasing user numbers at this time.
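Blendoor’s code is not public, but the blind-matching flow described above can be sketched in a few lines of Python. All field names and the profile below are hypothetical illustrations, not Blendoor’s actual data model:

```python
# Hypothetical sketch of blind matching: identifying fields stay hidden
# until both sides swipe right, at which point the full profile is revealed.

HIDDEN_FIELDS = {"name", "age", "photo", "employment_history", "criminal_background"}

def blind_view(profile):
    """Return only the merit-based fields an employer sees before a match."""
    return {k: v for k, v in profile.items() if k not in HIDDEN_FIELDS}

def match(candidate_swipe, company_swipe, profile):
    """Reveal the full profile only on a mutual right-swipe."""
    if candidate_swipe == "right" and company_swipe == "right":
        return profile          # match: hidden information is revealed
    return blind_view(profile)  # no match: candidate stays anonymous

profile = {
    "name": "Jane Doe", "age": 29, "photo": "jane.jpg",
    "employment_history": ["Acme Corp"], "criminal_background": None,
    "skills": ["Python", "SQL"], "education": "BS Computer Science",
}

print(blind_view(profile))               # skills and education only
print(match("right", "right", profile))  # full profile after a mutual match
```

The key design point the article describes is the ordering: qualifications drive the match first, and identity is disclosed only afterward.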
Merit-based profiles tackle the first (arguably, most crucial) part of the vetting process. Yet Lampkin’s personal experience of rejection after eight rounds of interviews could indicate that when a candidate's full profile is revealed, there is still a chance they will be discriminated against. Lampkin says, "The antidote is transparency into recruiter behavior through data."
She explains that Blendoor tracks candidates as they are matched, screened by phone, interviewed, and hired. "We are providing hiring managers, HR, and other relevant executives data that reveals a scorecard of sorts, which shows if and how certain recruiters are clearly discriminating against certain demographics of qualified people," she says. "The hypothesis is behavior changes when people become more conscious of their actions because they know they’re being watched and measured," Lampkin maintains.
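The scorecard Lampkin describes is, in essence, funnel analysis. A minimal Python sketch of how such conversion counts might be tallied per recruiter and demographic (the event data and recruiter names are invented for illustration; Blendoor’s actual pipeline is not public):

```python
from collections import defaultdict

# Funnel stages in order; reaching a stage implies passing the earlier ones.
STAGES = ["matched", "phone_screen", "interview", "hired"]

# Hypothetical events: (recruiter, demographic, furthest stage reached).
events = [
    ("r1", "group_a", "hired"),    ("r1", "group_a", "interview"),
    ("r1", "group_b", "matched"),  ("r1", "group_b", "matched"),
    ("r2", "group_a", "phone_screen"), ("r2", "group_b", "hired"),
]

def scorecard(events):
    """Count, per recruiter and demographic, how many candidates reached
    each stage of the funnel (cumulative: 'hired' also counts as matched,
    phone-screened, and interviewed)."""
    counts = defaultdict(lambda: defaultdict(lambda: dict.fromkeys(STAGES, 0)))
    for recruiter, demographic, stage in events:
        for s in STAGES[: STAGES.index(stage) + 1]:
            counts[recruiter][demographic][s] += 1
    return counts

card = scorecard(events)
# Comparing stage counts across demographics for the same recruiter is what
# would surface a drop-off pattern like the one Lampkin describes.
```

Here, recruiter "r1" advanced two group_a candidates to interviews but none from group_b past the match stage; a gap like that, visible in the data, is the kind of signal the scorecard is meant to expose.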
Such conversion data may also help members of the National Black MBA Association. Lampkin has discussed a partnership with the organization, citing both the direct traffic to the mobile app from companies and Blendoor’s goal of attracting mid-career people of color. She told The Root that these highly qualified candidates at investment or consulting firms "may not be aware of the opportunities at big companies."
Lampkin says it is important that Blendoor is regarded as a tech company that enables companies to find the best talent in a way that has been shown to increase diversity. "We are not another diversity services company," she emphasizes. "We are connecting companies to qualified candidates in a way that circumvents unconscious bias."
Blendoor, in turn, is collecting "very robust data sets that provide key insights and drive behavior that facilitates diversity and inclusion beyond the initial vetting stage," she says. However, diversity is the by-product, not the product. Unlike the NFL’s Rooney Rule, which has been criticized for only getting minority candidates to the beginning stages of the recruiting process, Lampkin underscores the product’s merit-based matching. "[This] undercuts the notion that diversity means lowering the bar or just checking a box like ‘Yes, we interviewed at least one black person for this role,’" she says.