Did you know that using phrases like “a proven track record” in job postings results in more male applicants, whereas “a passion for learning” attracts more female applicants?
These findings come from Textio, a startup that launched last year and recently raised $1.5 million for software that promises to spot gender bias in job descriptions and performance reviews. Bias-detection tools like Textio’s are becoming big business, particularly in Silicon Valley, where the percentage of underrepresented minorities is so low that employers shouldn’t trust their own judgment anymore. After all, if we learned nothing else from Ellen Pao’s landmark gender discrimination case, it’s that inequality is never as black and white as some of us believe.
Unconscious bias, often referred to as “second-generation discrimination,” can be as subtle as the language used to describe men and women during performance reviews, as tech company Kanjoya discovered with its emotion-aware language-processing technology. For example, Kanjoya’s technology, developed over eight years in collaboration with linguistics experts from Stanford University, finds that the word “assertiveness” is used to describe women negatively in reviews, but is correlated with positive reviews and promotions for men.
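Kanjoya’s models are proprietary, but the core idea, that the same word can predict different outcomes depending on whom it describes, can be illustrated with a toy calculation. The sketch below uses invented review snippets and simple counting (not Kanjoya’s method or data) to compare promotion rates for reviews that mention “assertive,” broken out by group.

```python
from collections import defaultdict

# Hypothetical performance-review snippets: (text, reviewee_group, outcome).
# Invented purely to illustrate the idea; not Kanjoya's corpus.
reviews = [
    ("assertive and decisive in meetings", "men", "promoted"),
    ("assertive, a strong leader", "men", "promoted"),
    ("too assertive with colleagues", "women", "not promoted"),
    ("assertive to a fault", "women", "not promoted"),
    ("assertive and effective", "women", "promoted"),
    ("quiet but reliable", "men", "not promoted"),
]

def outcome_rates(reviews, word):
    """Among reviews mentioning `word`, compute the promotion rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [promoted, total]
    for text, group, outcome in reviews:
        if word in text.lower():
            counts[group][1] += 1
            if outcome == "promoted":
                counts[group][0] += 1
    return {g: promoted / total for g, (promoted, total) in counts.items()}

rates = outcome_rates(reviews, "assertive")
print(rates)  # {'men': 1.0, 'women': 0.3333333333333333}
```

In this made-up sample, “assertive” appears only in promoted men’s reviews but mostly in unpromoted women’s, which is the kind of asymmetry the article describes; a real system would control for context, review volume, and confounders rather than count raw co-occurrences.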
Armen Berjikly, Kanjoya’s founder and CEO, tells Fast Company his company spent years compiling data before the technology itself was developed, and that’s what gives Kanjoya its “precise” ability to “read between the lines of what someone’s trying to communicate.” Its tools uncover subtle differences in opinions, attitudes, and sentiment in conversations and can detect the earliest signs of bullying, harassment, and discrimination. Berjikly says Kanjoya’s technology can also recognize human emotions and intent in language.
After building the technology, Berjikly realized there are two areas where it would have the most value. “One is around your customers and the experience they’re having with your brand, which is enormously emotional,” he says. “The second place is the internal world, which looks at how emotional the decision is to keep working somewhere, or to work really hard, or to care more than you should or less than you should.”
“When we looked at those two worlds, the area that we thought we could make the most impact, where people were the least understood, but yet affected the biggest part of their lives was this employee world,” explains Berjikly. “We spend, at least in my world, more time at the office than we do at home, with our coworkers than we do with our families. And there’s not a person that we know who isn’t feeling like they’re not totally understood at work or that their manager, their company doesn’t get them.”
So far, it seems Silicon Valley has embraced Berjikly’s technology. Founded in 2007, the company has raised $20 million in venture capital from investors including D.E. Shaw, Floodgate, and SV Angel, and its clients include Cisco, Twitter, and Genentech.
Another company hoping to solve tech’s diversity problem, Unitive.works, set to launch in June, will use its technology to flag and map unconscious bias in real time. A Silicon Valley veteran, founder Laura Mather is no stranger to being in a room with little diversity. As a result, Mather says she wanted to use her experience in “developing enterprise software with a twist in human behavior analytics” to solve unconscious bias.
Unitive’s technology focuses on the recruiting and hiring process and can disrupt unconscious bias in the moment it happens: when someone writing a job description, screening resumes, or conducting an interview introduces biases that aren’t relevant to whether a candidate can do the job.
“What is also great about how companies hire today is these processes are already completely operationalized through technology,” says Mather. “Candidates apply online, corporations track applicants online. What Unitive does is layer on top of these Applicant Tracking Systems (ATS) by integrating with available APIs to detect, interrupt, and record when biases occur.”
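Unitive hasn’t published how its bias detection works, but the simplest version of flagging gender-coded language in a job-description draft can be sketched with word lists. The lists below are illustrative only, seeded from the article’s own examples (“proven track record” vs. “a passion for learning”), and are not Unitive’s or Textio’s actual lexicons; real tools use validated word lists and context-aware models.

```python
import re

# Illustrative word lists only; not any vendor's actual lexicon.
MASCULINE_CODED = {"proven", "track", "record", "dominant", "competitive"}
FEMININE_CODED = {"passion", "learning", "supportive", "collaborative"}

def flag_coded_language(text):
    """Return which gender-coded words appear in a job-description draft."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

draft = "We want someone with a proven track record and a competitive spirit."
print(flag_coded_language(draft))
# {'masculine': ['competitive', 'proven', 'record', 'track'], 'feminine': []}
```

A check like this could run as a layer on top of an applicant tracking system, in the spirit Mather describes, flagging coded phrasing while the description is being written rather than after candidates have already self-selected out.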
As more companies set out to solve our modern-day diversity problem, others, like Google and defense contractor BAE, are trying to do it on their own with internal training programs aimed at helping employees become “more aware” of unconscious bias. While giant companies may have the funds to develop their own programs, Joelle Emerson, CEO and founder of strategy firm Paradigm, doesn’t recommend that other companies try to spot unconscious bias on their own. After all, these biases are called “unconscious” for a reason.
“We have this big body of research that tells us the things we should be doing, and I don’t think a lot of these companies are doing these things,” says Emerson. “I don’t think it’s their fault. I think there’s a big disconnect between the academic world and the world of practitioners.”
As a former sexual harassment lawyer, Emerson saw an opportunity to bridge that disconnect when she began seeing patterns in cases where “if a few things had gone differently earlier in the life cycle of the [problems], the [cases] wouldn’t have had to happen,” she says. Since launching in November 2014, Paradigm has worked with 11 companies to help promote diversity.
After years of getting away with ignoring the diversity problem, companies are finally being forced to take a long, hard look at how their biases impact their decision-making. According to an article in the Wall Street Journal, as many as 20% of large U.S. companies provide unconscious bias training to their employees, and that figure could reach 50% in the next five years. We know everyone has hidden biases. No one is blaming you if you didn’t know male managers used the words “assertive,” “playful,” and “funny” in reviews right before demoting their female employees, yet promoted male employees described with the same words. But with more companies now helping to uncover unconscious discrimination, there’s really no excuse for tech not to solve its damaging diversity problem.