Analyzing The Subtle Bias In Tech Companies’ Recruiting Emails

A close study of word choice in job descriptions and recruiting emails reveals how tech companies are inadvertently hindering diversity.


A year after Microsoft CEO Satya Nadella backpedaled from a gaffe at a women’s tech conference and announced a major employee diversity push, Microsoft reported in November that the share of women in its tech roles had actually gone down. At the same time, the numbers for African Americans and Latinos had barely budged, leaving them with just over 6% of tech jobs at the company. Results for other major technology firms, despite their public pledges, aren’t much better. Which raises the question: Is all this talk of diversity just empty words?


According to linguist and cognitive scientist Kieran Snyder, empty words in documents like job descriptions could be precisely what’s hurting diversity, by discouraging people from even applying. “When you use language that kinda counts as corporate jargon—synergy being the most hilarious example . . . your job listing becomes less popular,” says Snyder, who is also a former Microsoft engineer and program manager. “Everybody hates that language, but underrepresented people hate it more, probably because it’s a cultural signifier of some kind. It sort of communicates, this is an old-boy’s network kind of company.”

Can AI Solve Unconscious Bias?

Snyder’s conclusions come from crunching the data from her Seattle startup, Textio. Its web-based application, used by companies including Twitter and Microsoft, applies a form of artificial intelligence (AI) called natural language processing (NLP) to study the verbiage in documents. Resembling a spelling and grammar checker, it flags words and phrases that are clichéd, gender-biased, or otherwise off-putting as someone types them. Textio started with a product for writing job descriptions, because there is plenty of concrete data on how they perform, such as how long it takes to fill a job and how many people apply.
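To make the spell-checker analogy concrete, here is a minimal sketch of the flagging idea: scan text for phrases from a lookup table and report where they appear. This is not Textio’s actual model, which is trained on hiring-outcome data; the phrase list below is illustrative, drawn only from examples mentioned in this article.

```python
import re

# Illustrative lexicon only -- real tools learn these associations from data.
FLAGGED_PHRASES = {
    "synergy": "corporate jargon",
    "rock star": "cheesy / masculine-coded",
    "ninja": "cheesy / masculine-coded",
}

def flag_phrases(text):
    """Return (phrase, label, position) for each flagged phrase found."""
    hits = []
    lowered = text.lower()
    for phrase, label in FLAGGED_PHRASES.items():
        for match in re.finditer(re.escape(phrase), lowered):
            hits.append((phrase, label, match.start()))
    # Sort by position so results read in document order.
    return sorted(hits, key=lambda h: h[2])

posting = "We need a rock star engineer who thrives on synergy."
for phrase, label, pos in flag_phrases(posting):
    print(f"{pos:3d}  {phrase!r:12} {label}")
```

A real checker would also need stemming, phrase variants, and scoring against outcome data, but the underlying interaction, highlight-as-you-type, is this simple.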

Textio’s web-based word processor highlights words with special meaning.

The Surprising Problematic Words

Common stereotypes in business might be that men are better leaders or that women are emotional or harsh. Snyder highlighted some of that in an August 2014 Fortune article “The Abrasiveness Trap,” in which she analyzed employee reviews and found words like bossy, abrasive, strident, and aggressive popping up in women’s evaluations.

Related: The One Word Men Never See In Their Performance Reviews

Clichés can signify that a business is set in its ways, says Joelle Emerson, founder and CEO of Paradigm, a company that helps clients like Airbnb, Pinterest, and Slack improve their employee diversity policies. “Companies . . . embody that mind-set in the language that they use,” she says. “Because your mind-set is fixed and not variable, you may be more likely to rely on stereotypes.” (Emerson uses Textio with clients and is on the company’s advisory board.)

Related: Tech’s Big Gender Diversity Push One Year In


Until recently, the exclusive language was blatant, says Emerson. “A year ago, two years ago certainly, we saw tons of job descriptions in the tech industry using male pronouns,” she says. “That’s crazy.” But just plain cheesy language is also offputting. Companies may say they are looking for a rock star or ninja to join their team, and white men are more likely to identify with or aspire to such descriptions, says Snyder. Underrepresented people, instead, feel they still have to prove themselves in the job—to break negative stereotypes, says Emerson.

To back this up, Emerson cites an influential 2014 study with the ungainly title, “A Company I Can Trust? Organizational Lay Theories Moderate Stereotype Threat for Women.” One of the main findings: Companies that emphasize employee growth and development are more appealing, and less threatening, to women than ones that boast about hiring people who are already awesome.

The Importance Of Language In Recruiting Emails

Textio initially marketed its program only for analyzing job postings, but Snyder found that about 10% of all text entered into the app is emails, mostly from job recruiters. Those entries provided Textio enough data to formally expand its support to recruiting emails. Here, her analysis shows, cliché language is an even bigger turnoff. Among the annoying kinds of phrases: “I’d love to invite you to this event to network,” “I came across your profile,” “candidates like yourself,” and “Is it time for you to make a change?” People expect emails to be more personal, says Snyder, so jargon is even more annoying here. “People who use corporate jargon are traditionally white and male,” she tells me, in a follow-up email, “simply because people in the corporate world have tended to be white and male.”

The irony is that recruiters are targeting women and minorities ever more aggressively. “When you’re an underrepresented group—and companies have paid increasing attention to inclusivity in their teams—the demand for your attention goes up,” says Snyder. “I have a friend who is a Latina developer, very strong, great history . . . and she receives a hundred of these outreaches every week.” The messaging isn’t always working, because it may sound too much like recruiters are approaching people specifically because they are women or minorities, rather than based on their skills and accomplishments. “You don’t want to make people feel exceptionally targeted for demographic reasons,” says Snyder.

After the recruiting stage, the barriers to increasing diversity get even higher, says Emerson. “Companies have constructed interview processes that are not actually designed to identify the best candidate,” she says. “They are really designed to identify candidates that are very similar to the people already there.” This takes the form of the “culture-fit” evaluation. Sometimes culture fit is one criterion that interviewers use; other times, it gets its own interview. The criteria for fitting in are vague, says Emerson, and not the best measure of whether someone will be good at their job.

By coding words used in the notes from interviews her clients have conducted, she’s found that poor language choices play a role here, too. Words and phrases like lack of confidence, nervous, and anxious are often used in feedback about women candidates, but not about men. “Is there a bias,” says Emerson, “or are female candidates actually exhibiting as being more nervous? Which wouldn’t be surprising . . . because a lot of what can happen when you are underrepresented is, you feel nervous.”


Even after the hire, companies can make the same mistake as they do in recruiting if they promote the idea that their employees are rock stars or geniuses. “If you are someone from an underrepresented background, around whom there are negative stereotypes about things like genius and brilliant, you start to question, oh, am I really going to fit in here,” she says. Emerson doesn’t have data to show if the feeling of isolation is bad enough to make people quit, but she says the fear that they don’t fit causes employees anxiety that hurts their job performance—another conclusion from the 2014 Stereotype Threat study.

Language problems continue into performance reviews, says Emerson, mentioning research by Snyder and others that points to stereotyping. Performance reporting is one of the next areas for Textio to expand into, says Snyder, who expects to have a beta version early next year. “How do you write performance feedback in a way that retains strong people?” she says. “I think it’s going to be very interesting to connect the prehire data with the post-hire data, because then you can really start telling a story.”

Related: How To Start Fixing Tech’s Diversity Problem

About the author

Sean Captain is a Bay Area technology, science, and policy journalist. Follow him on Twitter @seancaptain.