Can Artificial Intelligence Wipe Unconscious Bias From Your Workday?

There’s a plethora of tech meant to make recruiting fairer. Joonko wants to help ordinary workers pinpoint bias throughout the workday.

Unconscious bias is exactly what it sounds like: The associations we make whenever we face a decision are buried so deep (literally–the amygdala, the brain structure most closely tied to these snap judgments, sits deep within the temporal lobes) that we’re as unaware of them as we are of breathing.

So it’s not much of a surprise that Ilit Raz, cofounder and CEO of Joonko, a new application that acts as a diversity “coach” powered by artificial intelligence, wasn’t even aware at first of the unconscious bias she was facing as a woman in the course of a normal workday. Raz’s experience coming to grips with that bias informs the way she and her cofounders designed Joonko to work.

The tool joins a crowded field of AI-driven solutions for the workplace, but most of what’s on the market is meant to root out bias in recruiting and hiring. Joonko, by contrast, is setting its sights on illuminating unconscious bias in the types of workplace experiences where few people even think to look for it.

Learning To See The Invisible

Before launching Joonko, Raz counted herself in that cohort. As a woman in the tech world, she’d spent eight years working in the male-dominated intelligence community, and with early-stage startups. Raz would “usually be one out of two women,” she recalls, “maybe alone,” noting that if there was another woman, she was typically in marketing or some other less “technical” role.

A couple of years ago, Raz joined a professional women’s group in Israel, where the members held jobs in product management, user experience, and the like. “Half of the time we talked about professional stuff, and half of the time we talked about gender biases in the workplace,” she says. Raz recalls dropping a bomb during the first meeting she attended.

“I came in and said there are no biases. I got whatever I wanted until now,” she’d reasoned to the other women in the group, “and I’m not going to come if I’m going to come here and hear everyone crying about not getting promotions.” The group urged Raz to stay–for two reasons, one for them and one for her. First, they believed it was important for the other women in the group to see that a female tech worker had risen through male-dominated ranks; second, they wanted Raz herself to understand that bias is more subtle than she’d imagined.

Eventually, “I had this aha moment,” Raz says. More than glaring discrimination like getting passed over for a promotion or enduring sexual harassment, Raz came to identify the so-called small stuff as evidence of bias, like an off-color joke “or who speaks in a meeting, whose opinion you pick at the end, or who gets the most critical task when you work on something.”

That led quickly to a second “aha moment”: Noticing that she too had been exposed to these small but daily occurrences triggered the realization that this was a bigger issue that no one was effectively tackling–in tech or any other industry.

What Bias-Detection Tech Is Already Doing–And What It Isn’t

It’s not for lack of spending. A lot of money and time have gone into diversity and inclusion initiatives and training. One estimate put the figure at $8 billion back in 2003, and that was before Intel’s $300 million diversity pledge, Google’s $150 million investment, and the subsequent contributions of other companies toward fixing these issues. An entire industry has sprung up alongside these efforts just to deal with the lack of diversity in Silicon Valley.

But so far, a lot of these resources have been focused on addressing the hiring process. An integral part of the problem, after all, is getting enough diverse candidates into the recruiting pipeline so they can be considered for jobs. Apps like Blendoor hide a candidate’s name, age, employment history, criminal background, and even their photo so employers can focus on qualifications. Interviewing.io’s platform even masks applicants’ voices. Textio uses AI to parse communications in order to make job postings more gender-neutral. Unitive’s technology also focuses on hiring, with software designed to detect unconscious bias in applicant tracking systems that read resumes and decide which ones to keep or scrap based on certain keywords.

But as Intel recently discovered, hiring diverse talent doesn’t always mean they’ll stick around. And while one 2014 estimate by Margaret Regan, head of the global diversity consultancy FutureWork Institute, found that 20% of large U.S. employers with diversity programs now provide unconscious-bias training–a number that could reach 50% by next year–that training doesn’t always work as intended. The reasons why vary, from companies putting programs on autopilot and expecting them to run themselves, to the simple fact that many employees who are trained ultimately forget what they learned a few days later.

Joonko isn’t trying to solve those problems. “We didn’t even start with recruiting,” Raz admits. “We started with task management.” She explains that when a company finally hires a diverse candidate, it needs to understand that the best way to retain them is to make sure they feel included and are given the same opportunities as everyone else. That’s where Joonko sees an opening.

Finding A Foothold In Daily Workflows

“We try to catch these ‘micro-events,’” says Raz, and point them out to managers and workers immediately. Raz and her cofounders, Guy Grinwald and Elad Shmilovich, named Joonko after Junko Tabei, a Japanese mountain climber who became the first woman to reach the summit of Mount Everest, and who died last year.

Joonko is aimed at companies and individual managers, not recruiting firms and hiring managers. The application uses artificial intelligence and machine learning to help people become more aware of how unconscious biases are shaping their workplace and their teams on a daily basis, and then helps them change their own behavior in real time within the project management and communication tools they already use, like Trello, Asana, or Slack.

Joonko got its start in beta as part of Techstars Atlanta in September 2016. Since then, cofounder Shmilovich tells Fast Company that the platform has analyzed more than 103,000 cases, and several companies have participated in the free pilot project. Raz is staying mum on which companies are currently using Joonko, but she says she’ll be able to announce the names soon.

This month Joonko is adding a “personal use” version for sales managers using Salesforce or developers using Jira who want to support diversity and inclusion directly with their teams. Raz says individual managers kept approaching the company through word of mouth, asking if they could use the solution on their own without having to get the entire enterprise on board.

The way Joonko works is straightforward. The AI analyzes salespeople’s performance based on total experience, success rate, and tenure at the company. When the right opportunity arises, the manager gets a notification, sent by email, encouraging them to offer it to someone who may be getting overlooked. Managers may also get alerts when one person on their team is consistently assigned fewer or less critical tasks, a pattern that can signal unconscious bias.
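In practice, an alert like that amounts to a fairly simple rule over assignment data. Here’s a minimal sketch of the idea in Python–the Task fields, the even-split “fair share” baseline, and the 50% threshold are all invented for illustration, not drawn from Joonko’s actual system:

```python
# Hypothetical sketch of a "who's being overlooked?" check, loosely modeled on
# the kind of alert described above. Field names and thresholds are invented.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Task:
    assignee: str
    critical: bool  # e.g., flagged as high-priority in the project tracker

def overlooked_members(tasks: list, team: list, threshold: float = 0.5) -> list:
    """Flag team members whose share of critical tasks falls well below
    an even split across the team (threshold = fraction of that fair share)."""
    critical = [t for t in tasks if t.critical]
    if not critical or not team:
        return []
    counts = Counter(t.assignee for t in critical)
    fair_share = len(critical) / len(team)
    return [m for m in team if counts.get(m, 0) < threshold * fair_share]

# Example: Dana gets none of the critical work, so she would trigger an alert.
tasks = [Task("Alex", True), Task("Alex", True), Task("Sam", True), Task("Dana", False)]
print(overlooked_members(tasks, ["Alex", "Sam", "Dana"]))  # ['Dana']
```

A production system would presumably weigh the experience, success-rate, and tenure signals Raz describes rather than assuming an even split is the right baseline, but the shape of the check is the same: compare each person’s share of opportunities against the team’s, and nudge the manager when someone falls behind.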

For employees, Joonko triggers a different set of notifications. For instance, when they’ve reached a milestone or completed a project, the service prods them to share the results with their manager and team members. As more people use the service, Joonko will be able to measure the effectiveness of these recommendations on all sides.

For now, the cost is just $3 per month to start, with more advanced plans in the $6–$9 per month range. Shmilovich says the company is keeping the cost low on purpose. “It’s nothing sales managers can’t spend from their pocket.”

The Next Set Of Hurdles

One potential pitfall is that while AI is smart, it can’t always discern contextual cues–and for now, Joonko isn’t examining language. Shmilovich asserts that “the system analyzes behavior patterns of the manager, so it’ll be able to recognize the context of actions and decisions” in real time, which is Joonko’s primary focus.

For the tool to work well, it will need to adapt to some pretty idiosyncratic work scenarios. Joonko will launch a feedback option later this year, letting managers explain when something the platform is about to flag doesn’t actually stem from unconscious bias. For example, says Raz, a manager may be giving an employee fewer tasks because the employee specifically requested it.

Joonko may eventually expand into the annual performance review, another area fraught with behaviors and language that tend to marginalize underrepresented groups. For now, though, says Raz, “We actually want to help companies overcome daily biases.”

It’s a great place to start.

About the author

Lydia Dishman is a business journalist writing about the intersection of tech, leadership, commerce, and innovation. She is a regular contributor to Fast Company and has written for CBS MoneyWatch, Fortune, The Guardian, Popular Science, and The New York Times, among others.
