More companies are using AI to teach EQ. Here’s how it can go wrong

Technology is enabling better soft skills, but there are some unintended consequences.


Soft skills are the new career makers—and a host of new technology enablers have cropped up to help identify and optimize these skills for us. Companies see emerging tech such as virtual reality and artificial intelligence as an opportunity for machines to teach us more human skills such as emotional intelligence and agility at work.


The reality is that technology can only go so far. You can be empathetic to someone’s plight but still treat them poorly. The übercompetitive world of tech or finance lures people with fast cognitive processing. This may lead to quick responses that make millions on the trading floor but can cause trouble at the staff party.

Your capacity for empathy and skill at deploying it also waxes and wanes with your own physical and mental state. No VR program can cure the “hangries” when you haven’t eaten. Big data tells us that even judges make softer decisions after lunch. With that in mind, the most effective technological enabler of empathy may, in fact, be Uber Eats.

While technology can be useful, it’s becoming clear that it’s best suited to specific functions, such as finding candidates skilled in empathy and collaboration or training us out of unconscious bias. Many employees feel they already share so much at work that handing over additional personal data about their health, behavior, or matchmaking preferences is an invasion of privacy. They would rather the company just cough up for a Fitbit that isn’t connected to the company intranet.

Here are three ways technology is enabling better soft skills—and some unintended consequences.


Nudging

The behavioral science theory of nudging uses positive reinforcement to influence our actions. Imagine giving a dog a biscuit for sitting still. Governments have used this principle to increase savings rates by automatically enrolling workers in workplace pension schemes.

More companies are using the ubiquity of smartphones to nudge us to perform better and discourage us from poor habits that might affect work.


Potentialife, founded by an ex-McKinsey consultant and Harvard University’s leading professor of positive psychology, is one company that uses technology to help organizations develop thousands of people into high-performing leaders. Working with clients such as World Vision and Barclays, the consultancy sets up programs that help individuals optimize and measure strength, health, absorption, relationships, and focus. An app reminds employees to take action in support of these goals, tracks their progress, and aggregates the results across the company. For clients, the payoff can be a common language for peak performance, collective focus, and personal responsibility, with an ultimate eye toward profit.

Other companies use tech to nudge us away from destructive behavior.

Built on recurrent neural network technology, so-called woebots, available 24/7 through voice, text, or app-based chatbots, ask questions that prompt better choices or perspective. No more sitting down with emotional, entry-level staff after the latest breakup. There’s an app for that.

If giving staff a supercharged version of a healthy-workplace app keeps morale up, it seems worth trying. It is also a good way to reinforce learning and development through repetition.

However, if you take an informal survey, you’ll find that most people would prefer to define their health and relationship goals without the participation of the HR department. This is particularly true when you consider the impacts on insurance or privacy for individuals with health issues. Data is the fuel that many of these technologies run on, but privacy issues loom large.

AI-based matchmaking

With online dating now one of the most common ways couples meet, companies are considering how they can use similar tech to match us with colleagues at work.


It’s easier to teach technical skills than to teach people how to work well together. That’s the theory behind algorithmically matching teammates. It can also be a good option for teams that need to form for short periods of time.

Airbnb, for example, uses AI to pair people into minitribes at their group gatherings. Global education company iTutor uses eye tracking to sense the engagement of students so as to match them over time with specific types of teachers and other compatible students for group work. They found that some students just respond better to specific teachers.

Managers and staff shudder at this level of measurement. We aren’t ready for AI-generated scores for our likeability, or to forfeit blaming others for faulty team dynamics.

Dating apps aside, there is also distrust in letting machines make decisions around human relationships. If the algorithm has selected a dream team but you find it dysfunctional, how do you resolve the issue in a way that makes both the humans and the algorithms happy?

What AI-based matchmaking doesn’t do is take diversity and inclusion into consideration. If we leave it to algorithms to staff teams along dimensions of compatibility, we may streamline efficient output but reduce our ability to maximize creativity and problem-solving through a variety of perspectives. To prepare for unpredictable events and global challenges on the horizon, learning to work well with people vastly different from us will be a crucial human skill to cultivate.

Virtual reality

It’s only a short step to imagine using enhanced technologies such as virtual reality to develop what are deemed the right behaviors in employees.


NGOs use VR’s immersive storytelling powers to inspire empathy and behavioral change. Tests at Stanford’s Virtual Human Interaction Lab indicate that we will save more for retirement after experiencing ourselves as elderly. Leadership trainers are considering how we might erase unconscious bias by literally letting you walk a mile in another person’s shoes via the immersive nature of a VR experience. Companies such as Equal Reality have worked with everyone from Procter & Gamble to the Australian Department of Defence to give staff a sense of what it feels like to experience discrimination firsthand.

Chris Milk called VR the “ultimate empathy machine” in his TED talk. But what happens when, overwhelmed by the many nudges of tech-enabled behavioral science, you become numb?

Professor Erick Ramirez at Santa Clara University worries VR will cement unconscious bias if we use new tech to put ourselves in another’s shoes and think, “It’s not so bad.” However, Rick Martin, cofounder of Equal Reality, says VR “uncrosses arms and creates a common shared experience.” When people can feel and empathize more, it leads to “a deeper dialogue, where people are more motivated to engage in topics.”

Cambridge Analytica’s social media profiling of voters shows how people can be manipulated by ad targeting they aren’t aware of. The key in all of this is ethical oversight: who decides what is “right” and what is antisocial behavior? Staff need to be educated about the ultimate goals of a training program and give permission before taking part.

Room to learn

Many HR professionals conclude that technology is a useful enabler of new ways of training, but, at the moment, it’s best focused on specific recruitment and training needs with clear boundaries. Virtual first aid, factory training, or collaboration software for projects all have practical uses and little downside.

One day, empathy may evolve into an actual job category with specific training and enhancement, as in the “empathy tracker” protagonist of the near-future thriller Last Tango in Cyberspace.


For now, the best way to enhance soft skills is to invest less in technology and more in ourselves.

Diana Wu David is the author of Future Proof: Reinventing Work in the Age of Acceleration.