
We need to talk about the science behind implicit bias training

Implicit bias training is rarely effective. Here’s why we need a more comprehensive approach to diversity training.

[Photo: Na Inho/Unsplash]

In the midst of the chaos of the last presidential debate was an exchange that’s worth revisiting. President Donald Trump and former Vice President Joe Biden spoke about the Trump administration’s policy to limit the use of some kinds of diversity training in federal workplaces—particularly those that involve concepts related to “critical race theory” and “white privilege.”


Though Biden and Trump only discussed the topic briefly, it’s rare that diversity training gets discussed on such a major stage. As a psychology professor, I think it’s worth looking more closely at what traditional workplace trainings entail and what is actually needed to build equitable workplaces.

Creating a diverse and inclusive work environment is an important goal, and organizations benefit from employing a diverse workforce. Furthermore, there is a general recognition that bias in the workplace can decrease the diversity of an organization’s workforce and can limit the degree to which employees feel free to express themselves fully at work.

As a result, organizations have turned to a variety of programs to combat bias in hiring and in daily work interactions with an eye toward attracting, hiring, and retaining a diverse base of employees. Advances in the field of psychology have played a role in the development of some forms of bias training, but, unfortunately, the results have not always lived up to their promise.

To frame this discussion, let’s start by thinking about what it means to be biased. A bias is a tendency to respond in a particular way in a particular situation. Many of our biases are good ones to have. When faced with a dessert choice between ice cream and jello, I am biased to choose the ice cream, because I think I will like it better. And—on those rare occasions that I choose the jello—I discover that my bias for ice cream was warranted.

At times, though, biases are based on factors that can lead to undesirable outcomes. Gender and racial biases in the workplace work against the goals of diversity and inclusion. That is why de-biasing the workplace is important.

For a de-biasing program to succeed, it is important to understand the psychological processes that support these biases. I will highlight two of them here. One is that people might be aware of their preferences and can articulate the factors that guide their decisions. When someone is aware of their tendency to favor one type of option over others, that is called an explicit bias.


For example, I remember specific situations in which I have enjoyed ice cream, as well as situations in which I have had less-than-stellar jello desserts. So, my bias for ice cream over jello is explicit.

It is also possible for people to have biases that they are not aware of. These implicit biases come in two forms. One that has been studied widely in psychology involves associations between concepts, such that thinking about one concept also brings thoughts of another to mind. These implicit associations can influence how people react to particular situations. A person who has strong associations between white people and good things and between Black people and bad things may not be aware that they have these associations, but they may find it easier to think about situations that are consistent with these pre-existing associations. As a result, the positive aspects of a white applicant’s résumé might stand out more than the negative aspects, while the opposite might happen for a Black applicant’s résumé.

To the person evaluating the candidates, it might feel like they are making an objective decision, because they are not aware of the factors that are driving their attention to the information on the résumé. These associations among concepts can be measured with a number of tests. One of the most popular is the Implicit Association Test.

A second form of implicit bias comes from being unaware of assumptions that you make about others based on your own prior experience. Earlier this year, I helped the University of Texas plan for the Fall semester during the COVID-19 pandemic. Many people wanted the university to offer classes fully online. For many of these individuals, there was an implicit assumption that all of our students have good broadband access in their homes and quiet places to study.

While that may be true for some of our students, there are also students from rural areas with poor broadband access, and students who share their living space with an extended family, for whom the home environment would have made it difficult to succeed in online classes. Often, people were not aware that they were making this assumption about our students until it was pointed out to them.

A lot of effort has been focused on training that addresses implicit biases. Politically, this kind of training is often more palatable to employees than training that focuses on explicit biases. It does not force anyone to confront beliefs they hold and to work to change them, but instead allows people to hold onto the self-concept that they are basically good people who treat everyone equally—except for some associations that influence their choices beneath their awareness and some gaps in their knowledge. Asking people to confront their own racism is uncomfortable and leads to a lot of resistance.


Unfortunately, there is not a lot of evidence that attacking people’s implicit associations through training has much benefit. For one thing, a review of studies using the Implicit Association Test by Bertram Gawronski suggests that most people are aware of their biases—even when they involve associations between concepts. So, the evidence that people are truly unaware of their own biases is weak.

For another, a comprehensive analysis of techniques for reducing the influence of implicit associations by Calvin Lai and colleagues found that there are some techniques that can have an initial influence on people’s performance on the Implicit Association Test, but these effects are short-lived. So, it is unlikely that there are easy ways to train people out of these implicit associations.

Instead, a more comprehensive approach to bias needs to be taken. It is valuable to teach people about factors they are not currently considering in the decision process that ought to play a role. Once people became aware that many university students would not have sufficient broadband access or study spaces to learn effectively from home, their opinions about the value of reopening the university campus changed.

Organizations also need to think about the strengths and weaknesses of different internal procedures for reducing biases. Some organizations have tried to remove information about gender, race, and ethnicity from the initial evaluations of candidates. That can help reduce biases, but may not be effective when the organization is explicitly trying to promote people from under-represented groups within the organization.

Ultimately, short-term training is unlikely to be the solution to systemic bias. Organizations that are serious about diversity and inclusion must explicitly hire a diverse base of employees, provide mentoring of future leaders, and seek out opportunities to enable more women and people of color to take on key roles.

They must also be willing to make unpopular decisions. When there are many qualified candidates for a role, hiring and promoting members of under-represented groups can create tension among people who feel like they were passed over. And that means that a key part of training involves helping firms articulate their goals in a way that keeps the entire team motivated. Only by taking this more comprehensive and holistic approach can true progress be made.