Quick decisions save time and energy, but sometimes those knee-jerk reactions lead to bad choices. That’s because biases impact our thinking every day, but few of us even know they exist, says Norma Montague, assistant professor of accounting at Wake Forest University in Winston-Salem, North Carolina.
“The word bias has a negative connotation, but it’s most often unintentional and a result of heuristics: mental shortcuts that allow people to make quick, efficient decisions,” she says. “Good decisions are often the result, but not always.”
Heuristics work well because they’re often systematic and predictable, but problems arise when individuals habitually rely on these shortcuts, excluding or ignoring additional information. Montague, whose research on the topic has been published in the Journal of Accountancy, gives the example of someone who lives in New York City: “There are a lot of one-way streets, and natives accustomed to the traffic flow are being efficient if they look only to the right for oncoming traffic,” she says. “If we were to take that New Yorker to London, where traffic flows in the opposite direction, their mental shortcut could have a bad outcome.”
While Montague’s research focuses on bias in accounting, her findings apply to any profession. She shares five biases that can influence your thinking without your knowing it, and how you can avoid making a bad decision as a result:
1. Availability bias
If you rely on the information that is most readily available when making a decision, you might be missing facts or opinions that could make a difference, says Montague.
“Individuals have a tendency to make decisions based on whatever information is easily retrievable to them,” she says. “This can be problematic when making decisions that involve other people, as their information or perspective may differ.”
Availability bias is especially misleading when information is subjective. When people are asked to evaluate their own performance relative to the performance of others, for example, most rate their own contribution higher, because that is the information they have most available, says Montague. Avoid this bias by routinely asking others for feedback before making a decision.
2. Anchoring bias
If you’re assessing a situation and you’ve been given an “anchor” fact, you could come to an incorrect conclusion by relying on it. Montague tested this bias by giving half her class the arbitrary number 300 and the other half the number 3,000. She then asked students to estimate the length of the Mississippi River. The average response from students who had been given the anchor of 300 was 800 miles, while students who had been given 3,000 gave an average response of 2,800 miles.
Anchors are a popular tactic in sales, says Montague. “When you buy a car, for example, salespeople deliberately throw out an anchor number, because they know the general population will insufficiently adjust from there,” she says.
Avoid this bias by verifying facts you’re given. And if you are going into a negotiation, Montague says, it’s to your advantage to be the first one to throw out the anchor.
3. Overconfidence bias
While overconfidence is a personality trait often seen in top executives, it can produce a bias that leads to bad decisions, such as overpromising, says Montague.
“Decision makers can overestimate their own abilities to do a task,” she says. “If you’re overconfident and don’t perform, you will let down your team or your company. Interestingly, some say this is a good bias.”
While this bias is more difficult to avoid, it helps to slow down your decision and consult with others on your team to make sure what you’re promising is realistic.
4. Confirmation bias
People who seek only evidence that supports their beliefs or expectations will make decisions affected by confirmation bias.
“This bias is often used when you’re in a debate and you need facts to support your desired outcome,” says Montague. “The problem comes when disconfirming evidence surprises and weakens your position.”
Avoid confirmation bias by applying professional skepticism to your decisions: “Consider the opposite or explain why your initial assessment could be incorrect,” she says. “This exercise forces you to take the time and mental effort to thoughtfully consider the limitations of your chosen solution.”
5. Rush-to-solve bias
The strong desire to make a quick decision can lead to a rush-to-solve bias: people in a hurry often fail to consider all of the relevant data before making their decision. Montague says environmental factors like time and budget constraints often push people into a rush to solve.
“If you’re in a hurry, you’re also more likely to fall prey to other biases,” she says. Avoid this bias by slowing down decisions whenever possible. “Awareness is the first step to improving quality of judgment,” she says.