Ever wonder whether your mind is prone to taking endless shortcuts? The kind that lead us to the wrong answers?
Nobel Prize-winning psychologist Daniel Kahneman tells as much to Inc. magazine:
If I tell you, 'Here is a leader of a nation, and she's intelligent and strong.' If I ask you at this point if she's a good leader, you'll have an answer: She's a good leader. But the third word could be corrupt, and I haven't told you anything about character. You were not waiting. You took the information that you had and made the best story possible out of it. That's the way a mind works.
This is because, as Kahneman unpacks in his must-read Thinking, Fast and Slow, we're wired this way—to construct stories out of imperfect evidence. The data we have could be slight, partial, or biased, but we automatically make the best story possible.
The venerable psychologist explains that this is because of our antipathy toward effort.
"It's not necessarily that we don't like to work, but when there are two ways of doing the same thing, one easy and one hard, we naturally gravitate to the easy way. That's the right answer," he says. "So the bias toward finding the easy way means that sometimes we pick the easy way and we get to the wrong answer."
One of the easiest moves to implement is to track our decisions. Tech companies traffic in their users' data, so it would be natural for us to gather better data about our own behavior. That can start with keeping a decision-making notebook—if every time you come to an important decision you map out the inputs, Kahneman says, you'll better see the incompleteness of the story you're telling yourself.
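One lightweight way to keep such a notebook is a structured log. Here is a minimal sketch in Python; the field names and the crude "completeness" ratio are illustrative assumptions, not anything Kahneman prescribes:

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class DecisionEntry:
    """One row in a decision-making notebook."""
    decision: str                # what was decided
    inputs: list[str]            # evidence actually in hand
    unknowns: list[str]          # what was NOT known at decision time
    expected_outcome: str        # the story told at decision time
    logged_on: date = field(default_factory=date.today)

    def completeness(self) -> float:
        """Crude ratio of known inputs to all factors considered."""
        total = len(self.inputs) + len(self.unknowns)
        return len(self.inputs) / total if total else 0.0


entry = DecisionEntry(
    decision="Hire candidate A",
    inputs=["strong interview", "good references"],
    unknowns=["performance under deadline pressure", "team fit"],
    expected_outcome="Ramps up within a quarter",
)
print(f"Evidence covers {entry.completeness():.0%} of factors considered")
```

Forcing yourself to list the unknowns alongside the inputs is the point: the log makes the gaps in the story visible on the page instead of papered over in memory.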
Another tactic is to reframe your questions. We almost never do this, Kahneman says. Take a cut of meat, for instance.
. . . Would it be more attractive if I described it as 80% fat free than if I described it as 20% fat? Suppose I described it to you as 80% fat free. Nobody thinks, What if I described it as 20% fat? Nobody does that. You take the formulation you have and you live with it.
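The complementary framing is just arithmetic, which makes the habit Kahneman says nobody practices easy to mechanize. A small illustrative sketch (the function name and labels are my own, for illustration):

```python
def complementary_framing(percent: float, positive: str, negative: str) -> tuple[str, str]:
    """Return both framings of the same fact, e.g. '80% fat free' vs '20% fat'."""
    return (f"{percent:g}% {positive}", f"{100 - percent:g}% {negative}")


# Both strings describe exactly the same cut of meat.
upside, downside = complementary_framing(80, "fat free", "fat")
print(upside)    # 80% fat free
print(downside)  # 20% fat
```

The two outputs are logically equivalent, yet the first reliably sells more meat, which is precisely the framing effect at work.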
What we need to do, then, is acknowledge that we bring all sorts of unexamined assumptions into any situation, butchers or not. Since we all carry biases layered on biases, we need to apply an open, innovative mind in examining our blind spots—the start of design thinking.
Finally (though this list is hardly exhaustive), we need to be aware of the way our incentives shape our biases. Dan and Chip Heath have examined the way incentives backfire in the pages of Fast Company:
Take Merrill Lynch. In the book Riding the Bull, author Paul Stiles describes his experience as a new trader at the venerable investment bank. Merrill wanted Stiles, then 29, to trade complex international bonds in volatile markets. He tried asking advice of the seasoned traders, but they ignored him—a minute spent helping Stiles was a minute spent not adding to their monthly bonuses. They kept barking into their phones for hours at a time and yelled at Stiles every time his shadow fell across their computer screens. Eventually, Stiles was reduced to silently observing their behavior from a distance, like a rogue MBA anthropologist. It surely never dawned on the person who set up Merrill Lynch's incentive system that the traders' bonuses would make training new employees impossible.
The Bottom Line: While we tell ourselves the simplest stories about our situations, life—and innovation—is much messier.
[Image: Flickr user Ralph Hockens]