It’s Saturday afternoon in midsummer. A man named Gary Klein sits in a Cleveland fire station, waiting for the next alarm to blare. Klein, 56, is a cognitive psychologist — a cartographer of the human mind who maps how people perceive and observe, think and reason, act and react. He has left the sterile setting of the laboratory, where his peers scrutinize humans as if they were rats in a maze, in order to investigate real people operating in the real world.
Klein and his research team are attempting to crack a mystery that has intrigued psychologists for decades: How do people who work in unpredictable situations make life-and-death decisions? And how do they do it so well? According to decision-making models, they should fail more often than they succeed. There is too much uncertainty and too little time for them to make good choices. Yet again and again, they do the right thing. Klein wants to know why.
At 3:21 PM, the alarm goes off. Klein, an assistant, and an emergency-rescue crew scramble aboard an EMS truck. Three minutes later, they pull up to a house in a suburban neighborhood. A man is lying facedown on the front lawn. Blood is pooling all around him. He slipped on a ladder and pushed his arm through a plate-glass window, slicing an artery. The head of the rescue team — Klein calls him “Lieutenant M” — quickly estimates that the man has already lost two units of blood. If he loses two more, he’ll die.
Even as he leaps from the truck, the lieutenant knows from the amount of blood on the ground that the man has ripped an artery. In an instant, he applies pressure to the man’s arm. Emergency-medical procedure dictates that the victim should be checked for other injuries before he is moved. But there isn’t time. The lieutenant orders his crew members to get the man into the truck. As the vehicle races to the hospital, a crew member puts inflatable pants on the victim to stabilize his blood pressure. That sequencing marks another real-time judgment call: Had they put the pants on the victim before moving him, the crew would have lost precious seconds.
The ambulance pulls up to the hospital’s ER. Klein looks at his watch: It’s 3:31 PM. In a matter of minutes, the lieutenant made several critical decisions that ultimately saved the man’s life. But he ignored the conventional rules of decision making. He didn’t ponder the best course of action or weigh his options. He didn’t rely on deductive thinking or on an analysis of probabilities. How did he know what to do? When Klein asked him, the lieutenant shrugged and said that he simply drew on his experience.
For Klein, “experience” is not a satisfactory answer. Yet most of the time, that is the only answer that he gets. Even expert decision makers — from veteran firefighters to battle-tested software programmers — are often unable to explain how they make decisions. “Their minds move so rapidly when they make a high-pressure decision, they can’t articulate how they did it,” says Klein. “They can see what’s going on in front of them, but not behind them.”
This age-old enigma pushed Klein, more than 20 years ago, to launch a research company that would do what expert decision makers couldn’t. Klein Associates Inc. has studied men and women working in intensive-care units, Blackhawk helicopters, fire stations, and M-1 tanks. In the process, Klein’s cognitive detectives are gaining valuable insight into how people harness their intuition to help them make decisions under extreme pressure. Klein calls these abilities “sources of power,” a phrase that became the title of a clear-eyed, engaging book, Sources of Power: How People Make Decisions (MIT Press, 1998).
Klein’s investigation into real-world decision making is yielding valuable lessons for businesspeople as well. Over the past 20 years, he and his colleagues have worked with Amoco (now BP Amoco), Duke Power Co., and the world’s largest airlines — as well as with the U.S. armed forces — to help those organizations build faster, better decision makers. Team leaders and foot soldiers of the new economy battle shifting goals, missing information, nonstop confusion, and do-or-die deadlines — and still must make choices. By getting people to narrate their high-stakes decisions — that is, to tell their stories — Klein puts himself in a position to see what they know, and to understand the inner workings of how they make decisions.
Klein told his own story in a conference room tucked into the back of his Fairborn, Ohio office building. With his graying beard, white oxford shirt, and affection for polysyllabic speech, he could easily be mistaken for a university professor. He is courteous, but he is also a fearless thinker. A student of human intuition, he had the courage to bet his career on the hunch that people have grossly underestimated the power of gut instinct.
How to Size Up a Big Decision
“I suppose I was led astray by a book,” recalls Klein. He was working as a civilian psychologist at Wright-Patterson Air Force Base in Fairborn when the philosopher Hubert Dreyfus published the controversial book What Computers Can’t Do: The Limits of Artificial Intelligence (Harper & Row, 1979). “Dreyfus argued that people are injecting meaning into everything around them,” Klein continues. “And because they are active interpreters of their world, their experience cannot be deconstructed into the kinds of rules that will fit into expert systems.”
The book rocked the artificial-intelligence community, which derided Dreyfus as an ignorant outsider. But for Klein, Dreyfus’s argument was a revelation. Klein had been helping the Air Force to develop a training program using flight simulation at Wright-Patterson, and he had noticed that novice fighter pilots were trying to follow the classic decision-making model, which was similar to the one being used to construct artificial-intelligence systems: They used deductive logical reasoning to help them make deliberate choices. But as the trainees put in hundreds of hours of flying time, and as their skills and experience grew, they abandoned the model.
“I had a conversation with an instructor pilot that really stuck with me,” recalls Klein. “When he first started flying, he was terribly frightened. If he made a mistake, he’d die. He had to follow all of these rules and checklists in order to fly the plane correctly, and it was an extremely nerve-racking time. But at some point in his development, he underwent a profound change. Suddenly, it felt as if he wasn’t flying the plane — it felt as if he was flying. He had internalized all of the procedures for flying until the plane felt like a part of him. He no longer needed any rules.”
Six years after he founded his company, Klein won a major contract from the Army Research Institute, which asked him to study how people make decisions under time pressure and uncertainty. He decided to track firefighters. He moved into a firehouse in Cleveland and started his interviews. But there was a problem: Veteran firefighters said that they never made decisions. They would simply arrive at a fire, look it over, and attack it. Klein was horrified. “Here we’d just won this big contract, and we were focused on members of a community who said that they never made decisions.
“The commanders said fire fighting is just a matter of following routine procedures,” Klein continues. “So I asked to see the book in which all of those procedures were codified. And they looked at me as if I was nuts. They said, ‘Nothing’s written down. You just learn through experience.’ That word — ‘experience’ — became my first clue.
“I noticed that when the most experienced commanders confronted a fire, the biggest question they had to deal with wasn’t ‘What do I do?’ It was ‘What’s going on?’ That’s what their experience was buying them — the ability to size up a situation and to recognize the best course of action.”
Intuition Starts With Recognition
Klein’s breakthrough interview was with a fire commander who often claimed that he had ESP, or extrasensory perception. Klein made no attempt to hide his skepticism, but the commander insisted on telling his story: He and his crew encounter a fire at the back of a house. The commander leads his hose team into the building. Standing in the living room, they blast water onto the smoke and flames that appear to be consuming the kitchen. But the fire roars back and continues to burn.
The commander is baffled by the fire’s persistence. His men douse the fire again, and the flames briefly subside. But then they flare up again with an even greater intensity. The firefighters retreat a few steps to regroup. And then the commander is gripped by an uneasy feeling. His intuition (he calls it a “sixth sense”) tells him they should get out of the house. So he orders everyone to leave. Just as the crew reaches the street, the living-room floor caves in. Had they still been inside the house, the men would have plunged into a blazing basement.
Klein realized that the commander gave the order to evacuate because the fire’s behavior didn’t match his expectations. Much of the fire was burning underneath the living-room floor, so it was unaffected by the firefighters’ attack. Also, the rising heat made the room searingly hot — too hot for such a seemingly small fire. Another clue that it was not a run-of-the-mill kitchen blaze: Hot fires are loud, but this one was strangely quiet — because the floor was muffling the roar of the flames that were raging below.
“This incident helped us understand that firefighters make decisions by recognizing when a typical situation is developing,” says Klein. “In this case, the events were not typical. The pattern of the fire didn’t fit with anything in the commander’s experience. That made him uneasy, so he ordered his men out of the building.”
After many more interviews with veteran firefighters, Klein developed a radically different understanding of how intuition might work. Over time, as firefighters accumulate a storehouse of experiences, they subconsciously categorize fires according to how they should react to them. They create one mental catalog for fires that call for a search and rescue and another one for fires that require an interior attack. Then they race through their memories in a hyperdrive search to find a prototypical fire that resembles the fire that they are confronting. As soon as they recognize the right match, they swing into action.
Thought of this way, intuition is really a matter of learning how to see — of looking for cues or patterns that ultimately show you what to do. The commander who saved his crew didn’t have ESP; he simply had “SP.” His sensory perception detected subtle details — small-but-stubborn fire, extreme heat, eerie quiet — that would have been invisible to less-experienced firefighters. “Experienced decision makers see a different world than novices do,” concludes Klein. “And what they see tells them what they should do. Ultimately, intuition is all about perception. The formal rules of decision making are almost incidental.”
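For readers who think in code, the recognition process Klein describes — match the cues in front of you against stored prototypes, and let the best match dictate the action — can be caricatured as a lookup over cue sets. Everything below is an invented illustration: the prototype names, cues, and threshold are stand-ins, and real expertise is perceptual, not a dictionary.

```python
# A toy sketch of recognition-primed decision making (RPD).
# The catalog of prototypical situations is invented for illustration;
# an expert's "catalog" is a lifetime of perceptual experience.

CATALOG = {
    "kitchen fire": ({"smoke in kitchen", "moderate heat", "loud"}, "interior attack"),
    "basement fire": ({"extreme heat", "quiet", "stubborn flames"}, "evacuate"),
    "trapped occupants": ({"smoke", "people reported inside"}, "search and rescue"),
}

def recognize(observed_cues):
    """Return the action tied to the best-matching prototype.

    If no prototype overlaps the observed cues well enough, the
    situation is atypical -- and the mismatch itself is the signal,
    just as it was for the commander in the kitchen-fire story.
    """
    best_name, best_score = None, 0
    for name, (cues, _action) in CATALOG.items():
        score = len(cues & observed_cues)  # count matching cues
        if score > best_score:
            best_name, best_score = name, score
    if best_name is None or best_score < 2:  # arbitrary "typicality" threshold
        return "atypical -- pull back and reassess"
    return CATALOG[best_name][1]
```

In this sketch, the commander’s cues — extreme heat, eerie quiet, stubborn flames — match the “basement fire” prototype and immediately yield “evacuate,” with no weighing of options at all.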
The critical role of recognition in decision making came into sharper focus when Beth Crandall, 51, vice president of research operations at Klein Associates, got a contract from the National Institutes of Health to study how intensive-care nurses make decisions. In 1989, she interviewed 19 nurses who worked in the neonatal ward of Miami Valley Hospital in Dayton, Ohio. The nurses cared for newborns in distress — some postmature, some premature. When premature babies develop a septic condition or an infection, it can rapidly spread throughout their bodies and kill them. Detecting sepsis quickly is critical. Crandall heard dozens of stories from nurses who would glance at an infant, instantly recognize that the baby was succumbing to an infection, and take emergency action to save the baby’s life. How did they know whether to act? Almost always, Crandall got the same answer: “You just know.”
But once again, the more accurate answer was this: “recognition.” By asking each nurse to recall specific details of when she suspected sepsis, Crandall compiled a list of cues showing that the baby was in the early stages of an infection: Its complexion would fade from a healthy pink to a grayish green; it would cry frequently, but then one day it would become listless and lethargic; it would feed abnormally, causing its abdomen to distend slightly. Each of these cues is extremely subtle, but taken together, they are a danger signal to an experienced nurse.
“When we reviewed the list of cues with specialists in neonatology, we found that half of the cues had never appeared in medical literature at that time,” recalls Klein. “The head of the unit asked if we would train new nurses. We told her that everything on that list came from her own nurses. She said, ‘It doesn’t matter, we can’t articulate what we see anymore — or how we see it.’ So Beth developed and tested a series of training materials to help the nurses.”
Gut Choice, Best Choice
Still, Klein was troubled by another mystery. Once nurses and firefighters make a decision, how do they know whether their course of action is any good?
He thought he knew the answer after reading a study by Peer Soelberg, who taught a course on decision making at MIT’s Sloan School of Management during the late 1960s. Soelberg advocated a classic decision-making strategy: Identify options, evaluate them, rate them, and then pick the option with the highest rating. For his PhD dissertation, Soelberg decided to test whether his students would use this strategy to determine which job offer they should accept. To his great surprise, Soelberg discovered that his students rejected the very strategy that he had taught. Instead, they made a gut choice. Then they compared other job offers with their favorite, to confirm that their favorite was indeed the best offer.
Klein believed that fire commanders use the same tactic. Instead of weighing lots of options, he theorized that they make an instinctive decision — say, to attack a burning house from the rear — and then compare it with alternatives. “I thought I’d come up with a daring theory,” says Klein. “But the fire commanders insisted that they never considered options of any kind. As it turned out, my theory was way too conservative.”
Klein took a harder look at the commanders’ stories and began to understand why they don’t have to compare options. Once they make a decision, they evaluate it by rapidly running a mental simulation. They imagine how a course of action may unfold and how it may ultimately play out. The process is akin to building a sequence of snapshots, says Klein, and then observing what occurs.
“If everything works out okay, the commanders stick with their choice. But if they discover unintended consequences that could get them into trouble, they discard that solution and look for another one. They might run through several choices, but they never compare one option with another. They rapidly evaluate each choice on its own merits, even if they cycle through several possibilities. They don’t need the best solution. They just need the one that works.”
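The contrast between the textbook strategy (generate all the options, score them, pick the winner) and the commanders’ strategy (take options one at a time, run a mental simulation, and accept the first one that works) can be sketched as two tiny loops. The option lists and the `simulate` check below are hypothetical stand-ins, not anything measured in Klein’s studies.

```python
# Two decision strategies, sketched side by side. "simulate" stands in
# for the commander's mental simulation: does this course of action
# play out without unintended consequences?

def comparative_choice(options, score):
    """Textbook model: rate every option, then pick the highest-rated."""
    return max(options, key=score)

def singular_evaluation(options, simulate):
    """Klein's commanders: evaluate each option on its own merits,
    in the order it comes to mind; accept the first one that works."""
    for option in options:
        if simulate(option):
            return option
    return None  # no workable plan yet -- keep generating options

# Example loosely modeled on the highway-sign rescue: the first two
# plans fail their mental simulation; the third survives and is chosen
# without ever being compared against the others.
plans = ["harness from the front", "harness from behind", "ladder belt"]
survives_simulation = {"ladder belt"}
choice = singular_evaluation(plans, lambda p: p in survives_simulation)
```

The point of the sketch is the shape of the loop: `singular_evaluation` never ranks one option against another, which is why it can stop the moment a workable plan appears — and why, as the ladder-belt story shows, the plan it stops on is merely good enough, not guaranteed.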
Don’t Deliberate — Simulate
Klein’s interview with the commander of an emergency-rescue crew opened a window into the way that mental simulation works in the real world. The commander is called out to rescue a woman who fell off an elevated highway and landed on the metal struts of a sign that was directly underneath the roadbed. She is dangling there, semiconscious, when the rescue team arrives. The commander has a minute or two to figure out a way to pull the woman to safety.
As two of his men climb out onto the sign, the commander considers using a rescue harness to haul the woman back up to the overpass. But he realizes that his men would have to shift the woman into a sitting position before they could attach the harness, and she might slide off the sign supports.
He comes up with another approach: Instead of trying to snap a rescue harness onto the woman’s shoulders and thighs, his men could attach it from behind. That way, they wouldn’t have to move her before she was secured to a rope. But then he imagines that in lifting the woman, the harness would twist her back and injure her.
Then he comes up with a third idea: They’ll use a ladder belt — a strong belt that firefighters buckle over their coats when they scale ladders. His plan is to slide the belt under the woman, tie a rope around her and to the belt, and then lift her up to the overpass. He thinks his idea through again, likes it, and tells his crew to begin the rescue.
In the meantime, a hook-and-ladder truck arrives. That crew positions a ladder directly underneath the woman. A firefighter scrambles up the ladder just as the rescue commander orders his men to lift the woman using the belt and rope. As they lift her, the commander realizes that he’s made a terrible mistake: The ladder belt is too large for the woman. As the commander put it, “She slipped through the harness like she was a strand of spaghetti.” Luckily, she falls right into the arms of the crewman on the ladder.
Mental simulations aren’t foolproof, as this case shows. But many times, they succeed. And they are efficient. It took about 30 seconds for the commander to evaluate each choice and arrive at what he thought was a good solution.
“We used to think that experts carefully deliberate the merits of each course of action, whereas novices impulsively jump at the first option,” says Klein. But his team concluded that the reverse is true. “It’s the novices who must compare different approaches to solving a problem. Experts come up with a plan and then rapidly assess whether it will work. They move fast because they do less.”
The More You Know, the Faster You Go
If Klein is right, then organizations that teach decision-making skills by insisting that people generate large sets of options might actually slow decision makers down. Weighing options generally makes sense for novices, who need a decision-making framework to help them think their way through a problem, says Klein. But the way to get people past the beginner stage is to accelerate the growth of their experiences, so that they can rapidly accumulate the memories and the cues that will enable them to make faster, better decisions.
“I’ve been at commercial-airline conferences,” says Klein, “where pilots are given little laminated cards that have acronyms on them like STAR — Stop, Think, Analyze, Respond. It’s a dysfunctional strategy, because in a real emergency, pilots wouldn’t have enough time to use it.”
The best decision makers that Klein has seen are wildland firefighters, who are force-fed a constant diet of forest fires. They fight fires 12 months a year — in the western United States during the summer, and in Australia and New Zealand during the winter — and rapidly build a base of experience. And they are relentless about learning from experience. After every major fire, the command team runs a feedback session, reviews its performance, and then seeks out new lessons. Moreover, the people at the top start at the bottom. The lowest-level crew members know that their leaders have been in their boots and have felt their exhaustion. This breeds trust and confidence all the way down the line.
Marvin Thordsen, 50, a senior research associate at Klein Associates, watched a wildland-fire command staff take only a few days to assemble a team of 4,000 firefighters, drawn from all over the country, to fight a fire in Idaho that had engulfed six mountains. “It’s hard enough to make policy, to give direction, and to manage an intact organization of 4,000 people, even in a safe setting,” says Klein. “These guys created that organization in less than a week — and built in enough trust to risk people’s lives. They knocked us out.”
What does expertise feel like on an individual level? Klein answers this question with a final narrative. There is little drama in this story. No one is at risk. There are no last-minute rescues. It begins with a visit that Klein and his wife made to a county fair soon after they moved from New York to Ohio.
“A friend brought me to where the horses were being judged,” recalls Klein. “She tried to explain the characteristics of a good horse. Over the years, she had learned a lot about these animals, and she could see things that I couldn’t see. She had accumulated all of this knowledge, but it wasn’t a burden. She carried it all so easily. And I remember thinking, That’s expertise. That’s how it gets used.
“We sometimes think that experts are weighed down by information, by facts, by memories — that they make decisions slowly because they must search through so much data. But in fact, we’ve got it backward. The accumulation of experience does not weigh people down — it lightens them up. It makes them fast.”
Bill Breen (firstname.lastname@example.org) is a senior editor at Fast Company. We trust his instincts. Contact Gary Klein by email (email@example.com), or learn more about Klein Associates Inc. on the Web (www.decisionmaking.com).
Sidebar: Premortems: Fix Problems Before They Happen
One of the prime tools used in rapid decision making is mental simulation — the ability to evaluate a course of action by imagining how it may unfold and may ultimately play out. At Klein Associates Inc., work groups use a form of mental simulation called a “premortem” to discover a new project’s hidden flaws.
A premortem works like this: When a team gathers to kick off a new project, people conclude the meeting by pretending to gaze into a crystal ball. They look six months into the future, and the news is not good. Despite their hopes, the project has failed. Then team members take three minutes to run a mental simulation. They write down why they think their work derailed. All sorts of reasons emerge.
“People might say that I pushed the project in my own direction and created complications,” says Gary Klein, the company’s founder and its chief scientist. “Someone else will say that the project was too ambitious — that we should have streamlined it. I might say that the two people who led the project had other big responsibilities, and they blew the deadline.”
The group’s comments are unusually candid. The reason, says Klein, is that the conversation’s context is radically different from a critique. The entire focus is on trying to understand why the project failed. By looking six months into the future, people feel secure enough to say what they really think. Then they snap back to the present. Each comment is recorded, so that all of the members know the potential speed bumps before they go forward.
The exercise helps people work smarter. It keeps them from getting overconfident. And it seems to make sense. “With a postmortem, everything you learn is after the fact,” says Klein. “With a premortem, we give ourselves a chance to uncover problems and then fix them in real time, as the project unfolds.”