

How Too Much Data Can Hurt Our Productivity And Decision-Making

Data findings are only valuable when they reliably change someone’s behavior for the better.

[Photo: Flickr user Idaho National Laboratory]

Wallis Simpson, the American socialite for whom King Edward VIII abdicated his throne, is quoted as saying, "You can’t be too rich or too thin." Today’s business equivalent is, "You can’t have too much data." Using advanced analytics to mine the ever-increasing cloud of digital dust can uncover hidden patterns and generate deep insights. Such is the promise of applied data science.

But the truth is you can have too much data. In fact, sometimes having more data can actually make things worse, leading us to act in ways that can be counterproductive.

Here’s why. As many bright folks have pointed out, data findings that aren't actionable aren't worth a thing. For example, a deep dive into who buys your widget doesn’t generate value unless it helps you focus your sales efforts on better prospects and away from people who will never buy your stuff.

In other words, data analytics is only valuable when it changes someone’s behavior. And this is precisely the problem. Human behavior is fraught with biases and blind spots, so there’s no guarantee that the change brought about by access to additional data will always be a positive one.

The Need To Explain

This concern is not just theoretical. My father-in-law, Ron, for example, underwent abdominal surgery a couple of years ago. About a day after the surgery, his gut hadn't passed anything, so his surgeon ordered an x-ray to make sure his plumbing wasn't blocked. Everything was fine, but the x-ray showed a clot in one area, which is also completely normal after this kind of surgery.

A week later, Ron had an infection near his surgical wound. The surgeon ordered another x-ray to see whether the infection had spread internally. It hadn’t, but he did notice that the clot had increased in size. Clots like these are concerning because eventually a piece might break off and cause serious problems. After conferring with a specialist, the surgeon recommended that Ron take a blood thinner for six months. At that point they’d check to see whether the clot had gotten any smaller.

But here's the problem: No one has ever formally studied whether blood thinners do more good than harm for patients whose clots are discovered the way Ron's was. Instead, nearly everything we know about blood thinners comes from studies of patients whose clots were discovered because they caused some type of symptom. We have absolutely no idea whether clots like Ron's keep growing, stay the same size, or even shrink on their own.

Instead, the recommendation to take a blood thinner reflects a behavioral hiccup that says more about human psychology than it does about the treatment of blood clots. People—including doctors—like to have explanations for what they do, even if it’s unlikely that anyone will ever ask for one.

It would've been far easier for the surgeon to explain why he treated the clot (e.g., "clots cause problems, this clot looks like it’s growing, and we have a medication that can slow it down and even make it go away") than why he didn’t.

Adrift In A Sea Of Incidental Findings

Ron's situation was what physicians call an "incidental finding": an abnormality detected unintentionally while dealing with some other, unrelated problem. Incidental findings happen every day as a result of collecting data you weren't actually looking for, and they're evidently troubling enough to warrant a report from the Presidential Commission for the Study of Bioethical Issues.

In a sense, nearly everything we learn from big data in business applications is an incidental finding. The data were collected for another purpose, and we’re poking around after the fact to see if we can find consistent patterns. But once we’ve found such patterns, we’re faced with making decisions at a granularity far finer than we’ve comfortably made in the past.

Suppose, for example, that your chief data officer has developed a method for estimating engagement at an individual employee level. She reports that your most highly engaged employees are more physically active, more likely to use their vacation time all at once, slightly more likely to have dress-code violations, and more likely to use public transportation. Should you enlarge your company’s fitness space? Change your paid-time-off policy to encourage less frequent, longer vacations? Ease the dress code? Offer discounts for the light rail?

Try any one or a combination of those things, but chances are you won't predict the outcome any better than if you had made the same adjustments without the data. Unless we're willing to run experiments to see what works, we're left with our intuition when deciding what to do with all that new information. And that means that sometimes we'll make things worse rather than better.
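For readers who want to see what "running an experiment" looks like in practice, here is a minimal sketch. All the numbers are hypothetical (nothing in this article reports them): imagine randomly assigning 400 employees to keep the old dress code and 400 to a relaxed one, counting how many in each group score as "highly engaged," and then asking whether the difference is larger than chance alone would produce, using a standard two-proportion z-test.

```python
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Compare two group proportions: did the change move the
    metric more than random noise plausibly would?"""
    p_a = success_a / n_a          # control group rate
    p_b = success_b / n_b          # treatment group rate
    # Pooled rate under the null hypothesis of "no difference"
    p_pool = (success_a + success_b) / (n_a + n_b)
    # Standard error of the difference in proportions
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: 120 of 400 control employees rate as highly
# engaged, versus 150 of 400 under the relaxed dress code.
z = two_proportion_z(120, 400, 150, 400)
print(round(z, 2))  # a |z| above ~1.96 is significant at the 5% level
```

The point of the exercise is the randomization, not the arithmetic: only by assigning the policy change at random can you separate its effect from all the other differences between engaged and disengaged employees that the correlational data can't untangle.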

And Ron? He convinced the doctor to wait a month without blood thinners and take another x-ray to see if things had changed. A month later, the clot was gone.

Bob Nease is the former chief scientist of Express Scripts, and the author of The Power of Fifty Bits: The New Science of Turning Good Intentions into Positive Results (HarperCollins) as well as over 70 peer-reviewed papers.
