03.02.12

7 Ways To Lie With Statistics And Get Away With It

Misleading with statistics is nothing new. Statistics are used to sell products, elicit support for a candidate, or get us to ‘Like’ things. Be wary of these seven common tactics used to knead statistical data into “dough.”


What does excess email usage have to do with low IQ scores? Nothing at all, but that doesn’t mean someone
didn’t create a connection to make a point.  


I don’t know about you, but I am tired of the incorrect,
misleading, or just plain bogus statistics used to sell me a product, elicit
support for a candidate, or get me to ‘Like’ some new trend. I’m mad as hell and I’m not going to take it
anymore…and neither should you.

Misleading with statistics is called ‘statisticulation’ and it is nothing new. In 1954, former Better Homes and Gardens editor Darrell Huff wrote a small book called How To Lie with Statistics, which is the best-selling statistics book of the last 60 years, according to J. Michael Steele, a professor of statistics and operations and information management at Wharton.

What was true in 1954 is
just as true today. According to Huff, here
are seven common tactics used to knead statistical data into “dough.”

  • Biased sampling: This involves polling a non-representative group. For example, a survey that finds “41% of retail bank customers would use mobile banking if it were available” becomes meaningless when you find out the survey only polled people on their mobile devices.
  • Small sample sizes: Picking an adequate sample size is part science and part art, but a sweeping statement like “14% of companies plan to deploy cloud-based email this year” becomes suspect when the sample size is 24 companies. Another example of this kind of ‘statistics gone wild’ phenomenon was a “study” conducted by HP that found excessive email usage reduces a person’s IQ by 10 points.
  • Poorly-chosen averages: This often involves averaging values across non-uniform populations. For example, I recently saw an article that identified a neighborhood as one of the wealthiest in the city. The article went on to state that neighborhood residents had an average annual income of around $100,000. What the article failed to point out is that the neighborhood is in the process of gentrification; one part of the neighborhood is very wealthy, and the other part’s income levels are below the national average. Giving a single average value for two populations is incorrect and misleading; the median would be a better indication of the neighborhood’s income (see the mean-versus-median sketch after this list). Another classic example is the story about the man who drowned in a pool of water whose average depth was 1 inch.
  • Results falling within the standard error: No sampling or measuring technique is perfect; all inherently incorporate a degree of error. This means that a survey can only be as accurate as its standard error. Without getting technical, suffice it to say that the headline “E-books Preferred Over Paper By Men More Than By Women” sounds remarkable until you find out that the actual polling results were 52% of men preferring e-books versus 49% of women, with a survey error of +/-5% (a quick back-of-the-envelope check follows this list).
  • Using graphs to create an impression: Both of the charts below contain exactly the same information, but which one more accurately shows the increase in venture capital investment in mobile technologies between 2006 and 2007? The only difference between the graphs is the scale. Graphing data creatively leaves a lot of room for false impressions (the short plotting sketch after this list shows the trick). The same goes for pictograms and infographics.
  • “The semi-attached figure”: This means stating one thing as proof of something else. For example, if an ad says “15% of CEOs drive a Buick, more than any other brand,” what does that prove? The implication is that CEOs are some sort of authorities on cars. This is a common tactic. Remember the old Certs commercials, where the narrator says, “Certs. Now with Retsyn!” Did anyone even know what Retsyn was, or why we should care?
  • “Post-hoc fallacy”: This is incorrectly asserting that because two findings occur together, one must have caused the other. It is particularly nefarious and often harder to catch than the other tactics. For example, if a study finds that vegetarians have a higher average income than meat-eaters, it would be absurd to conclude that you can raise your income by abstaining from meat. But that is exactly what some ‘researchers’ do.
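
To make the averages point concrete, here is a minimal Python sketch of the gentrifying-neighborhood example. The income figures are made up purely for illustration; the point is how far apart the mean and the median land when one “population” is really two.

```python
# Made-up incomes (in dollars) for a gentrifying neighborhood: a few very
# wealthy households alongside a larger group earning well below the
# national average. The numbers are illustrative, not from the article.
from statistics import mean, median

incomes = [230_000, 250_000, 270_000,               # the gentrified blocks
           26_000, 28_000, 30_000, 32_000, 35_000]  # the rest of the neighborhood

print(f"mean:   ${mean(incomes):,.0f}")    # $112,625 -- "one of the wealthiest neighborhoods"
print(f"median: ${median(incomes):,.0f}")  # $33,500  -- a very different story
```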
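
Similarly, the e-book headline can be sanity-checked with a couple of lines of arithmetic. Assuming the published figures (52% versus 49%, error of +/-5 points), the two ranges overlap almost completely:

```python
# Rough check of the e-book survey: with a +/-5 point error, the ranges
# implied by the two results overlap, so the "gap" may just be noise.
men, women, margin = 0.52, 0.49, 0.05

men_range   = (men - margin, men + margin)       # roughly 47% to 57%
women_range = (women - margin, women + margin)   # roughly 44% to 54%

print(f"men:   {men_range[0]:.0%} to {men_range[1]:.0%}")
print(f"women: {women_range[0]:.0%} to {women_range[1]:.0%}")

if men_range[0] <= women_range[1] and women_range[0] <= men_range[1]:
    print("The ranges overlap -- the headline difference may be nothing at all.")
```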
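
And the scale trick is easy to reproduce. The sketch below plots the same two hypothetical investment figures twice with matplotlib; the only difference is where the y-axis starts.

```python
# Same data, two impressions: a y-axis starting at zero shows a modest rise,
# while a cropped y-axis makes the identical rise look explosive.
# The investment figures are hypothetical, not the article's.
import matplotlib.pyplot as plt

years, investment = [2006, 2007], [1.00, 1.15]   # billions of dollars, made up

fig, (honest, dramatic) = plt.subplots(1, 2, figsize=(8, 3))
for ax in (honest, dramatic):
    ax.plot(years, investment, marker="o")
    ax.set_xticks(years)

honest.set_ylim(0, 1.5)        # full scale: a ~15% increase looks modest
honest.set_title("y-axis from zero")
dramatic.set_ylim(0.98, 1.17)  # cropped scale: the same increase fills the chart
dramatic.set_title("cropped y-axis")

plt.tight_layout()
plt.show()
```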

Huff devotes an entire chapter to how to identify spotty statistics, which I will revisit in a future post. In the meantime, the best advice, as always, is to be skeptical. Caveat emptor!

I am collecting stories about bogus, misleading, or inaccurate marketing statistics; please send them to dlavenda1@hotmail.com or tweet me @dlavenda.

Author David
Lavenda is a high tech marketing and product strategy executive who also does
academic research on the effects of information overload on organizations. He
is an international scholar for the Society for the History of Technology.

[Image: Flickr user MervC]

