Telling Small Lies Leads Us Down A Slippery Slope To Big Whoppers

The strength of the signals in our brain’s emotional center decreases each time we lie, making it easier and easier to do.


Once you start lying, you get better and better at it, and you lie more and more. As with anything, says a new study from University College London, practice makes perfect.


When we lie, a little piece of us dies. Or rather, the strength of the signals in our brain’s amygdala (its emotional center) decreases each time we lie, allowing for “a gradual escalation of self-serving dishonesty,” according to the study. Not only that, but this decrease in sensitivity can be measured and used to predict the “magnitude of escalation of self-serving dishonesty on the next decision.”

And the most surprising part of all is that our brains come with a mechanism that supports this slippery slide into deceit.

To test their theory, based on anecdotal evidence of small deceits snowballing into huge deceptions, the researchers used an MRI machine to scan participants’ brains while they lied, over and over. Each participant was asked to tell a partner how much money was in a jar filled with pennies. The rules were varied so that lying about the amount was either beneficial or detrimental to the liar or to the partner. Crucially, the liar believed the partner had no idea a scam was being run and thought the two were working together for the best mutual outcome.

The results showed that, over time, the liars’ lies escalated, but only when the dishonesty was self-serving. When they lied solely to benefit the other person, no escalation occurred. In other words, say the researchers, we only get more dishonest over time when our lies benefit ourselves.

This behavior may carry over to other kinds of decision-making. For instance, the researchers speculate that we may become equally desensitized to risky or violent behavior.

One crucial difference between the real world and the experiment, though, is external feedback. In the experiments, there were no consequences for lying. In real life, we have laws and morals to keep us in check. And, given the ease with which unchecked dishonesty can escalate to catastrophic moral corruption, we might design our organizations to be tougher on these small transgressions in order to prevent future disaster.

About the author

Previously found writing at Cult of Mac and Straight No Filter.