Admit it: You hold a few contradictory beliefs—maybe more than a few. We all do. Many of them we aren’t even aware of, and the reason we aren’t aware of them has to do with the way our brains process, store, and retrieve knowledge. And in order to do that well, they turn us all into self-contradicting messes, at least some of the time. Here’s how, and how come.
There are lots of contradictions in people’s strongly held beliefs. Someone might preach self-sufficiency in politics but coddle their children. An individual might oppose abortion on the grounds that human life is sacred yet still support the death penalty for convicted murderers. A person might argue for freedom of individual expression in the arts but want hateful speech to be regulated.
There’s a pragmatic reason for these contradictory beliefs. A core principle that you hold and don’t want violated, one you don’t even like to consider violating, is called a “protected value.” Watching other people violate your protected values can provoke anger, even outrage. And when we contemplate violating our own protected values, we feel guilt and shame.
The thing is, once you have more than one protected value, those values are very likely to come into conflict at some point. People who oppose abortion and physician-assisted suicide, but who favor the death penalty for murderers and deadly military force for regimes perceived as threats to American lives and values, are experiencing this kind of conflict. They have two deeply held values—the sanctity of life and the prime importance of security—and different circumstances require making a choice between the two.
Such choices are rarely explicit, and most people aren’t aware of inconsistencies like these in their beliefs until they’re pointed out. To be fair, philosophers and ethicists have spent centuries untangling dilemmas like these, and many would argue (often compellingly) that clashing ideals—political or otherwise—are perfectly defensible, as are contingent approaches toward acting on them. And maybe so. But our brains don’t care about any of that.
In other words, if you learn some new fact that turns out to be inconsistent with something else you know, there are no automatic mechanisms in your brain that point out the inconsistency and force you to resolve it. Instead, you simply end up with two different beliefs that are not consistent.
Almost any statement you can make about human behavior is true only in certain circumstances. The trick to understanding behavior is to know the circumstances in which behaviors are going to happen.
The same thing is true with beliefs. When someone says, “I believe that human life is sacred,” or “I believe in individual freedom,” that statement includes an unstated disclaimer that goes something like “all else being equal.” But there are nearly always circumstances that lead to the violation of any broad belief or value statement.
It would be too much work for the brain to have to enumerate all of the exceptions to the rules you believe in, so it does something easier instead: It associates beliefs with specific situations and makes it easier to retrieve those beliefs in the situations with which they are associated.
Suppose you travel to a national park. There are signs all over the park warning people to beware of bears, so you learn that you shouldn’t go near them—you should be afraid of them. Later, you go to a zoo. There’s a bear there, too, but you needn’t fear it because there are moats and fences to protect you.
Theoretically, your brain could first learn a general rule to deal with this, like, “Be scared of bears,” and then learn all kinds of exceptions to that rule. Or it could simultaneously learn both the rule and the context in which it was learned—which is exactly what your brain does. That makes it easier to recall the information again, in the right context, in the future.
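The idea of storing a rule together with its learning context can be illustrated with a toy sketch. This is not a cognitive model, just an analogy: beliefs are filed under the situation in which they were acquired, so retrieval is keyed by context, and the contradiction between two beliefs never surfaces unless you deliberately compare across contexts. All names and strings here are invented for illustration.

```python
# Toy analogy (not a cognitive model): beliefs filed under the context
# in which they were learned, and retrieved by the current context.
beliefs = {}

def learn(context, belief):
    """Associate a belief with the situation where it was acquired."""
    beliefs[context] = belief

def recall(context):
    """Retrieve whichever belief is linked to the current situation."""
    return beliefs.get(context)

learn("national park", "be afraid of bears")
learn("zoo", "bears behind moats and fences are safe to watch")

# Each retrieval is consistent within its own context...
print(recall("national park"))  # be afraid of bears
print(recall("zoo"))            # bears behind moats and fences are safe to watch

# ...and the inconsistency only appears if you compare across contexts.
print(recall("national park") == recall("zoo"))  # False
```

Nothing in this lookup scheme ever checks the two stored beliefs against each other, which is the point of the analogy: consistency checking is a separate, optional step, not a by-product of storage.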
Because this system works pretty well, most of the time you don’t need to think about the fact that your beliefs may be contradictory as a result of being contextual. But calling to mind your contradictory beliefs leads you to notice that they aren’t consistent. (There seems to be an endless reservoir of people who delight in pointing out your inconsistencies to you, particularly on the internet.) In those situations, you have two options.
One is to follow the “it depends” strategy: You make a mental note that your beliefs aren’t really contradictory. Instead, one belief holds in one set of circumstances, and the opposite holds in other circumstances. This has the benefit of being cognitively true.
Sometimes, though, you resolve the conflict between beliefs by choosing one over the other. This strategy is the one we use in science. In a scientific study, there are often competing theories that attempt to explain some aspect of the world. When two theories conflict, researchers use data to decide which one to believe. Relying on the collection and analysis of data to determine whether theories are wrong is itself a protected value in science. The whole process forces conflicting ideas into stark juxtaposition in an effort to resolve conflicts.
Individuals, however, are less often forced into such quandaries day to day. One belief can happily coexist with other conflicting beliefs until someone or something highlights the contradiction. The resulting dissonance in some cases may lead to a careful reexamination of values, or it may lead to an expedient rationalization and a quick change of topic. All the same, we’re capable of effortlessly holding disparate beliefs, even when they’re directly challenged.
“Do I contradict myself?” Walt Whitman wrote. “Very well then, I contradict myself (I am large, I contain multitudes).” He was right.
This article is adapted from Brain Briefs: Answers to the Most (and Least) Pressing Questions About Your Mind by Art Markman and Bob Duke. It is reprinted with permission.