The phrase “alternative facts” has recently made the news in a political context, but psychiatrists like me are already intimately acquainted with the concept—indeed, we hear various forms of alternate reality expressed almost every day.
All of us need to distinguish perceived from actual reality every day, in nearly every aspect of our lives. So how can we sort out claims and beliefs that strike most people as odd, unfounded, fantastical, or just plain delusional?
First, we need to make a distinction often emphasized by ethicists and philosophers: that between a lie and a falsehood. Someone who deliberately misrepresents what he or she knows to be true is lying, typically to secure some personal advantage. In contrast, someone who voices a mistaken claim without any intent to deceive is not lying. That person may simply be unaware of the facts, or may refuse to believe the best available evidence. Rather than lying, he is stating a falsehood.
Some people who voice falsehoods appear incapable of distinguishing real from unreal, or truth from fiction, yet are sincerely convinced their worldview is absolutely correct. And this is our entrée into the psychiatric literature.
In clinical psychiatry, we see patients with a broad spectrum of ideas that many people would find eccentric, exaggerated, or blatantly at odds with reality. The clinician’s job is, first, to listen empathically and try to understand these beliefs from the patient’s point of view, carefully taking into account the person’s cultural, ethnic, and religious background.
Sometimes, clinicians can be wildly mistaken in their first impressions. A colleague of mine once described a severely agitated patient who was hospitalized because he insisted he was being stalked and harassed by the FBI. A few days into his hospitalization, FBI agents showed up on the unit to arrest the patient. As the old joke goes, just because you’re paranoid doesn’t mean they aren’t after you!
We can think of distortions of reality as falling along a continuum, ranging from mild to severe, based on how rigidly the belief is held and how impervious it is to factual information. On the milder end, we have what psychiatrists call overvalued ideas. These are very strongly held convictions that are at odds with what most people in the person’s culture believe, but which are not bizarre, incomprehensible, or patently impossible. A passionately held belief that vaccinations cause autism might qualify as an overvalued idea: It’s not scientifically correct, but it’s not utterly beyond the realm of possibility.
On the severe end of the continuum are delusions. These are strongly held, completely inflexible beliefs that are not altered at all by factual information, and which are clearly false or impossible. Importantly, delusions are not explained by the person’s culture, religious beliefs, or ethnicity. A patient who inflexibly believes that Vladimir Putin has personally implanted an electrode in his brain in order to control his thoughts would qualify as delusional. When the patient expresses this belief, he or she is not lying or trying to deceive the listener. It is a sincerely held belief, but still a falsehood.
Falsehoods of various kinds can be voiced by people with various neuropsychiatric disorders, but also by those who are perfectly “normal.” Within the range of normal falsehood are so-called false memories, which many of us experience quite often. For example, you are absolutely certain you sent that check to the power company, but in fact, you never did.
As social scientist Julia Shaw observes, false memories “have the same properties as any other memories, and are indistinguishable from memories of events that actually happened.” So when you insist to your spouse, “Of course I paid that electric bill!” you’re not lying; you are merely being deceived by your own brain.
A much more serious type of false memory involves a process called confabulation: the spontaneous production of false memories, often of a very detailed nature. Some confabulated memories are mundane; others, quite bizarre. For example, the person may insist, and sincerely believe, that he had eggs Benedict at the Ritz for breakfast, even though this clearly wasn’t the case. Or, the person may insist she was abducted by terrorists and present a fairly elaborate account of the (fictional) ordeal. Confabulation is usually seen in the context of severe brain damage, such as may follow a stroke or the rupture of a blood vessel in the brain.
Finally, there is the kind of falsification that many people would call pathological lying, and which goes by the extravagant scientific name of pseudologia fantastica (PF). Writing in the Psychiatric Annals, Drs. Rama Rao Gogeneni and Thomas Newmark list the following features of PF:
- A marked tendency to lie, often as a defensive attempt to avoid consequences. The person may experience a “high” from this imaginative story-telling.
- The lies are quite dazzling or fantastical, though they may contain truthful elements. Often, the lies may capture considerable public attention.
- The lies tend to present the person in a positive light, and may be an expression of an underlying character trait, such as pathological narcissism. However, the lies in PF usually go beyond the more “believable” stories of persons with narcissistic traits.
Although the precise cause or causes of PF are not known, some data suggest abnormalities in the white matter of the brain, the bundles of nerve fibers surrounded by an insulating sheath called myelin. On the other hand, the psychoanalyst Helene Deutsch argued that PF stems from psychological factors, such as the need to enhance one’s self-esteem, secure the admiration of others, or to portray oneself as either a hero or a victim.
Of course, all of this presumes something like a consensus on what constitutes “reality” and “facts” and that most people have an interest in establishing the truth. But this presumption is looking increasingly doubtful, in the midst of what has come to be known as the “post-truth era.” Charles Lewis, the founder of the Center for Public Integrity, described ours as a period in which “up is down and down is up and everything is in question and nothing is real.”
Even more worrisome, the general public seems to have an appetite for falsehood. As writer Adam Kirsch recently argued, “more and more, people seem to want to be lied to.” The lie, Kirsch argues, is seductive: “It allows the liar and his audience to cooperate in changing the nature of reality itself, in a way that can appear almost magical.”
And when this magical transformation of reality occurs, whether in a political or scientific context, it becomes very difficult to reverse. As the writer Jonathan Swift put it, “Falsehood flies, and the Truth comes limping after it.”
Psychiatrists are not in a position to comment on the mental health of public figures they have not personally evaluated or on the nature of falsehoods sometimes voiced by our political leaders. Indeed, the “Goldwater Rule” prohibits us from doing so. Nevertheless, psychiatrists are keenly aware of the all-too-human need to avoid or distort unpleasant truths. Many would likely nod in agreement with an observation often attributed to the psychoanalyst Carl Jung: “People cannot stand too much reality.”
Ronald W. Pies, editor-in-chief emeritus of “Psychiatric Times,” is a professor of psychiatry and lecturer on bioethics and humanities at SUNY Upstate Medical University, and a clinical professor of psychiatry at Tufts University School of Medicine. This story was originally published on The Conversation.