In an era of “alternative facts,” Rex Sorgatz’s The Encyclopedia of Misinformation helps put things in perspective. The author of the book, released earlier this year, recently spoke with Fast Company about why he wrote the guide to hundreds of scams, flim-flams, and downright lies that millions, at some point in history, believed to be true. Turns out, “fake news” is nothing new.
Long before a certain prolific tweeter needed daily fact-checking of his carnival barking, there was the ultimate American circus showman, P.T. Barnum, who was known in the mid- to late 1800s for public hoaxes that had people lining up to see a woman he claimed was the 161-year-old nursemaid of George Washington, and the Cardiff Giant (a fake 10-foot-tall petrified man said to be evidence that giants once roamed the planet).
Sorgatz writes about Barnum and much more, including the more modern-day Lonelygirl15, the video blog from a made-up teenage girl that ran on YouTube in 2006-2008. And he gets into other cultural touchstones that left people wondering what was real and what was not (remember The Blair Witch Project?). And who can forget the viral video of the pizza rat carrying a slice home? It was a trained rat in a stunt staged by performance artist Zardulu.
Sorgatz, who is also the founder of New York media consultancy Kinda Sorta Media, gave The Encyclopedia of Misinformation perhaps one of publishing’s longest subtitles: A Compendium of Imitations, Spoofs, Delusions, Simulations, Counterfeits, Impostors, Illusions, Confabulations, Skullduggery, Frauds, Pseudoscience, Propaganda, Hoaxes, Flimflam, Pranks, Hornswoggle, Conspiracies & Misc. Fakery. Phew!
Fast Company: What motivated you to write this book, and why now in particular?
Rex Sorgatz: Well, I started it a few years ago before there was any havoc occurring in our political system, but at the time, I was interested in general ideas about deception and misinformation through time and history. I was about halfway done when the election occurred, and all of a sudden, I realized I was writing something that was even more topical than when I started.
FC: Has the internet made misinformation that much more prevalent?
RS: I have mixed feelings about it. When you write a book about the history of misinformation and you look back at the 19th century or even earlier—there are entries on Greek philosophers and the invention of history—when you look back on those you start to go, maybe things aren’t so bad as they might seem in the current political moment.
So, I don’t think that the internet is necessarily directly tied to misinformation, but I think it introduced a new strain of thinking and a new set of problems.
FC: What sort of problems?
RS: Well, obviously the spread has gone populist. When I look back on misinformation in the past, it traces back to publishers and people spreading it by making up hoaxes or by having some malevolent intent, and the difference now is . . . that everyone can do it . . . It was once that the media existed to inform people, and now we’re in a situation where social media has made it such that people spread that disinformation faster.
FC: You mention the idea of the shifting Overton window—that the range of ideas within political acceptability has changed over time, and you say that social media has made it easier for ideas to enter that window. Have other technologies played a similar role in the past?
RS: One of my favorite entries in the book is on the mechanical Turk [the 18th-century device that was claimed to be a chess-playing machine but concealed a human player]. It wasn’t used to spread disinformation, but was itself a kind of wheeled-onto-the-floor act of it.
We can go back as far as the printing press. The history that’s often written about the printing press is that it opened this new opportunity for information to spread, and we tend to look at that optimistically. But what people often forget is that the invention of the printing press did not lead directly and immediately to the Enlightenment, which is somehow how we think of it—that suddenly books were in everyone’s hands . . .
In fact, what happened is it precipitated the Reformation, which was followed by the counter-Reformation and a great, epic period in which bad information was spread . . . There’s a great stat out there that by the time we were at peak publishing, Stalin had more books in print than Agatha Christie, and I think that just shows that these technologies can be used for good or for bad over time.
FC: You’ve mentioned current politics a few times, but you also just mentioned Stalin and you include earlier political leaders like [1930s Louisiana populist] Huey Long. What’s changed in current times?
RS: I would argue, not much. I wrote that “Huey Long” entry specifically with the tone that I hope it’s obvious that this rhymes with the present. It’s a good example where I don’t explicitly compare him to a political leader that may be in power now. I just outline the facts of his life and his rise to power and suggest that had he not been assassinated, he likely had a good chance of being president, and would have been our first populist, authoritarian president.
And I don’t draw any direct parallels to modernity there, but I think the insinuation is obvious.
FC: Based on your research, will people reading your book be less susceptible to misinformation?
RS: Of course, I hope so. I think that my project is more of one to think critically, but also to wrap your head around some of the better social science that’s going on out there, and to think about how some of our beliefs may be wrong, and that we have a set of tools to look at and investigate not only the media, but also historical thinking.
I would say the book attempts to do something that’s very playful with the subject matter, and so if you come away at the end of it with a new understanding or a new way of thinking about information, that’s great. It doesn’t do so with a specific intent, though—I’m not trying to convince anyone of anything. I’m just trying to build equipment for them to think about information.
FC: You mention decades of people fighting against disinformation— Harry Houdini, the Amazing Randi [both magicians who also became well known for debunking claims by psychics and others said to have supernatural powers], and the people behind Snopes.com [which debunks urban legends and viral misinformation]. Has that role changed with the internet and social media?
RS: I suppose so. The Amazing Randi was an interesting case—very specific to his time. One could say that Snopes is an update of the Amazing Randi, who was important in the ’70s because there was a different kind of disinformation campaign going on from, essentially, New Ageists . . . The myths of that time included spoon bending. [Randi spent decades challenging psychic and TV personality Uri Geller, who claims to bend spoons with his mind.] It was the paranormalists that we were worried about at that time . . .
It’s funny, though, I think we look back on that weird moment where people were falling for these paranormal gimmicks and schemes and it seems sort of silly now, but in some ways, it is the same architecture of thinking, and I suppose we need debunkers in this age dealing with new sources of disinformation as much as the Amazing Randi did in his time.
FC: Do you think we’ll see new types of disinformation in the coming years, thanks to new technology?
RS: It isn’t so much that the disinformation changes. It’s that the means of distribution changes. I guess I’m not overly optimistic at the moment about where stuff is headed.
I think the next election will bring a deep reckoning about whether we’ve cleaned up [disinformation on] social media. I suspect that these new mediums will still be used to suppress voters and create more doubt in our institutions.
Another topic that has come up since the book came out is “deep fakes.” I suspect we’re going to have a case very soon where someone is able to use this deep-fake technology, which allows you to effectively take the image of someone and insert it into a video—basically do a Photoshop of a video, transplanting one person on top of another.
The technology so far has been used mostly for pornography, but I bet we’ll see it utilized elsewhere in the future. And I think the interesting thing is it probably won’t be used as a method of actually spreading disinformation. I bet what happens is people will start saying, “That thing is actually faked” when it was not, because the idea of something being faked becomes more plausible thanks to the advanced technology.
My biggest fear isn’t so much that we spread more disinformation. It’s that information itself becomes more dubious—that we’re unable to tell the real from the artificial.