It was 1925, and the car was destroying America's youth. "The general effect of the automobile," wrote Princeton University Dean Howard McClenahan, "was to make the present generation look lightly at the moral code, and to decrease the value of the home." With a car, the youngsters could drive anywhere on Sunday. McClenahan didn't think they'd drive to church. And if they didn't, he argued, they'd become devilish and depraved.
This did not come to pass. Nor did phonographs create a "marked deterioration in American music," as composer John Philip Sousa feared in 1906. Nor did the telephone "break up home life and the old practice of visiting friends," as the Knights of Columbus warned in 1926. Nor did writing—a growing activity in the ancient world—"create forgetfulness in the learners' souls, because they will not use their memories," as Plato himself hypothesized. Some 2,400 years later, the Atlantic floated the same thesis about search engines. Its headline: "Is Google Making Us Stupid?" Proclamations like these should remind us that every technological revolution will spawn naysayers, who for the most part should be ignored.
Today's dystopians, like yesterday's, shout from the greatest megaphones of their time. Newsweek has asked, "Is the web driving us mad?" Sherry Turkle, a high-profile professor at MIT, writes books with titles like Alone Together and argues that we're losing our ability to converse and form relationships. This summer, an organization called "Experience People" even kicked off a 20-city tour to warn that tech is "enslaving" us. And in Jason Reitman's October movie, Men, Women & Children, an unrelenting parade of lonely people is consumed by screens, typing things like "just feeling so alone and empty." The doomsayers' tale has a power that makes Silicon Valley's little startup narratives seem quaint and self-centered. Their message is powerfully frightening: Technology isn't just changing how we do things; it's changing us. That's a hard narrative to counter.
Who cares if some people still worry that Facebook "friends" will doom real-world friendships? What's wrong with good old-fashioned hand-wringing? Plenty, actually. All this unwarranted mewling stalls progress that should unfold naturally from our connectivity.
For starters, says Robert D. Atkinson, president of a think tank called the Information Technology and Innovation Foundation, "It leads people in Washington, and policy makers, to change behavior." He recently went on a three-day policy retreat about the Internet of Things with senior congressional and Obama administration staffers. He was hoping they'd talk about global standards or government funding for experiments like Internet-connected parking meters. But they didn't, says Atkinson: "Virtually all the discussion was about risks. 'The Internet of Things is going to track us, monitor us, discriminate against us.' "
In the wake of revelations about the NSA and other organizations, it's understandable that people fret. But when those reasonable worries are replaced by visceral paranoia, naysayers can stop us from doing things that matter, like, say, digitizing our health care records. "The tech industry needs to do a lot more," Atkinson says. "They're thinking the technology will tell the story, and I don't think that's right. There's an embedded neo-Luddite class in the U.S. that has a stake in opposing technology."
The technophobes even hold back the children they claim to protect. The collision between technology and the classroom seems particularly threatening. InBloom, a Bill Gates-funded startup that would have made it much easier for schools to maintain data about their students and customize lessons accordingly, was recently shuttered because of parental privacy fears. "In presentations, people start arguing with me," says William Kist, former coordinator of the Adolescent and Young Adult Education Center at Kent State University, who advocates regularly for the increased use of technology in classrooms. "They say, 'I refuse to read anything off of a screen. I want my class to be completely paper-based.' " Psychologist John Grohol, founder of the popular resource site PsychCentral.com, worries that incorrect reports connecting Facebook and depression will lead some shrinks to prescribe a withdrawal from the social network, when in fact the problem is likely to be deeper, and the network itself might be the child's best support. That might sound far-fetched, but it's downright realism compared to the hobgoblins conjured by the worrywarts.
Researchers track the current wave of techno-fear to 2006. It was the year Blu-ray was introduced, the first tweet was sent, and elderly Senator Ted Stevens described the Internet as "a series of tubes." It was also when Duke University released a study showing that the size of people's personal networks had shrunk since the late 1980s. The authors guessed that technology was to blame; it was the largest intervening variable between the Reagan Era and today. The irony was appealing—an engine for togetherness has pulled us apart—and the media piled on.
But here's the thing: That survey never asked respondents anything about their Internet usage, instead just tallying people's friends and associates. When the Pew Research Center repeated the study in 2008 (and again in 2011), it filled in the blank. The new results showed an important distinction: Internet users reported all kinds of increased social well-being. Nonusers were the ones reporting decreased quality of life.
This updated study was largely overlooked, as were others finding that the Internet has a positive (or, at worst, neutral) effect on people, and that the most web-connected among us tend to lead happier, more engaging lives, are more trusting and more politically active, and take part in more culturally diverse networks. But the narrative of loss and loneliness has a pleasing, nostalgia-tinted warmth that comes from thinking that the era of handwritten letters and actual books was somehow more authentic and meaningful. Falsely romanticizing the past allows us to think that we stand at a threshold as some last vestige of better, tech-disconnected humans.
The truth is that as culture evolves our priorities change as well. "The mark of a learned person used to be, how much do you have in your head?" says Lee Rainie, director of Internet, science and technology research at the Pew Research Center. "But in the era where you can literally look up the answer in your smartphone, the capacity to do rapid pattern recognition is elevated. Does that make for a dumber or smarter society? Who knows? It makes for a different society."
Frequently, when writers want evidence of an insidious Internet effect, they call Gary Small. He's a neuroscientist at UCLA who once ran an experiment that became famous in dystopian circles. He watched 12 tech-savvy and 12 tech-ignorant people use search engines while inside an MRI machine. The result: There was more neural activity in the noggins of tech-savvy users; familiarity with the Internet, in other words, had done something to them. Newsweek has cited this to suggest that the Internet has an unnatural effect, and The Guardian described it as "altering your mind." Small himself wrote a book called iBrain, encouraging the speculation.
"You're the third journalist I've talked to today," he says when I call to learn more. He explains the results like this: "The brain is a very responsive organ. What you expose it to will alter its structure and its function." So, I ask, what would have happened if that car-fearing dean from Princeton had access to an MRI machine? Had he been able to watch the brains of drivers and nondrivers, would he have seen the same neural activity that Small saw in his Internet experiment?
"You'd see the same pattern, probably. Yeah," Small says.
So here's a narrative Silicon Valley might try to run with: Our brains changed to meet the challenge of driving cars. They changed so we could dance to recorded music. Now we are witnessing more change, and our brains will change again. Yes, change can be scary. But it's what we're built for.
A version of this article appeared in the November 2014 issue of Fast Company magazine.