Against the Smartphone: A Rebuttal to a Rebuttal
The smartphone is not just another new technology, any more than the atomic bomb was simply another weapon of war. The device we always have at arm’s length is a quantum change, an innovation that has altered not just how we encounter information but how we process it, how we live and how we think. And the data—perhaps preliminary but nevertheless striking—suggests those changes are profound, profoundly negative and probably irreversible.
I believe that’s true even though my friend and colleague Mitch Stephens doesn’t think so.
In a recent post on this site, Mitch argued that what he sees as a panic over our current obsession with smartphones and new screen-based technology is overwrought and ahistorical. He took particular aim at a previous post here, an excerpt from a column by the journalist James Marriott, who saw the smartphone as evidence of a “post-literate world … characterized by simplicity, ignorance and stagnation.”
A major part of Mitch’s demolition of Marriott’s view is his argument that new forms of communication have always caused defenders of the old ways to proclaim “the sky is falling.” Yet, Mitch pointed out, humankind has continued to progress. He then cited previous “technologies”—writing, the printing press, movies, television and so on—as examples of how the new was ever greeted with alarm, and turned out pretty much okay.
But just because earlier alarms over changes that ultimately proved beneficial turned out to be premature doesn’t mean the scenario will repeat itself. That would be an inductive fallacy: assuming that because something has happened a certain way, it will continue happening that way—particularly without considering how conditions may have changed. The past isn’t flawlessly predictive; past patterns can guide, but they don’t guarantee.
Unlike Mitch, I’m neither historian nor academic, but it seems clear to me that previous innovations in how we transmit knowledge have expanded and deepened our capabilities, wired our brains more effectively. They enhanced, in different ways, our understanding. Each successive wave literally improved our ability to think.
And this has been true in particular of reading. According to research, reading every day may lead to a longer life, slow cognitive decline, improve sleep and reduce stress. It is associated with better memory, improved verbal fluency and a longer attention span. It might even be the key to living longer: researchers at Yale found that reading books could reduce mortality by up to 20 percent—and the survival advantage was significantly higher for book readers than for readers of magazines and newspapers.
Reading fiction for at least 30 minutes a day, according to their study, could add an average of two years to readers’ lives. The effect held even after controlling for factors like sex, wealth, education and health problems.
Mitch thinks that reading print has its limitations, but it’s precisely those limitations that make it so useful and beneficial. Using your imagination while reading fiction, researchers found, may help keep the mind active, which translates into the kinds of health benefits conducive to a longer life.
And reading doesn’t just fill your brain with information; it actually changes the way your brain works, for the better. The effect can literally be seen in brain waves when you read. If a character in your book is playing tennis, the areas of your brain that would light up if you were physically out on the court yourself are activated. Deep reading, what happens when you curl up with a great book for an extended period of time, also builds up our ability to focus and to grasp complex ideas. Studies show that the less you do that kind of reading, the more these essential abilities wither.
A growing body of scientific literature shows reading also increases empathy by nudging us to take the perspective of characters very different from ourselves. Which may be why, as Marriott points out, reading “helped to destroy the orderly, hierarchical and profoundly socially unequal world. The reading revolution was a catastrophe for the ultra-privileged and exploitative aristocrats of the European aristocratic ancien regime.”
Is it a coincidence that the recent rise of the smartphone has mirrored the rise of anti-science, authoritarian movements throughout the Western world?
So, is the move from print to screens deepening our intellectual capabilities in the same way? Is it making us more empathetic, enhancing our brains? While Mitch accurately notes that it’s probably too early to fully judge the impact of the smartphone, which is still in its comparative infancy, the data already in is not at all encouraging.
As multiple researchers have found, the quick “hits” of news, entertainment and other information we get all day long, without context, from our smartphones are actively diminishing our prefrontal neural networks and impairing our ability to plan, organize and solve problems. In just a few decades, technology has managed to interrupt an evolutionary process millions of years in the making.
As humans developed language, for example, the brain developed areas that process written and spoken words. We developed the intricate brain networks that allow us to manipulate symbols, create hypothetical scenarios and coordinate events over time—because we have evolved to respond to complex stimuli. When we diminish the nature and fabric of these complex stimuli, as we are doing now, we weaken those networks.
Clinical neuropsychologist Amanda Sacks-Zimmerman points out that while evolution took millennia to change brains, technology is doing that now at warp speed. Instead of helping the brain develop, it is retarding the development of the prefrontal cortex and thus threatening our executive functions, which guide complex behavior.
The brain needs multi-faceted experiences, which reading provides but scrolling through your phone doesn’t. And so a growing body of literature suggests that much of today’s declining academic performance is the result of degraded executive function. Marriott, among many others, cites specific evidence that the change from print to screens has so far been significantly detrimental and has done permanent damage.
“The collapse of reading is driving declines in various measures of cognitive ability,” Marriott argues. “After the introduction of smartphones in the mid-2010s, global PISA scores—the most famous international measure of student ability—began to decline. Most intriguing—and alarming—is the case of IQ, which rose consistently throughout the 20th century but which now seems to have begun to fall.”
Does Mitch have an answer to this, other than suggesting—without evidence—that the measurements are measuring the wrong things or measuring them in the wrong way?
Sure, it takes a long time before new forms of media manage to demonstrate their real talents and, as Mitch says, “we have to stop comparing the accomplishments of a teenager like the iPhone with the achievements of even a septuagenarian like television, let alone a medium that has been around for half a millennium like print.” But he is already comparing them, by suggesting that reading and the novel are limited in what they can do and what they can show, and can be “longwinded and, for the most part, rather sluggish. Perhaps because it can’t easily show how things are, print likes to spell things out.”
And then he criticizes James Joyce for not spelling things out.
Print is obviously more than just “good at description.” It takes us inside the minds of others, captures interior lives and offers insight into motivations and emotions. As noted above, it nudges us to take the perspective of characters very different from ourselves, and in doing so builds empathy.
You don’t need descriptions of things you can just see, Mitch writes, but that’s exactly the point of great literature—it enables us to see things we haven’t been able to see. Maybe screens can get to the point, whatever it is, more quickly, but do they? Or do they obscure the point, or miss the point completely? Or not care about the point?
As an example of print’s limitations, Mitch cites Joyce’s sometimes intractable works. It is, of course, a straw-man argument, using one of the most obscurantist bodies of work in literary history, as opposed to the thousands upon thousands of more accessible works that have revealed human nature in ways no other medium can, or has. And even with Joyce—and I have made it all the way through “Ulysses”—emotion and insight are plentiful amid the density of the prose.
Maybe one can imagine a video sometime in the future that is ultimately more capable of representing human thought than print, or at least capable of expressing a different kind of intelligence than print. But right now, every indication is that it will be more reductive and will actually limit the brain’s ability to think and to imagine.
Yes, we don’t know how screens will evolve, and it would be foolish to hazard a guess as to what they might offer hundreds of years in the future. Mitch is a techno-optimist and, looking to the past, sees examples of how the medium may grow. I hope he’s right, but I doubt it.

