The Cost of Distraction: What Kurt Vonnegut Knew

“The year was 2081, and everybody was finally equal.”

So begins the late Kurt Vonnegut’s 1961 short story, “Harrison Bergeron.” In 2009, Chandler Tuttle released a 25-minute film version of the story titled 2081, and you can watch the trailer at the end of this post.

Vonnegut goes on to describe the conditions of this equality:

They weren’t only equal before God and the law. They were equal every which way. Nobody was smarter than anybody else. Nobody was better looking than anybody else. Nobody was stronger or quicker than anybody else. All this equality was due to the 211th, 212th, and 213th Amendments to the Constitution, and to the unceasing vigilance of agents of the United States Handicapper General.

This government-enforced equality was achieved by imposing prosthetic technologies on those who were above average; these prosthetics, however, were designed not to enhance, but to diminish. So, for example, ballerinas who might otherwise rise above their peers in grace, elegance, and beauty,

were burdened with sashweights and bags of birdshot, and their faces were masked, so that no one, seeing a free and graceful gesture or a pretty face, would feel like something the cat drug in.

Then there were those of above average intelligence like the title character’s father, George Bergeron.

[He] had a little mental handicap radio in his ear. He was required by law to wear it at all times. It was tuned to a government transmitter. Every twenty seconds or so, the transmitter would send out some sharp noise to keep people like George from taking unfair advantage of their brains.

Whenever George began to formulate a complex idea, which often involved questioning the status quo, a sharp, piercing noise would shoot into his ear, distracting him and derailing his train of thought. Sometimes the noise was like a siren going off; other times it was “like somebody hitting a milk bottle with a ball peen hammer.” Regular and incessant, the distraction overwhelmed and undermined natural intelligence.

George’s son, Harrison Bergeron, possessed gifts and abilities that rendered him an especially potent threat to the regime of equality. Because of this, he was taken away and locked up by the authorities when he was fourteen. Midway through the story, however, as George and his wife Hazel watch encumbered ballerinas dancing on television, a news bulletin interrupts the performance. A ballerina takes over from a stuttering announcer to read the bulletin.

“Harrison Bergeron, age fourteen,” she said in a grackle squawk, “has just escaped from jail, where he was held on suspicion of plotting to overthrow the government. He is a genius and an athlete, is under-handicapped, and should be regarded as extremely dangerous.”

Shortly thereafter, Harrison, whose debilitating prosthetics made him look “like a walking junkyard,” bursts into the building.  He effortlessly rips off the multiple “handicaps” that had been attached to his body in an unsuccessful effort to equalize his prodigious strength and ability.  He then proclaims himself emperor, declaring to the wonder-struck onlookers, “Now watch me become what I can become!”

Joined by a beautiful ballerina who comes forward to be his empress, they dance. They dance majestically and preternaturally, breaking not only “the laws of the land,” but “the law of gravity and the laws of motion as well.” And while they dance so high they kiss the ceiling,

Diana Moon Glampers, the Handicapper General, came into the studio with a double-barreled ten-gauge shotgun. She fired twice, and the Emperor and the Empress were dead before they hit the floor.

And just like that, equality is restored.

“Harrison Bergeron” is a nicely executed short story that can be read from a number of perspectives, yielding insights that can be variously applied to political, economic, or cultural circumstances. As I read it, the story shares a certain sensibility with both Orwell’s 1984 and Huxley’s Brave New World. It has been noted, by Huxley himself among others, that Brave New World pictures a more likely image of the future because it is not predicated on a heavy-handed totalitarianism. It is, rather, a freely embraced dystopia. I want to suggest that, in “Harrison Bergeron,” Vonnegut offered us an Orwellian adumbration of one particular dimension of our Internet-soaked world that has in fact emerged along a more Huxleyan trajectory.

Consider the manner in which the advantages of the intellectually gifted are equalized in Vonnegut’s story — distraction, regular and constant distraction.  The story provides a vivid and disturbing image of the consequences of perpetual distraction.

We’ve noted more than a few critics who have been pointing to the costly consequences of living with the perpetual distractions created by the very nature of the Internet and the ubiquity of portable tools that allow us to be always connected, always accessible. Recently a group of neuroscientists made news by taking a trip into the Utah wilderness to disconnect long enough to appreciate the mental costs of constant connectivity and the perpetual distraction that comes with it.

In the world of 2081 imagined by Vonnegut, the distracting technology is ruthlessly imposed by a government agency. We, however, have more or less happily assimilated ourselves to a way of life that provides us with regular and constant distraction. We have done so because we tend to see our tools as enhancements. They promise, and often provide, pleasure, comfort, efficiency, and productivity. What’s more, our distractions are not nearly so jarring as those that afflict the characters in “Harrison Bergeron”; in fact, our distractions can often be quite pleasant.

But might they also be inhibiting the development of our fullest potential? Are we trading away certain real and important pleasures and possibilities? Have we adopted technologies that, in their democratizing power, also engender mediocrity? Do our perpetual distractions constitute a serious impairment of our cognitive abilities? Can we learn to use our tools in a way that mitigates the costs?

These are just a few of the questions suggested by “Harrison Bergeron.”  Our future, at least in part, may hinge on the answers.

Five Neuroscientists Get On a Raft …

Click for video of the trip from the NY Times

No punchline. Five neuroscientists really did get on a raft.

They rafted the San Juan River in southern Utah during a week-long camping trip into the most threatening and inhospitable situation now imaginable: beyond the reach of wireless signals, they were without Internet and without cell phones (although a satellite phone was available for emergencies).

The trip was conceived and planned by David Strayer, professor of psychology at the University of Utah, with the goal of understanding “how heavy use of digital devices and other technologies” impacts the way we think and act.

Matt Richtel’s account of the journey, “Outdoors and Out of Reach, Studying the Brain,” is part of the NY Times‘ ongoing series, Your Brain On Computers.  That the members of the expedition disagreed from the outset about the impact of the digital world on the brain makes this an engaging read and suggests that the conversation on the trip was quite lively.

Along the way they debate the false sense of urgency engendered by always-on technology, the power of nature to refresh the brain’s ability to focus, the degree to which the brain can adapt to multitasking environments, the best methods and tools to measure digital technologies’ effects on the brain, and more.

As the days pass, the conversations become more fruitful.  Or, as Richtel put it, “as the river flows, so do the ideas.”  “There’s a real mental freedom in knowing no one or nothing can interrupt you,” according to one of the neuroscientists.  Another observes, “Time is slowing down.”

Strayer has coined the term “third-day syndrome” to describe the subtle and not so subtle shifts in attitude and behavior that begin to manifest themselves after someone has been “unplugged” for three days.  The experience leads one of the scientists to wonder, “If we can find out that people are walking around fatigued and not realizing their cognitive potential … What can we do to get us back to our full potential?”

Even without knowing exactly how the trip affected their brains, the scientists are prepared to recommend a little downtime as a path to uncluttered thinking.

If you can, take your downtime in a more natural environment.  According to a University of Michigan study discussed in the article, we seem to learn better after taking a walk in the woods as opposed to a walk down a city block.

For a much more prosaic discussion of the same issues see “The Internet: Is It Changing the Way We Think?” from The Guardian.  Of the five responses, Maryanne Wolf’s seemed to me most useful.  Here is an excerpt:

For me, the essential question has become: how well will we preserve the critical capacities of the present expert reading brain as we move to the digital reading brain of the next generation? Will the youngest members of our species develop their capacities for the deepest forms of thought while reading or will they become a culture of very different readers – with some children so inured to a surfeit of information that they have neither the time nor the motivation to go beyond superficial decoding? In our rapid transition into a digital culture, we need to figure out how to provide a full repertoire of cognitive skills that can be used across every medium by our children and, indeed, by ourselves.

Identity Reset

A few days ago we noted the conflicting opinions of Jeffrey Rosen and David Dylan Thomas on how much the Internet will or will not remember.  In a recent interview with The Wall Street Journal’s Holman Jenkins, Jr., Google CEO Eric Schmidt appears to think the Internet will in fact have a very long memory:

“I don’t believe society understands what happens when everything is available, knowable and recorded by everyone all the time,” he says.

A possible response?

He predicts, apparently seriously, that every young person one day will be entitled automatically to change his or her name on reaching adulthood in order to disown youthful hijinks stored on their friends’ social media sites.

Among the other things Schmidt said, also apparently seriously:

“I actually think most people don’t want Google to answer their questions,” he elaborates. “They want Google to tell them what they should be doing next.”

Super.

“You have to be somebody before you can share yourself”

Will the Internet make it impossible to make clean starts in life?  Will every word and every picture we have posted, however ill-advised, find a way to haunt us?  This is the fear legal scholar Jeffrey Rosen articulated in his NY Times piece, “The Web Means the End of Forgetting.” That is the same piece that led me to wonder several days ago if we might need a new economic statistic to track social-media-induced unemployment.

According to David Dylan Thomas at In Medias Res, the real problem might actually be the opposite of what Rosen feared.  In a post titled “The Myth of Online Inertia,” Thomas argues that “things disappear from the web” all the time.  The evolution of hardware and file formatting renders much of what is produced potentially inaccessible with the passage of time.  “As a web manager,” Thomas explains,

I’ve overseen the overhaul of many a content management system, and there’s always a compatibility issue which forces editors and technology teams to ask the same question. How much? How much will it cost (in time and money) to convert how much information? Do we really want to bother reformatting 400 news stories that were published in 2000 to a whole new format on the off chance that someone will search for them? The answer is almost always no. And that’s just 10 years.

My sense is that both Rosen and Thomas are on to something and that if they were to sit down together to discuss their positions, a synthesis preserving elements of both arguments would emerge.  Regardless of how powerful the Internet’s long term memory proves to be, however, its short term memory is quite good and potentially quite damaging.  Consequently, we are becoming increasingly self-conscious and cautious about what we post and where.

Some are concerned enough to implement tools such as Gmail’s Mail Goggles, which are designed to keep us from sending that stupid drunken email that ends up costing us our job or a relationship.  A great deal of time and money is also being spent to keep individuals not only from ruining their own reputations with a misguided tweet, but also from tarnishing the image of the institutions with which they are associated.  In a recent story about the effort colleges are putting forth to manage the social media activities of their student athletes, a consultant gave the very basic rule he tries to instill in student athletes: if you would have a problem with your mother reading or seeing it, don’t post it.

This is good advice as far as it goes, I suppose, although it would depend a great deal, wouldn’t it, on the sensibilities of each particular mom.  In any case, this all brought to mind a recent article in The New Republic by Jed Perl.  In “Alone, With Words,” Perl laments the loss of writing that begins as and remains a private act.

Writing, before it is anything else, is a way of clarifying one’s thoughts. This is obviously true of forms such as the diary, which are inherently solitary. But even those of us who write for publication can conclude, once we have clarified certain thoughts, that these thoughts are not especially valuable, or are not entirely convincing, or perhaps are simply not thoughts we want to share with others, at least not now … I believe that most writing worth reading is the product, at least to some degree, of this extraordinarily intimate confrontation between the disorderly impressions in the writer’s mind and the more or less orderly procession of words that the writer manages to produce on the page.

Most of what is made public in the arena of social media was never private in Perl’s sense, at least not for very long at all.  We are becoming used to the idea of providing a more or less real time feed of our thoughts and actions to the world.  The process of clarification and crafting that Perl describes has been replaced by the urge to publicize immediately.  Little wonder then that some of what we make public is damning and much of it is quite inane.

Citing Jaron Lanier, Alan Jacobs makes a point that we seem to have forgotten:

“You have to be somebody before you can share yourself.” And the process of becoming somebody takes time, effort, discipline, and study.

That process also tends to happen when we have preserved a certain private space for ourselves.  Social media and the Internet have given us an unparalleled ability to make our thoughts, our writing, our pictures, our very selves public.  Our task now may be to carve out and preserve a private space that will help render what we make public meaningful and worthwhile.  Or, at least, not potentially disastrous.