“Are you really there?” — How not to become spectators of our lives

If you try to keep up with the ongoing debate regarding the Internet and the way it is shaping our world and our minds, you will inevitably come across the work of Jaron Lanier.  When you do, stop and take note.  Lanier qualifies as an Internet pessimist in Adam Thierer’s breakdown of The Great Debate over Technology’s Impact on Society, but he is an insightful pessimist with a long history in the tech industry.  Unlike other, often insightful, critics such as the late Neil Postman and Nicholas Carr, Lanier speaks with an insider’s perspective.  We noted his most recent book, You Are Not a Gadget: A Manifesto, not long ago.

Earlier this week, I ran across a short piece Lanier contributed to The Chronicle of Higher Education in response to the question, “What will be the defining idea of the coming decade, and why?” Lanier’s response, cheerfully titled “The End of Human Specialness,” was one of a number of responses solicited by The Chronicle from leading scholars and illustrators.  In his piece, Lanier recalls addressing the “common practice of students blogging, networking, or tweeting while listening to a speaker” and telling his audience at the time,

The most important reason to stop multitasking so much isn’t to make me feel respected, but to make you exist. If you listen first, and write later, then whatever you write will have had time to filter through your brain, and you’ll be in what you say. This is what makes you exist. If you are only a reflector of information, are you really there?

We have all experienced it; we know exactly what Lanier is talking about.  We’ve seen it happen, we’ve had it happen to us, and — let’s be honest — we have probably also been the offending party.  Typically this topic elicits a rant against the incivility and lack of respect such actions communicate to those who are on the receiving end, and that is not unjustified.  What struck me about Lanier’s framing of the issue, however, was the emphasis on the person engaged in the habitual multitasking and not on the affront to the one whose presence is being ignored.

We are virtually dispersed people.  Our bodies are in one place, but our attention is in a dozen other places and, thus, nowhere at all.  This is not entirely new; there are antecedents.  Long before smart phones enabled a steady flow of distraction and allowed us to carry on multiple interactions simultaneously, we wandered away into the daydreams our imagination conjured up for us.  My sense, however, is that such retreats into our consciousness are a different sort of thing than our media-enabled evacuations of the place and moment we inhabit.  For one thing, they were not nearly so frequent and intrusive.  We might also argue that when we daydream our attention is in fact quite focused in one place, the place of our dream.  We are somewhere rather than nowhere.

Whatever we think of the antecedents, however, it is clear that many of us are finding it increasingly difficult to be fully present in our own experience.  Perhaps part of what is going on is captured by the old adage about the man with a hammer to whom everything looks like a nail.  My most vivid experience with this dynamic came years ago with my first digital camera.  To the person with a digital camera (and enough memory), I discovered, everything looks like a picture and you can’t help but take it.  I have wonderful pictures of Italy, but very few memories.  And so we may extrapolate:  to the person with a Twitter account, everything is a tweet waiting to be condensed into 140 characters.  To the person with a video recorder on their phone, everything is a moment to be documented.  To the person with an iPhone … well, pick the app.

In an article written by professor Barry Mauer, I recently learned about Andy Warhol’s obsessive documentation of his own experience through photographs, audiotape, videotape, and film.  In his biography of Warhol, Victor Bockris writes,

Indeed, Andy’s desire to record everything around him had become a mania.  As John Perrault, the art critic, wrote in a profile of Warhol in Vogue:  “His portable tape recorder, housed in a black briefcase, is his latest self-protection device.  The microphone is pointed at anyone who approaches, turning the situation into a theater work.  He records hours of tape every day but just files the reels away and never listens to them.”

Warhol’s behavior would, I suspect, seem less problematic today.  Here too he was perhaps simply ahead of his time.  Given much more efficient tools, we are also obsessively documenting our lives.  But what most people do tends to be viewed as normal.  It is interesting, though, that Perrault referred to Warhol’s tape recorder as a “self-protection device.”  It called to mind R. R. Reno’s analysis of the pose of ironic detachment so characteristic of our society:

We enjoy an irony that does not seek resolution because it supports our desire to be invulnerable observers rather than participants at risk. We are spectators of our lives, free from the strain of drama and the uncertainty of a story in which our souls are at stake.

“Spectators of our lives.”  The phrase is arresting, and the prospect is unsettling.  But it is hardly necessary or inevitable.  If the cost of re-engaging our own lives, of becoming participants at risk in the unfolding drama of our own story is a few less photos that we may end up deleting anyway, one less Facebook update from our phone, or one text left unread for a short while, then that is a price well worth paying.  We will be better for it, and those others, in whose presence we daily live, will be as well.

The Search

Kurt Vonnegut’s “Harrison Bergeron” presented us with a striking illustration of the potentially debilitating consequences of constant distraction.  In that story the distraction is brutally imposed; but, as we noted last week, we choose our distractions.  In fact, we embrace our Internet-empowered distractions.  We love to be distracted and we crave diversion.  We can hardly stand it if we are without distraction or diversion for more than a few moments at a time.  We complain incessantly about our busyness, but were it all to stop we would hardly know what to do with ourselves.  This raises some interesting questions.  Why are we so keen to envelop ourselves in constant distraction?  Why do some of us develop an addictive relationship to the constant flow of distraction?  Why are we so uneasy when the distractions stop?

Back in June, I reflected on the theme of distraction and diversion on the heels of a post about the religious aura that sometimes surrounds our love affair with sports.  We were then, you will remember, at the height of World Cup fever.  I want to revisit some of those same thoughts and tweak them just a little bit as a follow-up to Friday’s post on distraction and “Harrison Bergeron.”

Distractedness and the need for diversion are not new phenomena, of course.  Although the condition may now be intensified and heightened, it has been with us at least since the 17th century, and almost certainly before then.  It was in the 17th century that Blaise Pascal began assembling a series of notes on scraps of paper in preparation for a book he never wrote.  When he died at the age of 39 he left behind hundreds of barely organized notes which were later collected and published under the French title Pensées, or thoughts.  Pascal is today remembered, if at all, either for his law of fluid pressure or for an argument for God’s existence known as Pascal’s Wager.  Neither quite does justice to the depth of his insight into what we used to call the human condition.

Pascal knew that we needed our diversions and distractions and that without them we would be miserable.  His description of the younger generation sounds wholly contemporary:

Anyone who does not see the vanity of the world is very vain himself.  So who does not see it, apart from young people whose lives are all noise, diversions, and thoughts for the future?  But take away their diversion and you will see them bored to extinction.  Then they feel their nullity without recognizing it, for nothing could be more wretched than to be intolerably depressed as soon as one is reduced to introspection with no means of diversion.

But Pascal is not merely an old crank berating a younger generation he fails to understand.  Pascal applies the same analysis indiscriminately.  Young or old, rich or poor, male or female — for Pascal it just comes with being human.  “If our condition were truly happy,” he explains, “we should not need to divert ourselves from thinking about it.”  As things stand, however,

What people want is not the easy peaceful life that allows us to think of our unhappy condition, nor the dangers of war, nor the burdens of office, but the agitation that takes our mind off it and diverts us.  That is why we prefer the hunt to the capture.

We need distractions and diversions to keep us from contemplating our true condition, frail and mortal as it is.  For this reason we cannot stand to be alone with our own thoughts and seek to fill every moment with distraction.  Pascal’s view is admittedly rather grim even as it resonates with our experience.  Yet, Pascal knew there was more than this to the human condition.  There was also love and passion, knowledge and creativity, wonder and courage.  Pascal knew this and he insisted that we recognize both the glory and the misery of humanity:

Let man now judge his own worth, let him love himself, for there is within him a nature capable of good; but that is no reason for him to love the vileness within himself.  Let him despise himself because this capacity remains unfilled; but that is no reason for him to despise this natural capacity.  Let him both hate and love himself; he has within him the capacity for knowing truth and being happy, but he possesses no truth which is either abiding or satisfactory.

Pascal insists that we reckon with all that is good and all that is bad in us.  It is our awareness of the possibility of goodness, however, which heightens our misery.  And, yet again, it is our awareness of our misery that is part of our glory.  In the end Pascal believed that “God alone is man’s true good” and Christ the “via veritas.”  With St. Augustine, whose influence permeates Pascal’s thought, he would have prayed, “Our hearts are restless until they find their rest in Thee.”  Perhaps this is why at times something akin to spirituality and the language of worship suffuses our most prominent and powerful diversions.

Augustine and Pascal in turn both helped shape the thought of 20th-century novelist Walker Percy.  Percy blended Pascalian insight with a touch of existentialism in his best-known novel The Moviegoer (1960), in which the main character, Binx Bolling, finds himself on a search.  “What is the nature of the search? you ask.”

Really it is very simple, at least for a fellow like me; so simple that it is easily overlooked.  The search is what anyone would undertake if he were not sunk in the everydayness of his own life …. To become aware of the possibility of the search is to be onto something.  Not to be onto something is to be in despair.

Near the middle of the novel, throughout which Bolling has been amassing clues he thinks are somehow related to the search, he despairs:

… when I awake, I awake in the grip of everydayness.  Everydayness is the enemy.  No search is possible.  Perhaps there was a time when everydayness was not too strong and one could break its grip by brute strength.  Now nothing breaks it — but disaster.

However, through a rather tortured relationship with a very broken young woman named Kate whom he has come to love, Binx begins to see grace in the ordinary.  Near the very end of the novel, while he and Kate are sitting at a service station discussing marriage and the worries that still fill Kate’s mind, Binx notices a man coming out of a church.  It is Ash Wednesday.  Binx watches while the man sits in his car looking down at something on the seat beside him.  The man’s presence puzzles Binx:

It is impossible to say why he is here.  Is it part and parcel of the complex business of coming up in the world?  Or is it because he believes that God himself is present here at the corner of Elysian Fields and Bons Enfants?  Or is he here for both reasons:  through some dim dazzling trick of grace, coming for the one and receiving the other as God’s own importunate bonus?  It is impossible to say.

In June, with sports on my mind, I wondered whether, as Pascal would have it, sports were a mere distraction which facilitated our unwillingness to acknowledge our true condition; or, taking a cue from Percy, whether they might be a rupture of the “everydayness,” the ordinariness of our lives, that may awaken us to the possibility of the search.  My sense at the time was that both were on to something; each was a possibility.  Sports can be merely a distraction conducive to living in bad faith, in denial of the truth of our situation.  But at times bursts of grace and beauty appear suddenly and unexpectedly in the midst of our diversion to remind us that we ought to be searching for their source.  “Through some dim dazzling trick of grace, coming for the one” we receive “the other as God’s own importunate bonus.”

Thinking now about the distractions enabled by the Internet, social media, smart phones, and all the rest I wonder if something like the same analysis might also apply.  Do we embrace these distractions as a way of refusing silence and contemplation because we do not care to entertain the thoughts that may come?  Perhaps.  Surely more than this is going on.  Sometimes a moment of carefree distraction is just that.  Is it possible that coming for distraction we might find something more — a real connection with another human being, a new insight, real wisdom, genuine laughter?

I am not so much of a pessimist that I would discount such possibilities.  But I do fear that more often than not our distractions, as Pascal would put it, are diversions that keep us from considering our true condition.  They are part of the “everydayness” of life that is the enemy of the search and might even hide from us the possibility of the search.  To give up on the search, to be unaware of it, is to be in despair.  If it doesn’t feel like despair, is it because, as Kierkegaard put it in a line that opens The Moviegoer, “… the specific character of despair is precisely this:  it is unaware of being despair”?

Perhaps it is also because we are too distracted to notice.  We are the “diverted selves” Percy described in Lost in the Cosmos: The Last Self-Help Book,

In a free and affluent society, self is free to divert itself endlessly from itself.  It works in order to enjoy the diversions that the fruit of one’s labor can purchase.  The pursuit of happiness becomes the pursuit of diversion …

The Cost of Distraction: What Kurt Vonnegut Knew

“The year was 2081, and everybody was finally equal.”

So begins the late Kurt Vonnegut’s 1961 short story, “Harrison Bergeron.” In 2009, Chandler Tuttle released a 25-minute film version of the story titled 2081, and you can watch the trailer at the end of this post.

Vonnegut goes on to describe the conditions of this equality:

They weren’t only equal before God and the law. They were equal every which way. Nobody was smarter than anybody else. Nobody was better looking than anybody else. Nobody was stronger or quicker than anybody else. All this equality was due to the 211th, 212th, and 213th Amendments to the Constitution, and to the unceasing vigilance of agents of the United States Handicapper General.

This government enforced equality was achieved by imposing prosthetic technologies on those who were above average; these prosthetics, however, were designed not to enhance, but to diminish.  So, for example, ballerinas who might otherwise rise above their peers in grace, elegance and beauty,

were burdened with sashweights and bags of birdshot, and their faces were masked, so that no one, seeing a free and graceful gesture or a pretty face, would feel like something the cat drug in.

Then there were those of above average intelligence like the title character’s father, George Bergeron.

[He] had a little mental handicap radio in his ear. He was required by law to wear it at all times. It was tuned to a government transmitter. Every twenty seconds or so, the transmitter would send out some sharp noise to keep people like George from taking unfair advantage of their brains.

Whenever George began to formulate a complex idea, which often involved questioning the status quo, a sharp, piercing noise would shoot in his ear distracting him and derailing his train of thought.  Sometimes the noise was like a siren going off, other times “like somebody hitting a milk bottle with a ball peen hammer.”  Regular and incessant, the distraction overwhelmed and undermined natural intelligence.

George’s son, Harrison Bergeron, possessed gifts and abilities that rendered him an especially potent threat to the regime of equality.  Because of this he was taken away and locked up by the authorities when he was fourteen.  Midway through the story, however, as George and his wife Hazel watch encumbered ballerinas dancing on television, a news bulletin interrupts the performance.  A ballerina takes over from a stuttering announcer to read the bulletin.

“Harrison Bergeron, age fourteen,” she said in a grackle squawk, “has just escaped from jail, where he was held on suspicion of plotting to overthrow the government. He is a genius and an athlete, is under-handicapped, and should be regarded as extremely dangerous.”

Shortly thereafter, Harrison, whose debilitating prosthetics made him look “like a walking junkyard,” bursts into the building.  He effortlessly rips off the multiple “handicaps” that had been attached to his body in an unsuccessful effort to equalize his prodigious strength and ability.  He then proclaims himself emperor, declaring to the wonder-struck onlookers, “Now watch me become what I can become!”

A beautiful ballerina comes forward to be his empress, and they dance.  They dance majestically and preternaturally, breaking not only “the laws of the land,” but “the law of gravity and the laws of motion as well.”  And while they danced so high they kissed the ceiling,

Diana Moon Glampers, the Handicapper General, came into the studio with a double-barreled ten-gauge shotgun. She fired twice, and the Emperor and the Empress were dead before they hit the floor.

And just like that, equality is restored.

“Harrison Bergeron” is a nicely executed short story that can be read from a number of perspectives, yielding insights that can be variously applied to political, economic, or cultural circumstances.  As I read it, the story shares a certain sensibility with both Orwell’s 1984 and Huxley’s Brave New World.  It has been noted, by Huxley himself among others, that Brave New World pictures a more likely image of the future because it is not premised on a heavy-handed totalitarianism.  It is, rather, a freely embraced dystopia.  I want to suggest that, in “Harrison Bergeron,” Vonnegut offered us an Orwellian adumbration of one particular dimension of our Internet-soaked world that has in fact emerged along a more Huxleyan trajectory.

Consider the manner in which the advantages of the intellectually gifted are equalized in Vonnegut’s story — distraction, regular and constant distraction.  The story provides a vivid and disturbing image of the consequences of perpetual distraction.

We’ve noted more than a few critics who have been pointing to the costly consequences of living with the perpetual distractions created by the very nature of the Internet and the ubiquity of portable tools which allow us to be always connected, always accessible.   Recently a group of neuroscientists made news by taking a trip into the Utah wilderness to disconnect long enough to appreciate the mental costs of constant connectivity and the perpetual distraction that comes with it.

In the world of 2081 imagined by Vonnegut, the distracting technology is ruthlessly imposed by a government agency.  We, however, have more or less happily assimilated ourselves to a way of life that provides us with regular and constant distraction.  We have done so because we tend to see our tools as enhancements.  They promise, and often provide, pleasure, comfort, efficiency, and productivity.  What’s more, our distractions are not nearly so jarring as those that afflict the characters in “Harrison Bergeron”; in fact, our distractions can often be quite pleasant.

But might they also be inhibiting the development of our fullest potential?  Are we trading away certain real and important pleasures and possibilities?  Have we adopted technologies that, in their democratizing power, also engender mediocrity?  Do our perpetual distractions constitute a serious impairment of our cognitive abilities?  Can we learn to use our tools in a way that mitigates the costs?

These are just a few of the questions suggested by “Harrison Bergeron.”  Our future, at least in part, may hinge on the answers.

“You have to be somebody before you can share yourself”

Will the Internet make it impossible to make clean starts in life?  Will every word and every picture we have posted, however ill-advised, find a way to haunt us?  This is the fear legal scholar Jeffrey Rosen articulated in his NY Times piece, “The Web Means the End of Forgetting.”  That is the same piece that led me to wonder several days ago if we might need a new economic statistic to track social-media-induced unemployment.

According to David Dylan Thomas at In Medias Res, the real problem might actually be the opposite of what Rosen feared.  In a post titled “The Myth of Online Inertia,” Thomas argues that “things disappear from the web” all the time.  The evolution of hardware and file formats renders much of what is produced potentially inaccessible with the passage of time.  “As a web manager,” Thomas explains,

I’ve overseen the overhaul of many a content management system, and there’s always a compatibility issue which forces editors and technology teams to ask the same question. How much? How much will it cost (in time and money) to convert how much information? Do we really want to bother reformatting 400 news stories that were published in 2000 to a whole new format on the off chance that someone will search for them? The answer is almost always no. And that’s just 10 years.

My sense is that both Rosen and Thomas are on to something and that if they were to sit down together to discuss their positions, a synthesis preserving elements of both arguments would emerge.  Regardless of how powerful the Internet’s long term memory proves to be, however, its short term memory is quite good and potentially quite damaging.  Consequently, we are becoming increasingly self-conscious and cautious about what we post and where.

Some are concerned enough to implement tools such as Google’s Mail Goggles that are designed to keep us from sending that stupid drunken email that ends up costing us our job or a relationship.  A great deal of time and money is also being spent to keep individuals not only from ruining their own reputations with a misguided tweet, but also from tarnishing the image of the institutions with which they are associated.  In a recent story about the effort colleges are putting forth to manage the social media activities of their student athletes, a consultant gave the very basic rule he tries to instill in student athletes:  if you would have a problem with your mother reading or seeing it, don’t post it.

This is good advice as far as it goes, I suppose.  Although it would depend a great deal, wouldn’t it, on the sensibilities of each particular mom.  In any case, this all brought to mind a recent article in The New Republic by Jed Perl.  In “Alone, With Words,” Perl laments the loss of writing that begins as and remains a private act.

Writing, before it is anything else, is a way of clarifying one’s thoughts. This is obviously true of forms such as the diary, which are inherently solitary. But even those of us who write for publication can conclude, once we have clarified certain thoughts, that these thoughts are not especially valuable, or are not entirely convincing, or perhaps are simply not thoughts we want to share with others, at least not now … I believe that most writing worth reading is the product, at least to some degree, of this extraordinarily intimate confrontation between the disorderly impressions in the writer’s mind and the more or less orderly procession of words that the writer manages to produce on the page.

Most of what is made public in the arena of social media was never private in Perl’s sense, at least not for very long at all.  We are becoming used to the idea of providing a more or less real time feed of our thoughts and actions to the world.  The process of clarification and crafting that Perl describes has been replaced by the urge to publicize immediately.  Little wonder then that some of what we make public is damning and much of it is quite inane.

Citing Jaron Lanier, Alan Jacobs makes a point that we seem to have forgotten:

“You have to be somebody before you can share yourself.” And the process of becoming somebody takes time, effort, discipline, and study.

That process also tends to happen when we have preserved a certain private space for our selves.  Social media and the Internet have given us an unparalleled ability to make our thoughts, our writing, our pictures, our very selves public.  Our task now may be to carve out and preserve a private space that will help render what we make public meaningful and worthwhile.  Or, at least, not potentially disastrous.

Dot, Dot, Dot

Recently it struck me that I have been using ellipses (. . .) quite a bit in my informal writing.  Like most people I compose at least a few emails each day and, while by most standards I am an infrequent texter, I do send out a modest number of texts.  In both of these formats I’ve been dot, dot, dotting left and right.  And after sharing this observation with a few people and following up with a quick search on Google, it became clear that I wasn’t alone.  The ellipsis is the darling of new media.  So this led me to wonder if there was some significance in this grammatical development, or if it was just some combination of convenience and coincidence.  An intuition of sorts formed in my gut telling me there was something deeper going on.  So why has this particular mark of punctuation suddenly become so prominent?

Within the world of emails, texts, and chat rooms a certain grammar has evolved.  It’s not the King’s English, but it isn’t quite anarchy either.  Rules and established usages have emerged, and within this emerging grammar the ellipsis functions in certain defined ways.  For example, it can signal that the sender is still awaiting a reply after an unusually long break in a text exchange, as if to say “Still waiting . . .”  Or, it can signal that another text will follow to complete the present thought, “Hold on, more to come . . .”  And in some other cases still it may be used to express awkward silence . . .

As I kept thinking about my own use of the ellipsis, I realized I was also using the dots in some more subtle ways.  For example, Seinfeld came to mind.  This isn’t all that unusual; after all, the show about nothing sometimes appeared to be about everything.  It struck me that the dots sometimes function in much the same way as the phrase “yada, yada, yada” made famous on the sitcom, as a way of saying “etc., etc.” with a certain bored indifference.  At still other times I was using the ellipsis as a stream-of-consciousness device, stringing together thoughts that may not be formally or self-evidently related, but that nevertheless flow one from another in some weird associative way . . . in my mind.  And as that last line suggests, sometimes one may use the ellipsis, as my wife noted, as a way of getting the reader to read in a dramatic pause, often for comedic effect.

All of these ways of using the ellipsis, however, were not getting at my gut instinct.  These were all still fairly utilitarian uses of the mark, but I sensed that something more was going on.  I suspected the dots somehow signaled some shift in our way of thinking and expressing ourselves, that perhaps it was a symptom of our cultural condition surfacing through our writing.  Then it dawned on me.  I realized that at times I used the ellipsis to communicate a certain vagueness and ambiguity in what I was saying.  I used the dots to convey hesitancy and indeterminacy.  It was the mark of a thought that refused to assert itself.

Classic example:  On Facebook, where ellipses run wild, I might post a link on someone’s wall with the note, “Thought you might like this . . .”  If you were to put what the ellipsis communicates into words you would get,

Thought you might like this, maybe you will, maybe you won’t, I’m not too sure exactly, actually I don’t like it so much myself, well maybe a little, but don’t think me stupid if you don’t like it …

The ellipsis gives expression to a habit of ironic detachment and preemptive indifference.  And here is where I found the point of contact with larger cultural trends.  The mood of ironic detachment that has settled over so many of us was manifesting itself in three simple dots.  With those dots we were evading conviction, giving off an apathetic vibe, and guarding ourselves from seeming unfashionably earnest.

Thinking about the ellipsis brought to mind a performance by Taylor Mali, “Speak With Conviction.” It’s meant to be heard so watch the video below, but here is the part that comes to mind:

Declarative sentences … so called, because, they used to you know … declare things to be true … ok … as opposed to other things that are like totally … you know … not … They’ve been infected by this tragically cool and totally hip interrogative tone … as if I’m saying, “Don’t think I’m a nerd just ‘cuz I’ve like noticed this okay … I have nothing personally invested in my own opinions … I’m just like inviting you to join me on the bandwagon of my own uncertainty …”

In writing, the ellipsis captures nicely the tone that Mali identifies and lampoons in his performance.  These three dots are the punctuation mark of an indeterminate age.  We are becoming Eliot’s hollow men, and this is the way each thought ends,

Not with a bang but a whimper.

I think . . .


If you’ve appreciated what you’ve read, consider supporting the writer.