The Language(s) of Digital Media Platforms

What follows is a thought experiment.  Comments/criticisms welcome.

In an influential 2001 book, The Language of New Media, theorist Lev Manovich presented his “attempt at both a record and a theory of the present” with regard to digital media.  He explains that his “aim” is to “describe and understand the logic driving the development of the language of new media.”  But he is quick to add,

I am not claiming that there is a single language of new media.  I use “language” as an umbrella term to refer to a number of various conventions used by designers of new media objects to organize data and structure the user’s experience.

The final product is an engaging and provocative study.  For the moment, however, I want to reflect on the notion of a “language” of digital media — it’s a suggestive metaphor.  Early in the book, Manovich explains his rationale for the term,

I do not want to suggest that we need to return to the structuralist phase of semiotics in understanding new media.  However, given that most studies of new media and cyberculture focus on their sociological, economic, and political dimensions, it was important for me to use the word language to signal the different focus of this work:  the emergent conventions, recurrent design patterns, and key forms of new media.

Manovich states explicitly that he is not claiming that there is a single, monolithic language of new media.  At a recent conference, media anthropologist John Postill made a similar point.  We do not have, he suggested,

a totalising epochal ‘logic’ but rather ever more differentiated Internet ‘technologies, practices, contexts’ ([Miller and Slater] 2000: 3). The evidence provided in the reviewed texts strongly suggests that the Internet – and indeed the world – is becoming ever more plural and that no universal ‘logic of practice’ … is gaining ascendancy at the expense of all other logics.

I take his “logics” to be roughly parallel to Manovich’s “language,” although Postill is focusing on the practices that emerge from digital media, less so on the internal logic of a given platform.  The two, however, are surely interrelated.  So while we do not have a single language of digital media, we may still speak of languages or logics of particular platforms or interfaces.  Now, in an associative leap, I want to connect this with the recent conversations surrounding Guy Deutscher’s Through the Language Glass: Why the World Looks Different in Other Languages.  Judging from reviews and interviews (I have not yet read the book), Deutscher has written a fascinating study.  More specifically, though, it is his defense of linguist Roman Jakobson’s maxim concerning the difference languages make that I want to think with here.  According to Jakobson, “Languages differ essentially in what they must convey and not in what they may convey.”  In other words, languages do not necessarily constrain a native speaker’s ability to think or comprehend certain concepts, but languages do force their speakers to make certain things explicit.  In Deutscher’s words,

Languages differ in what types of information they force the speakers to mention when they describe the world. (For example, some languages require you to be more specific about gender than English does, while English requires you to be more specific about tense than some other languages. Some require you to be more specific about color differences, and so on.) And it turns out that if your language routinely obliges you to express certain information whenever you open your mouth, it forces you to pay attention to certain types of information and to certain aspects of experience that speakers of other languages may not need to be so attentive to. These habits of speech can then create habits of mind that go beyond mere speech, and affect things like memory, attention, association, even practical skills like orientation.

Now what if we press the language of digital media platforms/interfaces metaphor and ask if the Jakobson principle holds?  My initial thought is that something like the inverse of Jakobson’s principle ends up being more useful.  I could be wrong here (this is just an initial reflection), but what seems most interesting about a particular platform is its specific limitations and how the user is constrained to work (often imaginatively) within those constraints.  Consider as an example Twitter’s 140-character limit or the limited symbols available for text messages.  Facebook allows greater flexibility and more media options for communication, but it is still limited.  Second Life has its own logic or language with its own particular possibilities and limitations.  And so on.

These limits are, of course, inevitable.  Every medium has its limits; nothing new there.  Yet it is worth asking what these limits are, because there is always an implicit risk in becoming habituated to communication within a given medium and internalizing these limitations.  Both Manovich and Deutscher allude to this possibility.  In the excerpt above, Deutscher suggests that “These habits of speech can then create habits of mind that go beyond mere speech, and affect things like memory, attention, association, even practical skills like orientation.”  For his part, Manovich, considering the way the “language” of new media objectifies the mind’s operations, concludes,

. . . we are asked to follow pre-programmed, objectively existing associations.  Put differently, in what can be read as an updated version of French philosopher Louis Althusser’s concept of “interpellation,” we are asked to mistake the structure of somebody else’s mind for our own . . . . The cultural technologies of an industrial society — cinema and fashion — asked us to identify with someone else’s bodily image.  Interactive media ask us to identify with someone else’s mental structure.  If the cinema viewer, male and female, lusted after and tried to emulate the body of the movie star, the computer user is asked to follow the mental trajectory of the new media designer.

So to sum up:

Digital media platforms exhibit something like a particular language or logic.

Borrowing and tweaking Jakobson’s maxim, “Languages of digital media platforms differ essentially in what they cannot (or, encourage us not to) convey and not in what they may convey.”

For consideration:  What assumptions and limitations are internalized by the habitual use of particular digital media platforms?  What communicative structures could we be internalizing and what are their limitations?  Do we then import these limitations into other areas of our thinking and communication in the world?

Comments welcome.

“The storm is what we call progress”

Via Alan Jacobs at Text Patterns, I read the following excerpt from Arikia Millikan’s short piece “I Am a Cyborg and I Want My Google Implant Already” on The Atlantic’s web site:

By the time I finished elementary school, writing letters to communicate across great distances was an archaic practice. When I graduated middle school, pirating music on Napster was the norm; to purchase was a fool’s errand. At the beginning of high school, it still may have been standard practice to manually look up the answer to a burning question (or simply be content without knowing the answer). Internet connection speeds and search algorithms improved steadily over the next four years such that when I graduated in the class of 2004, having to wait longer than a minute to retrieve an answer was an unbearable annoyance and only happened on road trips or nature walks. The summer before my freshman year of college was the year the Facebook was released to a select 15 universities, and almost every single relationship formed in the subsequent four years was prefaced by a flood of intimate personal information.

Now, I am always connected to the Web. The rare exceptions to the rule cause excruciating anxiety. I work online. I play online. I have sex online. I sleep with my smartphone at the foot of my bed and wake up every few hours to check my email in my sleep (something I like to call dreamailing).

But it’s not enough connectivity. I crave an existence where batteries never die, wireless connections never fail, and the time between asking a question and having the answer is approximately zero. If I could be jacked in at every waking hour of the day, I would, and I think a lot of my peers would do the same. So Hal, please hurry up with that Google implant. We’re getting antsy.

Well, hard to beat honesty, I suppose.  I did find it slightly ironic that the Google executive interviewed for this piece was named Hal.

Jacobs aptly titled his post “The saddest thing I have read in some time,” and he added simply, “There’s a name for this condition: Stockholm Syndrome.”  Well put, of course.

Perhaps it was reading that piece that prepared me to read Walter Benjamin’s IX Thesis on the Philosophy of History later on that day with a certain melancholy resonance:

A Klee painting named “Angelus Novus” shows an angel looking as though he is about to move away from something he is fixedly contemplating.  His eyes are staring, his mouth is open, his wings are spread.  This is how one pictures the angel of history.  His face is turned toward the past.  Where we perceive a chain of events, he sees one single catastrophe  which keeps piling wreckage upon wreckage and hurls it in front of his feet.  The angel would like to stay, awaken the dead, and make whole what has been smashed.  But a storm is blowing from Paradise; it has got caught in his wings with such violence that the angel can no longer close them.  This storm irresistibly propels him into the future to which his back is turned, while the pile of debris before him grows skyward.  The storm is what we call progress.

In any case, I tend to agree with Jacobs — it was rather sad.

When Words and Action Part Company

I’ve not been one to jump on the Malcolm Gladwell bandwagon; I can’t quite get past the disconcerting hair.  That said, his recent piece in The New Yorker, “Small Change:  Why the revolution will not be tweeted,” makes a compelling case for the limits of social media when it comes to generating social action.

Gladwell frames his piece as a study in contrasts.  He begins by recounting the evolution of the 1960 sit-in movement that began when four freshmen from North Carolina A & T sat down and ordered coffee at the lunch counter of the local Woolworth’s and refused to move when the waitress insisted, “We don’t serve Negroes here.”  Within days the protest grew and spread across state lines and tensions mounted.

Some seventy thousand students eventually took part. Thousands were arrested and untold thousands more radicalized. These events in the early sixties became a civil-rights war that engulfed the South for the rest of the decade—and it happened without e-mail, texting, Facebook, or Twitter.

Almost reflexively now, the devotees of social media power will trot out the Twitter-enabled 2009 Iranian protests as an example of what social media can do.  Gladwell, anticipating as much, quotes Mark Pfeifle, a former national-security adviser, who believes that “Without Twitter the people of Iran would not have felt empowered and confident to stand up for freedom and democracy.”  Pfeifle went so far as to call for Twitter’s nomination for the Nobel Peace Prize.  That is a bit of a stretch, one is inclined to believe, and Gladwell explains why:

In the Iranian case … the people tweeting about the demonstrations were almost all in the West. “It is time to get Twitter’s role in the events in Iran right,” Golnaz Esfandiari wrote, this past summer, in Foreign Policy. “Simply put: There was no Twitter Revolution inside Iran.” The cadre of prominent bloggers, like Andrew Sullivan, who championed the role of social media in Iran, Esfandiari continued, misunderstood the situation. “Western journalists who couldn’t reach—or didn’t bother reaching?—people on the ground in Iran simply scrolled through the English-language tweets posted with tag #iranelection,” she wrote. “Through it all, no one seemed to wonder why people trying to coordinate protests in Iran would be writing in any language other than Farsi.”

You can read Esfandiari’s Foreign Policy article, “Misreading Tehran:  The Twitter Devolution,” online.  Gladwell argues that social media are unable to promote significant and lasting social change because they foster weak rather than strong-tie relationships.  Promoting and achieving social change very often means coming up against entrenched cultural norms and standards that will not easily give way.  And as we know from the civil rights movement, the resistance is often violent.  As Gladwell reminds us,

. . . Within days of arriving in Mississippi, three [Freedom Summer Project] volunteers—Michael Schwerner, James Chaney, and Andrew Goodman—were kidnapped and killed, and, during the rest of the summer, thirty-seven black churches were set on fire and dozens of safe houses were bombed; volunteers were beaten, shot at, arrested, and trailed by pickup trucks full of armed men. A quarter of those in the program dropped out. Activism that challenges the status quo—that attacks deeply rooted problems—is not for the faint of heart.

A subsequent study of the Freedom Summer participants was conducted by Doug McAdam:

“All  of the applicants—participants and withdrawals alike—emerge as highly committed, articulate supporters of the goals and values of the summer program,” he concluded. What mattered more was an applicant’s degree of personal connection to the civil-rights movement . . . . [P]articipants were far more likely than dropouts to have close friends who were also going to Mississippi. High-risk activism, McAdam concluded, is a “strong-tie” phenomenon.

Gladwell also goes on to explain why hierarchy, another feature typically absent from social media activism, is indispensable to successful movements while taking some shots along the way at Clay Shirky’s much more optimistic view of social media outlined in Here Comes Everybody: The Power of Organizing Without Organizations.

Not surprisingly, Gladwell’s piece has been making the rounds online the past few days.  In response to Gladwell, Jonah Lehrer posted “Weak Ties, Twitter and the Revolution” on his blog The Frontal Cortex.  Lehrer begins by granting, “These are all worthwhile and important points, and a necessary correction to the (over)hyping of Twitter and Facebook.”  But he believes Gladwell has erred in the other direction.  Basing his comments on Mark Granovetter’s 1973 paper, “The Strength of Weak Ties,” Lehrer concludes:

. . . I would quibble with Gladwell’s wholesale rejection of weak ties as a means of building a social movement. (I have some issues with Shirky, too.) It turns out that such distant relationships aren’t just useful for getting jobs or spreading trends or sharing information. According to Granovetter, they might also help us fight back against the Man, or at least the redevelopment agency.

Read the whole post to get the full argument and definitely read Lehrer’s excellent review of Shirky’s book linked in the quotation above.  Essentially Lehrer is offering a kind of middle ground between Shirky and Gladwell.  Since I tend toward mediating positions myself, I think he makes a valid point; but I do lean toward Gladwell’s end of the spectrum nonetheless.

Here, however, is one more angle on the issue:  perhaps the factors working against the potential of social media are not only inherent in the form itself, but also a condition of society that predates the arrival of digital media by generations.  In The Human Condition, Hannah Arendt argued that power, the kind of power to transform society that Gladwell has in view,

. . . is actualized only where word and deed have not parted company, where words are  not empty and deeds not brutal, where words are not used to veil intentions but to disclose realities, and deeds are not used to violate and destroy but to establish relations and create new realities.

Arendt made that claim in the late 1950s, and she argued that even then words and deeds had been drifting apart for some time.  I suspect that since then the chasm has yawned ever wider and that social media participate in and reinforce that disjunction.  It would be unfair, however, to single out social media, since the problem extends to most forms of public discourse, of which social media are but one example.

In The Disenchantment of Secular Discourse, Steven D. Smith argues that

It is hardly an exaggeration to say that the very point of ‘public reason’ is to keep the public discourse shallow – to keep it from drowning in the perilous depths of questions about ‘the nature of the universe,’ or ‘the end and object of life,’ or other tenets of our comprehensive doctrines.

If Smith is right — you can read Stanley Fish’s review in the NY Times to get more of a feel for his argument — social media already operate within a context in which the habits of public discourse have undermined our ability to take words seriously.  To put it another way, the assumptions shaping our public discourse encourage the divorce of words and deeds by stripping our language of its appeal to the deeper moral and metaphysical resources necessary to compel social action.  We tend to get stuck in the analysis and pseudo-debate without ever getting to action. As Fish puts it:

While secular discourse, in the form of statistical analyses, controlled experiments and rational decision-trees, can yield banks of data that can then be subdivided and refined in more ways than we can count, it cannot tell us what that data means or what to do with it . . . . Once the world is no longer assumed to be informed by some presiding meaning or spirit (associated either with a theology or an undoubted philosophical first principle) . . . there is no way, says Smith, to look at it and answer normative questions, questions like “what are we supposed to do?” and “at the behest of who or what are we to do it?”

Combine this with Kierkegaard’s 19th-century observations about the press, which now appear all the more applicable to the digital world.  Consider the following summary of Kierkegaard’s fears offered by Hubert Dreyfus in his little book On the Internet:

. . . the new massive distribution of desituated information was making every sort of information immediately available to anyone, thereby producing a desituated, detached spectator.  Thus, the new power of the press to disseminate information to everyone in a nation led its readers to transcend their local, personal involvement . . . . Kierkegaard saw that the public sphere was destined to become a detached world in which everyone had an opinion about and commented on all public matters without needing any first-hand experience and without having or wanting any responsibility.

Kierkegaard suggested the following motto for the press:

Here men are demoralized in the shortest possible time on the largest possible scale, at the cheapest possible price.

I’ll let you decide whether or not that motto may be applied even more aptly to existing media conditions.  In any case, the situation Kierkegaard believed was created by the daily print press in his own day is at least a more likely possibility today.  A globally connected communications environment geared toward creating a constant, instantaneous, and indiscriminate flow of information, together with the assumptions of public discourse described by Smith, numbs us into docile indifference — an indifference social media may be powerless to overthrow, particularly when the stakes are high.  We are offered instead the illusion of action and involvement, the sense of participation in the debate.  But there is no meaningful debate, and by next week the issue, whatever the issue is, will still be there, and we’ll be busy discussing the next thing.  Meanwhile action walks further down a lonely path, long since parted from words.

“Are you really there?” — How not to become spectators of our lives

If you try to keep up with the ongoing debate regarding the Internet and the way it is shaping our world and our minds, you will inevitably come across the work of Jaron Lanier.  When you do, stop and take note.  Lanier qualifies as an Internet pessimist in Adam Thierer’s breakdown of The Great Debate over Technology’s Impact on Society, but he is an insightful pessimist with a long history in the tech industry.  Unlike other, often insightful, critics such as the late Neil Postman and Nicholas Carr, Lanier speaks with an insider’s perspective.  We noted his most recent book, You Are Not a Gadget: A Manifesto, not long ago.

Earlier this week, I ran across a short piece Lanier contributed to The Chronicle of Higher Education in response to the question, “What will be the defining idea of the coming decade, and why?” Lanier’s response, cheerfully titled “The End of Human Specialness,” was one of a number of responses solicited by The Chronicle from leading scholars and illustrators.  In his piece, Lanier recalls addressing the “common practice of students blogging, networking, or tweeting while listening to a speaker” and telling his audience at the time,

The most important reason to stop multitasking so much isn’t to make me feel respected, but to make you exist. If you listen first, and write later, then whatever you write will have had time to filter through your brain, and you’ll be in what you say. This is what makes you exist. If you are only a reflector of information, are you really there?

We have all experienced it; we know exactly what Lanier is talking about.  We’ve seen it happen, we’ve had it happen to us, and — let’s be honest — we have probably also been the offending party.  Typically this topic elicits a rant against the incivility and lack of respect such actions communicate to those on the receiving end, and that is not unjustified.  What struck me about Lanier’s framing of the issue, however, was the emphasis on the person engaged in the habitual multitasking rather than on the affront to the one whose presence is being ignored.

We are virtually dispersed people.  Our bodies are in one place, but our attention is in a dozen other places and, thus, nowhere at all.  This is not entirely new; there are antecedents.  Long before smart phones enabled a steady flow of distraction and allowed us to carry on multiple interactions simultaneously, we wandered away into the daydreams our imagination conjured up for us.  My sense, however, is that such retreats into our consciousness are a different sort of thing than our media-enabled evacuations of the place and moment we inhabit.  For one thing, they were not nearly so frequent and intrusive.  We might also argue that when we daydream our attention is in fact quite focused in one place, the place of our dream.  We are somewhere rather than nowhere.

Whatever we think of the antecedents, however, it is clear that many of us are finding it increasingly difficult to be fully present in our own experience.  Perhaps part of what is going on is captured by the old adage about the man with a hammer to whom everything looks like a nail.  My most vivid experience with this dynamic came years ago with my first digital camera.  To the person with a digital camera (and enough memory), I discovered, everything looks like a picture and you can’t help but take it.  I have wonderful pictures of Italy, but very few memories.  And so we may extrapolate:  to the person with a Twitter account, everything is a tweet waiting to be condensed into 140 characters.  To the person with a video recorder on their phone, everything is a moment to be documented.  To the person with an iPhone … well, pick the app.

In an article written by professor Barry Mauer, I recently learned about Andy Warhol’s obsessive documentation of his own experience through photographs, audiotape, videotape, and film.  In his biography of Warhol, Victor Bockris writes,

Indeed, Andy’s desire to record everything around him had become a mania.  As John Perrault, the art critic, wrote in a profile of Warhol in Vogue:  “His portable tape recorder, housed in a black briefcase, is his latest self-protection device.  The microphone is pointed at anyone who approaches, turning the situation into a theater work.  He records hours of tape every day but just files the reels away and never listens to them.”

Warhol’s behavior would, I suspect, seem less problematic today.  Here too he was perhaps simply ahead of his time.  Given much more efficient tools, we are also obsessively documenting our lives.  But what most people do tends to be viewed as normal.  It is interesting, though, that Perrault referred to Warhol’s tape recorder as a “self-protection device.”  It called to mind R. R. Reno’s analysis of the pose of ironic detachment so characteristic of our society:

We enjoy an irony that does not seek resolution because it supports our desire to be invulnerable observers rather than participants at risk. We are spectators of our lives, free from the strain of drama and the uncertainty of a story in which our souls are at stake.

“Spectators of our lives.”  The phrase is arresting, and the prospect is unsettling.  But it is hardly necessary or inevitable.  If the cost of re-engaging our own lives, of becoming participants at risk in the unfolding drama of our own story is a few less photos that we may end up deleting anyway, one less Facebook update from our phone, or one text left unread for a short while, then that is a price well worth paying.  We will be better for it, and those others, in whose presence we daily live, will be as well.

The Search

Kurt Vonnegut’s “Harrison Bergeron” presented us with a striking illustration of the potentially debilitating consequences of constant distraction.  In that story the distraction is brutally imposed; but, as we noted last week, we choose our distractions.  In fact, we embrace our Internet-empowered distractions.  We love to be distracted and we crave diversion.  We can hardly stand to be without distraction or diversion for more than a few moments at a time.  We complain incessantly about our busyness, but were it all to stop we would hardly know what to do with ourselves.  This raises some interesting questions.  Why are we so keen to envelop ourselves in constant distraction?  Why do some of us develop an addictive relationship to the constant flow of distraction?  Why are we so uneasy when the distractions stop?

Back in June, I reflected on the theme of distraction and diversion on the heels of a post about the religious aura that sometimes surrounds our love affair with sports.  We were then, you will remember, at the height of World Cup fever.  I want to revisit some of those same thoughts and tweak them just a little bit as a follow up to Friday’s post on distraction and “Harrison Bergeron.”

Distractedness and the need for diversion are not new phenomena, of course.  Although the condition may now be intensified and heightened, it has been with us at least since the 17th century, and almost certainly before then.  It was in the 17th century that Blaise Pascal began assembling a series of notes on scraps of paper in preparation for a book he never wrote.  When he died at the age of 39, he left behind hundreds of barely organized notes, which were later collected and published under the French title Pensées, or thoughts.  Pascal is today remembered, if at all, either for his law of fluid pressure or for an argument for God’s existence known as Pascal’s Wager.  Neither quite does justice to the depth of his insight into what we used to call the human condition.

Pascal knew that we needed our diversions and distractions and that without them we would be miserable.  His description of the younger generation sounds wholly contemporary:

Anyone who does not see the vanity of the world is very vain himself.  So who does not see it, apart from young people whose lives are all noise, diversions, and thoughts for the future?  But take away their diversion and you will see them bored to extinction.  Then they feel their nullity without recognizing it, for nothing could be more wretched than to be intolerably depressed as soon as one is reduced to introspection with no means of diversion.

But Pascal is not merely an old crank berating a younger generation he fails to understand.  Pascal applies the same analysis indiscriminately.  Young or old, rich or poor, male or female — for Pascal it just comes with being human.  “If our condition were truly happy,” he explains, “we should not need to divert ourselves from thinking about it.”  As things stand, however,

What people want is not the easy peaceful life that allows us to think of our unhappy condition, nor the dangers of war, nor the burdens of office, but the agitation that takes our mind off it and diverts us.  That is why we prefer the hunt to the capture.

We need distractions and diversions to keep us from contemplating our true condition, frail and mortal as it is.  For this reason we cannot stand to be alone with our own thoughts and seek to fill every moment with distraction.  Pascal’s view is admittedly rather grim even as it resonates with our experience.  Yet, Pascal knew there was more than this to the human condition.  There was also love and passion, knowledge and creativity, wonder and courage.  Pascal knew this and he insisted that we recognize both the glory and the misery of humanity:

Let man now judge his own worth, let him love himself, for there is within him a nature capable of good; but that is no reason for him to love the vileness within himself.  Let him despise himself because this capacity remains unfilled; but that is no reason for him to despise this natural capacity.  Let him both hate and love himself; he has within him the capacity for knowing truth and being happy, but he possesses no truth which is either abiding or satisfactory.

Pascal insists that we reckon with all that is good and all that is bad in us.  It is our awareness of the possibility of goodness, however, which heightens our misery.  And, yet again, it is our awareness of our misery that is part of our glory.  In the end Pascal believed that “God alone is man’s true good” and Christ the “via veritas.”  With St. Augustine, whose influence permeates Pascal’s thought, he would have prayed, “Our hearts are restless until they find their rest in Thee.”  Perhaps this is why at times something akin to spirituality and the language of worship suffuses our most prominent and powerful diversions.

Augustine and Pascal in turn both helped shape the thought of 20th-century novelist Walker Percy.  Percy blended Pascalian insight with a touch of existentialism in his best-known novel, The Moviegoer (1960), in which the main character, Binx Bolling, finds himself on a search.  “What is the nature of the search? you ask.”

Really it is very simple, at least for a fellow like me; so simple that it is easily overlooked.  The search is what anyone would undertake if he were not sunk in the everydayness of his own life …. To become aware of the possibility of the search is to be onto something.  Not to be onto something is to be in despair.

Near the middle of the novel, throughout which Bolling has been amassing clues he thinks are somehow related to the search, he despairs:

… when I awake, I awake in the grip of everydayness.  Everydayness is the enemy.  No search is possible.  Perhaps there was a time when everydayness was not too strong and one could break its grip by brute strength.  Now nothing breaks it — but disaster.

However, through a rather tortured relationship with a very broken young woman named Kate whom he has come to love, Binx begins to see grace in the ordinary.  Near the very end of the novel, while he and Kate are sitting at a service station discussing marriage and the worries that still fill Kate’s mind, Binx notices a man coming out of a church.  It is Ash Wednesday.  Binx watches while the man sits in his car looking down at something on the seat beside him.  The man’s presence puzzles Binx:

It is impossible to say why he is here.  Is it part and parcel of the complex business of coming up in the world?  Or is it because he believes that God himself is present here at the corner of Elysian Fields and Bons Enfants?  Or is he here for both reasons:  through some dim dazzling trick of grace, coming for the one and receiving the other as God’s own importunate bonus?  It is impossible to say.

In June, with sports on my mind, I wondered whether, as Pascal would have it, sports were a mere distraction that facilitated our unwillingness to acknowledge our true condition; or, taking a cue from Percy, whether they might be a rupture of the “everydayness,” the ordinariness of our lives, that may awaken us to the possibility of the search.  My sense at the time was that both were on to something; each was a possibility.  Sports can be merely a distraction conducive to living in bad faith, in denial of the truth of our situation.  But at times bursts of grace and beauty appear suddenly and unexpectedly in the midst of our diversion to remind us that we ought to be searching for their source.  “Through some dim dazzling trick of grace, coming for the one” we receive “the other as God’s own importunate bonus.”

Thinking now about the distractions enabled by the Internet, social media, smart phones, and all the rest I wonder if something like the same analysis might also apply.  Do we embrace these distractions as a way of refusing silence and contemplation because we do not care to entertain the thoughts that may come?  Perhaps.  Surely more than this is going on.  Sometimes a moment of carefree distraction is just that.  Is it possible that coming for distraction we might find something more — a real connection with another human being, a new insight, real wisdom, genuine laughter?

I am not so much of a pessimist that I would discount such possibilities.  But I do fear that more often than not our distractions, as Pascal would put it, are diversions that keep us from considering our true condition.  They are part of the “everydayness” of life that is the enemy of the search and might even hide from us the possibility of the search.  To give up on the search, to be unaware of it, is to be in despair.  If it doesn’t feel like despair, is it because, as Kierkegaard put it in a line that opens The Moviegoer, “… the specific character of despair is precisely this:  it is unaware of being despair”?

Perhaps it is also because we are too distracted to notice.  We are the “diverted selves” Percy described in Lost in the Cosmos: The Last Self-Help Book,

In a free and affluent society, self is free to divert itself endlessly from itself.  It works in order to enjoy the diversions that the fruit of one’s labor can purchase.  The pursuit of happiness becomes the pursuit of diversion …