Disconnected Varieties of Augmented Experience

In a short blog post, “This is What the Future Looks Like”, former Microsoft executive Linda Stone writes:

“Over the last few weeks, I’ve been noticing that about 1/3 of people walking, crossing streets, or standing on the sidewalk, are ON their cell phones.  In most cases, they are not just talking; they are texting or emailing — attention fully focused on the little screen in front of them.  Tsunami warning?  They’d miss it.

“With an iPod, at least as the person listens, they visually attend to where they’re going.  For those walking while texting or sending an email, attention to the world outside of the screen is absent.  The primary intimacy is with the device and its possibilities.”

I suspect that you would be able to offer similar anecdotal evidence. I know that this kind of waking somnambulation characterizes a good part of those making their way about the campus of the very large university I attend.

Stone offered these comments by way of introducing a link to a video that you may have seen making its way around the Internet lately, going viral, I believe we call it. The video documents Jake P. Reilly’s 90-day experiment in disconnection. He called it The Amish Project.

Needless to say, Reilly was very pleased with what he found on the other side of the connected life. Asked in an interview whether this experience changed his life, Reilly had this to say:

“It’s definitely different, but I catch myself doing exactly what I hated. Someone is talking to me and I’m half-listening and reading a text under the table. For me, it’s trying to be more aware of it. It kind of evolved from being about technology to more of just living in the moment. I think that’s what my biggest thing is: There’s not so much chasing for me now. I’m here now, and let’s just enjoy this. You can be comfortable with yourself and not have to go to the crutch of your phone. For me, that’s more what I will take away from this.”

Although not directly addressing Reilly’s experiment, Jason Farman has written a thoughtful piece at The Atlantic that calls into question the link between online connectivity and disconnection from lived experience. In “The Myth of the Disconnected Life”, Farman takes as his foil William Powers’ book, Hamlet’s BlackBerry, which I’ve mentioned here a time or two. In his work, Powers commends the practice of taking Digital Sabbaths. If you’ve been reading this blog since its early days, you may remember my own post on technology Sabbaths from 2010; a post, incidentally, which also cited Linda Stone. It’s a healthy practice and one I don’t implement enough in my own experience.

Farman has good things to say about Powers’ work and technology Sabbaths (in fact, his tone throughout is refreshingly irenic):

“For advocates of the Digital Sabbath, the cellphone is the perfect symbol of the always-on lifestyle that leads to disconnection and distraction. It epitomizes the information overload that accompanies being tethered to digital media. Advocates of Digital Sabbaths note that if you are nose-deep in your smartphone, you are not connecting with the people and places around you in a meaningful way.

Ultimately, the Digital Sabbath is a way to fix lifestyles that have prioritized disconnection and distraction and seeks to replace these skewed priorities with sustained attention on the tangible relationships with those around us.”

Nonetheless, he does find the framing of the issue problematic:

“However, using ‘disconnection’ as a reason to disconnect thoroughly simplifies the complex ways we use our devices while simultaneously fetishizing certain ways of gaining depth. Though the proponents of the Digital Sabbath put forth important ideas about taking breaks from the things that often consume our attention, the reasons they offer typically miss some very significant ways in which our mobile devices are actually fostering a deeper sense of connection to people and places.”

Farman then discusses a variety of mobile apps that in his estimation deepen the experience of place for the smartphone-equipped individual rather than severing them from physical space. His examples include [murmur], Broadcastr, and an app from the Museum of London. The first two of these apps allow users to record and listen to oral histories of the place they find themselves in, and the latter allows users to overlay images of the past onto locations throughout London using their smartphones.

In Farman’s view, these kinds of apps provide a deeper experience of place and so trouble the narrative that simplistically opposes digital devices to connection and authentic experience:

“Promoting this kind of deeper context about a place and its community is something these mobile devices are quite good at offering. A person can live in a location for his or her whole life and never be able to know the full history or context of that place; collecting and distributing that knowledge – no matter how banal – is a way to extend our understanding of a place and gain a deeper connection to its meanings.

Meaning is, after all, found in the practice of a place, in the everyday ways we interact with it and describe it. Currently, that lived practice takes place both in the physical and digital worlds, often through the interface of the smartphone screen.”

Finally, Farman’s concluding paragraph nicely sums up the whole:

“Advocates of the Digital Sabbath have the opportunity to put forth an important message about practices that can transform the pace of everyday life, practices that can offer new perspectives on things taken for granted as well as offering people insights on the social norms that are often disrupted by the intrusion of mobile devices. We absolutely need breaks and distance from our routines to gain new points of view and hopefully understand why it might come as a shock to your partner when you answer a work call at the dinner table. Yet, by conflating mobile media with a lack of meaningful connection and a distracted mind, they do a disservice to the wide range of ways we use our devices, many of which develop deep and meaningful relationships to the spaces we move through and the people we connect with.”

My instinct usually aligns me with Stone and Powers in these sorts of discussions. Yet, Farman makes a very sensible point. I’m all for recognizing complexity and resisting dichotomies that blind us to important dimensions of experience. And it is true that debates about technology do tend to gloss over the use to which technologies are actually put by the people who actually use them.

All of this calls to mind the work of Michel de Certeau on two counts. First, de Certeau made much of the use to which consumers put products. In his time, the critical focus had fallen on the products and producers; consumers were tacitly assumed to be passive and docile recipients/victims of the powers of production. De Certeau made it a point, especially in The Practice of Everyday Life, to throw light on the multifarious, and often impertinent, uses to which consumers put products. In many respects, this also reflects the competing approaches of internalists and social constructionists within the history of technology. For the former, the logic of the device dominates analysis; for the latter, the uses to which devices are put by users. Farman, likewise, is calling us to be attentive to what some users at least are actually doing with their digital technologies.

De Certeau also had a good deal to say about the practice of place, how we experience places and spaces. Some time ago I wrote about one chapter in particular in The Practice of Everyday Life, “Walking in the City”, that explicitly focused on the manner in which memories haunted places. If I may be allowed a little bit of self-plagiarization, let me sum up again the gist of de Certeau’s observations.

Places have a way of absorbing and bearing memories that they then relinquish, bidden or unbidden. The context of walking and moving about spaces leads de Certeau to describe memory as “a sort of anti-museum: it is not localizable.” Where museums gather pieces and artifacts in one location, our memories have dispersed themselves across the landscape; they colonize. Here a memory by that tree, there a memory in that house. De Certeau develops a notion of a veiled remembered reality that lies beneath the visible experience of space.

Places are made up of “moving layers.”  We point, de Certeau says, here and there and say things like, “Here, there used to be a bakery” or “That’s where old lady Dupuis used to live.”  We point to a present place only to evoke an absent reality:  “the places people live in are like the presences of diverse absences.”  Only part of what we point to is there physically; but we’re pointing as well to the invisible, to what can’t be seen by anyone else, which begins to hint at a certain loneliness that attends to memory.

Reality is already augmented. It is freighted with our memories; it comes alive with distant echoes and fleeting images.

Digitally augmented reality functions analogously to what we might call the mentally augmented reality that de Certeau invokes. Digital augmentation also reminds us that places are haunted by memories of what happened there, sometimes to very few, but often to countless many. The digital tools Farman describes bring to light the hauntedness of places. They unveil the ghosts that linger by this place and that.

For me, the first observation that follows is that by contrast with mental augmentation, digital augmentation, as represented by two of the apps Farman describes, is social. In a sense, it appears to lose the loneliness of memory that de Certeau recognized.

De Certeau elaborates on the loneliness of memory when he approvingly cites the following observation: “‘Memories tie us to that place …. It is personal, not interesting to anyone else …’” It is like sharing a dream with another person: its vividness and pain or joy can never be recaptured and represented so as to affect another in the same way you were affected. It is not interesting to anyone else, and so it is with our memories. Others will listen; they will look where you point, but they cannot see what you see.

I wonder, though, if this is not also the case with the stories collected by apps such as [murmur] and Broadcastr. Social media often seeks to make the private of public consequence, but very often it simply isn’t. Farman believes that our understanding of a place and a deeper connection to its meanings is achieved by collecting and distributing knowledge of that place, “no matter how banal.” Perhaps it is that last phrase that gives me pause. What counts as banal is certainly subjective, but that is just the point. The seemingly banal may be deeply meaningful to the one who experienced it, but it strikes me as rather generous to believe that it could be rendered meaningful to others, for whom it remains banal and has no place within the narrative of lived experience out of which meaning arises.

The Museum of London app seems to me to be of a different sort because it links us back, from what I can gather, to a more distant past or a past that is, in fact, of public consequence. In this case, the banality is overcome by distance in time. What was a banal reality of early twentieth-century life, for example, is now foreign and somewhat exotic — it is no longer banal to us.

Wrapped up in this discussion, it seems to me, is the question of how we come to meaningfully experience place — how a space becomes a place, we might say. Mere space becomes a place as its particularities etch themselves into consciousness. As we walk the space again and again and learn to feel our way around it, for example, or as we haunt it with the ghosts of our own experience.

I would not go so far as to say that digital devices necessarily lead to a disconnected or inauthentic experience of place. I would argue, however, that there is a tendency in that direction. The introduction of a digital device does necessarily introduce a phenomenological rupture in our experience of a place. What we do with that device, of course, matters a great deal as Farman rightly insists. But most of what we do does divide our attentiveness and mindfulness, even when it serves to provide information.

Perhaps I am guilty, as Farman puts it, of “fetishizing certain ways of gaining depth.” But I am taken by de Certeau’s conception of walking as a kind of enunciation that artfully actualizes a multitude of possibilities in much the same way that the act of speaking actualizes the countless possibilities latent in language. Like speaking, then, walking, that is, inhabiting a space, is a language with its own rhetoric. Like rhetoric proper, the art of being in a place depends upon an acute attentiveness to the opportunities offered by the space and a deft, improvised actualization of those possibilities. It is this art of being in a place that constitutes a meaningful and memorable augmentation of reality. Unfortunately, the possibility of unfolding this art is undermined by the manner in which our digital devices ordinarily dissolve and distribute the mindfulness that is its precondition.

Oral Social Literacy, Past and Present

Ivan Illich’s In the Vineyard of the Text is an exploration of the evolution of reading and the book in Western Europe during the 12th century. It is focused on Hugh of St. Victor and a well-known work of his titled the Didascalicon, essentially the first guide to the art of reading.

Illich notes, “Before Hugh’s generation, the book is a record of the author’s speech or dictation. After Hugh, increasingly it becomes a repertory of the author’s thought, a screen onto which one projects still unvoiced intentions.”

Illich early on acknowledges his debt to Walter Ong who synthesized a great deal of research on the cultural consequences of the shift from orality to literacy (and later to what Ong called the secondary orality of electronic media).

What is sometimes lost in this schema is the persistence of orality after the emergence of literacy. And not only in the sense that oral cultures existed alongside literate ones, but also in the persistence of orality within literate societies.

A full 1500 years after literacy was effectively internalized into Western society (which is not the same thing as saying that all of those living in Western society were literate), reading remained a fundamentally oral activity. The quotation from Illich above is drawn from a chapter titled, “Recorded Speech to Record of Thought.” That title nicely captures the degree to which writing was understood as a record of oral communication rather than its own distinct medium prior to the period Illich examined.

Here is Illich again on the orality (and corporeality) of literacy through the late medieval period:

“In a tradition of one and a half millennia, the sounding pages are echoed by the resonance  of the moving lips and tongue.  The reader’s ears pay attention, and strain to catch what the reader’s mouth gives forth.  In this manner the sequence of letters translates directly into body movements and patterns nerve impulses.  The lines are a sound track picked up by the mouth and voiced by the reader for his own ear.  By reading, the page is literally embodied, incorporated.”

And here again on the oral and social nature of reading:

“The monastic reader — chanter or mumbler — picks the words from the lines and creates a public social auditory ambience. All those who, with the reader, are immersed in this hearing milieu are equals before the sound … Fifty years after Hugh, typically, this was no longer true. The technical activity of deciphering no longer creates an auditory and, therefore, a social space. The reader then flips through pages. His eyes mirror the two-dimensional page. Soon he will conceive of his own mind in analogy with a manuscript. Reading will become an individualistic activity, intercourse between a self and a page.”

As I read these passages again today, I was reminded of an essay that appeared not too long ago in the Wall Street Journal. In “Is This the Future of Punctuation?”, Henry Hitchings, the author of The Language Wars: A History of Proper English, makes the following observation about newly proposed punctuation marks such as the interrobang:

“Such marks are symptoms of an increasing tendency to punctuate for rhetorical rather than grammatical effect. Instead of presenting syntactical and logical relationships, punctuation reproduces the patterns of speech.”

The emergence of telephony, radio, and television marked the re-emergence of orality following the era of print literacy’s dominance. Ong called this secondary orality. Having appeared after literacy, it was not identical to primary orality, but it nonetheless represented a reemergence of orality and its habits which would now compete with literacy on the cultural stage.

Within the last twenty years, however, a funny thing has happened on the way to the world of secondary orality. Writing or text has reasserted itself. Text messaging, emails, online reading, e-reading, etc. — all of these together mean that most of us are deciphering a lot of text each day. Even our television screens, depending on what we are watching, may be chock full of text, scrolling or otherwise.

But this reemergence of text is marked by orality, as the observation by Hitchings suggests. Can we call this secondary literacy? Is it still useful to speak in terms of literacy and orality?

It has always seemed to me that the orality/literacy distinction got at important historical developments in communication and consciousness. The bare dichotomy glossed over a good deal and it was always in need of qualification, but it was serviceable nonetheless. Secondary orality also pointed to important developments. But now what we appear to have, text having reasserted itself, is a thoroughly blended media environment.

Its chief characteristic is neither its orality nor its literacy. Rather, it is the preponderance of both together — overlapping, interpenetrating, jostling, complementing, conflating.

Interestingly, there has also been a reemergence of social literacy, but it is not tied to the oral as it was in the circumstances described by Illich above. Rather than an orally constituted literate social anchored to physical presence, we have a diffused, literacy-based, image-inflected social often untethered from physical presence.

A social space was, then, constituted by an oral performance of the written text and gathered presences. We have, today, spaces constituted as social by silent reading and the presence of absences.

File all of this under “thinking aloud.” (Except, of course, that it wasn’t!)

“I Was Born To Stand Outside Myself”

Last summer I posted a few thoughts on Mark Helprin’s A Soldier of the Great War which I had then begun reading. As it turned out, I set the book aside for a few months for no particular reason, but I’ve recently returned to it and finished it. I stand by my initial characterization: the novel is a meditation on beauty, and a lovely one at that.

Near the end, the main character, Alessandro, meets up with a talented, intelligent, articulate man named Arturo who is nonetheless thoroughly unsuccessful and unaccomplished. In conversation with Alessandro, Arturo sums up his life thus: “I was born to stand outside myself.”

This struck me as a remarkably evocative line. Now I confess that I’m going to take this in a direction that Helprin neither intended nor imagined. That said, the line immediately suggested to me the manner in which we are made to virtually stand beside ourselves in our (social) media environment.

There are perhaps two senses in which this is true. I’m thinking of the way that our social media profiles stand beside us in a rather apprehensible manner. We can look at them, manipulate them, experience them; they are us but not us. In this sense, I’m still inclined to think of our social media profiles as virtual memory theaters, at least in part.

The other sense is the manner in which the presence of our second self impinges upon our lived experience. We not only stand beside our past self as it is represented online, we also stand beside our potential future self as it will be represented online. This other potential self haunts our present from the future. Several months ago I posted a synopsis of Michel de Certeau’s observations about the way the past layers over certain places and haunts them. Places capture memories and those places cannot be divorced from those memories in lived experience. Perhaps if he were alive today, de Certeau would suggest that the future as well as the past now haunts our present. We live with our virtual memory theaters and we live with future memories as well. We are born to stand beside ourselves, actual and potential. And, perhaps most significantly, the potential self we live with is imagined within the constraints of the platform through which it will be represented.

On this latter point, I would also point you to Nathan Jurgenson’s fine essay reprinted in The Atlantic: “The Facebook Eye.”

It does occur to me that this standing beside ourselves did not itself emerge with the advent of social media. One could argue that it is a function of all representation. Certainly it is a feature of writing itself. Walter Ong made much of the way in which literacy alienates. Among the many separations effected by writing, Ong includes the separation of the known from the knower, the past from the present, and being from time.

So we might conclude that social media only augments a long-standing trajectory. But my sense, as it often is with efforts to situate present phenomena within a historical trajectory, is that this threatens to miss the significance of present developments. Changes of degree may amount to changes in kind. As I’ve heard it put somewhere or other, a hurricane is not merely an unusually strong breeze.

Social media, and the technological ecosystem on which it depends, radically augments the alienation of writing if only by its mere ubiquity. But perhaps more importantly, it does so by making our second self that stands beside us also a public self that presents itself to a myriad of others. It’s the difference between an old-school diary and your Facebook profile. It is no small difference and we are all in the process of sorting out the personal and social consequences.

De Certeau concluded that “haunted places are the only ones people can live in.” Walter Ong was also sanguine about the alienations wrought by writing:

“To say writing is artificial is not to condemn it but to praise it . . . By distancing thought, alienating it from its original habitat in sounded words, writing raises consciousness.  Alienation from a natural milieu can be good for us and indeed is in many ways essential for fuller human life.  To live and to understand fully, we need not only proximity but also distance.  This writing provides for, thereby accelerating the evolution of consciousness as nothing else before it does.”

Now the question is whether the standing beside ourselves effected by social media will carry similarly salubrious consequences. Discuss amongst yourselves. Here’s the prompt to get you going: Has social media continued to accelerate the evolution of consciousness or self-consciousness? Is there a difference?

A Chance to Find Yourself

At The American Scholar you can read William Deresiewicz’s lecture to the plebe class of 2009 at West Point. The lecture is titled “Solitude and Leadership” and it makes an eloquent case for the necessity of solitude, and solitary reading in particular, to the would-be leader.

Throughout the lecture, Deresiewicz draws on Joseph Conrad’s Heart of Darkness, and near the end of his talk he cites the following passage. Speaking of an assistant to the manager of the Central Station, Marlow observes:

“I let him run on, this papier-mâché Mephistopheles and it seemed to me that if I tried I could poke my forefinger through him, and would find nothing inside but a little loose dirt. . . .

It was a great comfort to turn from that chap to . . . the battered, twisted, ruined, tin-pot steamboat. . . . I had expended enough hard work on her to make me love her. No influential friend would have served me better. She had given me a chance to come out a bit—to find out what I could do. No, I don’t like work. I had rather laze about and think of all the fine things that can be done. I don’t like work—no man does—but I like what is in the work,—the chance to find yourself. Your own reality—for yourself, not for others—what no other man can ever know.”

Much to think about in those few short lines. “Papier-mâché Mephistopheles” — what a remarkably apt image for what Arendt would later call the banality of evil. It is also worth reflecting on Conrad’s estimation of work in this passage. He evocatively captures part of what I’ve tried to describe in my posts on the discontents of the frictionless life and disposable reality.

It was, however, the line “for yourself, not for others” that struck me with peculiar force. I’ve written here before about the problems with solipsistic or misanthropic individualism. And it should go without saying that, in some important sense, we certainly ought to think and act for others. But I don’t think this is the sort of thing that Conrad had in mind. Perhaps he was driving at some proto-existentialist pour soi. In any case, what came to my mind was the manner in which a life mediated by social media and smartphones is lived “for others”.

Let me try to clarify. The mediated variety of being “for others” is a form of performance and presentation. What we are doing is constructing and offering an image of ourselves for others to consume. The pictures we post, the items we Like, the tweets we retweet, the status updates, the locations we announce on Foursquare, the music we stream, and dare I say it, the blog posts we write — all of these are “for others” and, at least potentially, “for others” without real regard for them. Others, in the worst forms of this dynamic, are merely an audience that can reflect back to us and reinforce our performance of ourselves. In being “for others” in this sense, we risk being “for ourselves” in the worst way.

There is another, less problematic way of being “for others”. At the risk of oversimplifying, let’s call this an unmediated way of being “for others”. This mode of being for others is not self-consciously focused on performance and presentation. This way of being for others does not reduce others to the status of mirrors reflecting our own image back to us. Others are in this case an end, not a means. We lose ourselves in being for others in this way. We do not offer ourselves for consumption, but we are consumed in the work of being for others. The paradox here is that those who are able to lose themselves in this way tend to have a substantial and steady sense of self. Perhaps because they have been “for themselves” in Conrad’s sense, they have nurtured their convictions and character in solitude so that they can be for others in themselves, that is, “for others” for the sake of others.

Those who are for others only by way of being for themselves finally end up resembling Conrad’s papier-mâché Mephistopheles: we could poke our fingers through them and find nothing but a little dirt. All is surface.

Altogether, we might conclude that there is an important difference between being for others for the sake of being for yourself and being for yourself for the sake of being for others.

The truth, of course, is that these modes of being “for others” are not new and the former certainly does not owe its existence uniquely to social media. The performed self has roots in the emergence of modernity and this mode of being for others has a family resemblance to flattery which has an even older pedigree. But ubiquitous connectivity and social media do the work of amplifying and generalizing the condition. When their use becomes habitual, when we begin to see the world as potential material for social media, then the space to be for ourselves/by ourselves collapses and we find that we are always being for others for our own sake, preoccupied with the presentation of surfaces.

The consequences of this mode of being are good neither for us, nor for others.

Technology and Trust

The best blog posts and essays are the ones that make you think, and that was exactly the effect of PJ Rey’s smart post on Cyborgology, “Trust and Complex Technology: The Cyborg’s Modern Bargain.” After doing some of that thinking, I offered some on-site comments. Because they are related to previous posts on this site (here, here and here), I’m going to also post them here. Before reading them, I recommend you click over to Rey’s post and read the whole thing, which will only take a few minutes.

In case you’re in a hurry, here’s Rey’s argument in brief, as I understand it: We live in a world of increasingly complex technologies, the workings of which the average person is, for the most part, unable to understand. Because we nonetheless use these technologies, often without so much as a second thought, this means that we are trustingly committed to these technologies and the expert systems that generate them. This trust, which presupposes a lack of self-sufficiency, is a fundamentally social dynamic.

In light of that argument, here are my comments, edited somewhat so as to make better sense as a stand-alone statement:

I’m not sure how well this describes the actual existential experience of modern technology. Mostly, I wonder if there is not a distinction between implicit and explicit trust when we talk about the sociability of technology use. In other words, I’m not sure if the users of complex technologies such as the iPhone or social media platforms are consciously deploying what we might meaningfully call “trust” in their use of these technologies and platforms.

We could argue that their use implies a kind of de facto trust in the equipment, but does this amount to the kind of trust that operates in truly social relationships? Trusting systems and trusting persons seem to me to be two different phenomena. And while it’s possible to argue that people set up and run systems, so that finally we’re trusting in them, I would suggest that the system, if it is running well, practically obscures the human element. So if we’ve outsourced our trust, as it were, to expert systems, is it still trust that we are talking about or habituated responses to (mostly) predictable systems?

Historians of technology (and Arthur C. Clarke) have pointed to an affinity between magic and technology that can be traced back to the early modern era. I would suggest that our existential experience of using an iPad, for example, more readily approximates our experience of magic than it does trust among social relations. As I write this it occurs to me that perhaps the analysis assumes the other half of the atomized (political) individual that is being criticized, the rational (economic) actor whose decision making is a fully conscious, calculated affair. In other words, I suspect most people are not consciously rationalizing their use of technologies by invoking trust in the persons who stand behind the technologies, rather they use the technology as if it were a magical tool that “just worked” — Apple products, I think, are paradigmatic here. And, if the trust in persons is not explicit, can it meaningfully be called social?

One last thought. Philosopher Albert Borgmann’s “device paradigm” comes to mind here. In brief, he argues that our tools increasingly offer ease of use, effectiveness, and efficiency, but at the expense of becoming increasingly opaque to the average user — much the same dynamic that you describe. But this opacity, in his view, has the effect of alienating us from the technology, and to some degree from the experiences mediated by those technologies. In part, I suspect, this is precisely because we are made to use that which we don’t even remotely understand, and this is not really trust, but a kind of gamble. Trust, in social relations, is not quite blind faith; it is risky, but not, in most cases, a gamble. Coming back to the airplane, for many people, statistics notwithstanding, boarding an airplane feels like a gamble and involves very little trust at all.

To wrap up, I’m suggesting that we need to phenomenologically parse out “trust”: reasoned, explicit trust; reasonable, implicit trust; unreasonable, implicit trust; unreasonable, explicit trust (blind faith); explicitly untrusting acquiescence; gambles; etc. Our use of technology relies on many of these, but not all of them amount to the kind of trust that might render us meaningfully social.

There you have it; comments, as always, welcome.

Update: The conversation continues at Cyborgology. Click through to read Rey’s response, etc.