“You know, like when you realize you left your phone at home …”

The discipline of anthropology cut its teeth on the study of cultures that were deemed “primitive” and exotic by the standards of nineteenth century Western, industrialized society. North American and European nations were themselves undergoing tremendous transformations wrought by the advent of groundbreaking new technologies — the steam engine, railroad, and telegraph, to name just three. These three alone dramatically reordered the realms of industry, transportation, and communication. Altogether they had the effect of ratcheting up the perceived pace of cultural evolution. Meanwhile, the anthropologists studied societies in which change, when it could be perceived, appeared to proceed at a glacial pace. Age-old ritual and tradition structured the practice of everyday life and a widely known body of stories ordered belief and behavior.

“All that is solid melts into air, and all that is holy is profaned …” — so wrote Marx and Engels in 1848. The line evocatively captures the erosive consequences of modernity. The structures of traditional society, then recently made an object of study by the anthropologists, were simultaneously passing out of existence in the “modern” world.

I draw this contrast to point out that our own experience of rapid and disorienting change has a history. However out of sorts we may feel as we pass through what may be justly called the digital revolution, it probably does not quite compare with the sense of displacement engendered by the technological revolutions of the late nineteenth and early twentieth centuries. I still tend to think that the passage from no electricity to near ubiquitous electrification is more transformative than the passage from no Internet to ubiquitous Internet. (But I could be persuaded otherwise.)

So when, in “You’ve Already Forgotten Yesterday’s Internet,” Philip Bump notes that the Internet is “a stunningly effective accelerant” that has rendered knowledge a “blur,” he is identifying the present position and velocity of a trajectory set in motion long ago. Of course, if Bump is right, and I think he is certainly in the ballpark so far as his diagnosis is concerned, then this history is irrelevant since no one really remembers it anyway, at least not for long.

Bump begins his brief post by making a joke out of the suggestion that he was going to talk about Herodotus. Who talks about Herodotus? Who even knows who Herodotus was? The joke may ultimately be on us, but Bump is right. The stories that populated the Western imagination for centuries have been largely forgotten. Indeed, as Bump suggests, we can barely keep the last several months in mind, much less the distant past:

“The web creates new shared points of reference every hour, every minute. The growth is exponential, staggering. Online conversation has made reference to things before World War II exotic — and World War II only makes the cut because of Hitler.

Yesterday morning, an advisor to Mitt Romney made a comment about the Etch-A-Sketch. By mid-afternoon, both of his rivals spoke before audiences with an Etch-A-Sketch in hand. The Democratic National Committee had an ad on the topic the same day. The point of reference was born, spread — and became trite — within hours.”

Bump’s piece is itself over a week old, and I’m probably committing some sort of blogging sin by commenting on it at this point. But I’ll risk offending the digital gods of time and forgetting because he’s neatly captured the feel of Internet culture. This brings us back, though, to the origins of anthropology and the very idea of culture. Whatever we might mean by culture now, it has very little to do with the structures of traditional, “solid” societies that first filled the term with meaning. Our culture, however we might define it, is no longer characterized by the persistence of the past into the present.

I should clarify: our culture is no longer characterized by the acknowledged, normative persistence of the past into the present. By this clarification I’m trying to distinguish between the sense in which the past persists whether we know it or like it, and the sense in which the past persists because it is intentionally brought to bear on the present. The persistence of the past in the former sense is, as far as I can tell, an unavoidable feature of our being time-bound creatures. The latter, however, is a contingent condition that obtained in pre-modern societies to a greater degree, but no longer characterizes modern (or post-modern, if you prefer) society to the same extent.

Notably, our culture no longer trades on a stock of shared stories about the past. Instead (beware, massive generalizations ahead) we have moved into a cultural economy of shared experience. Actually, that’s not quite right either. It’s not so much shared experience as it is a shared existential sensibility — affect.

I am reminded of David Foster Wallace’s comments on what literature can do:

“There’s maybe thirteen things, of which who even knows which ones we can talk about.  But one of them has to do with the sense of, the sense of capturing, capturing what the world feels like to us, in the sort of way that I think that a reader can tell ‘Another sensibility like mine exists.’  Something else feels this way to someone else.  So that the reader feels less lonely.”

Wallace goes on to describe the work of avant-garde or experimental literature as “the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.”

When the objective content of culture, the stories for example, is marginalized for whatever myriad reasons, there still remains the existential level of lived experience, which then becomes the object of analysis and comment. Talk about “what it feels like to be alive” now does the work shared stories accomplished in older cultural configurations. We’re all meta now because our focus has shifted to our own experience.

Consider the JFK assassination as a point of transition. It may be the first event about which people began to ask and tell one another where they were when it transpired. The story becomes about where I was when I heard the news. This is an indicator of a profound cultural shift. The event itself fades into the background as the personal experience of the event moves forward. The objectivity of the event becomes less important than the subjective experience. Perhaps this exemplifies a general societal trend. We may not exchange classical or biblical allusions in our everyday talk, but we can trade accounts of our anxiety and nostalgia that will ring broadly true to others.

We don’t all know the same stories, but we know what it feels like to be alive in a time when information washes over us indiscriminately. The story we share is now about how we can’t believe this or that event is already a year past. It feels as if it were just yesterday, or it feels as if it were much longer ago. In either case, what we feel is that we don’t have a grip on the passage of time or the events carried on the flood. Or we share stories about the anxiety that gripped us when we realized we had left our phone at home. This story resonates. That experience becomes our new form of allusion. It is not an allusion to literature or history; it is an allusion to shared existential angst.

The Internet & the Youth of Tomorrow: Highlights from the Pew Survey

The Pew Internet & American Life Project conducted “an opt-in, online survey of a diverse but non-random sample of 1,021 technology stakeholders and critics … between August 28 and October 31, 2011.” The survey presented two scenarios for the youth of 2020, asked participants to choose which they thought more likely, and then invited elaboration.

Here are the two scenarios and the responses they garnered:

Some 55% agreed with the statement:

In 2020 the brains of multitasking teens and young adults are “wired” differently from those over age 35 and overall it yields helpful results. They do not suffer notable cognitive shortcomings as they multitask and cycle quickly through personal- and work-related tasks. Rather, they are learning more and they are more adept at finding answers to deep questions, in part because they can search effectively and access collective intelligence via the internet. In sum, the changes in learning behavior and cognition among the young generally produce positive outcomes.

Some 42% agreed with the opposite statement, which posited:

In 2020, the brains of multitasking teens and young adults are “wired” differently from those over age 35 and overall it yields baleful results. They do not retain information; they spend most of their energy sharing short social messages, being entertained, and being distracted away from deep engagement with people and knowledge. They lack deep-thinking capabilities; they lack face-to-face social skills; they depend in unhealthy ways on the internet and mobile devices to function. In sum, the changes in behavior and cognition among the young are generally negative outcomes.

However, the report also noted the following:

While 55% agreed with the statement that the future for the hyperconnected will generally be positive, many who chose that view noted that it is more their hope than their best guess, and a number of people said the true outcome will be a combination of both scenarios.

In all honesty, I am somewhat surprised the results split so evenly. I would have expected the more positive scenario to perform better than it did. The most interesting aspect of the report, however, is of course the excerpts presented from the respondents’ elaborations. Here are a few with some interspersed commentary.

A number of respondents wrote about the skills that will be valued in the emerging information ecosystem:

  • There are concerns about new social divides. “I suspect we’re going to see an increased class division around labor and skills and attention,” said media scholar danah boyd.
  • “The essential skills will be those of rapidly searching, browsing, assessing quality, and synthesizing the vast quantities of information,” wrote Jonathan Grudin, principal researcher at Microsoft. “In contrast, the ability to read one thing and think hard about it for hours will not be of no consequence, but it will be of far less consequence for most people.”

Among the more interesting excerpts was this from Amber Case, cyberanthropologist and CEO of Geoloqi:

  • “The human brain is wired to adapt to what the environment around it requires for survival. Today and in the future it will not be as important to internalize information but to elastically be able to take multiple sources of information in, synthesize them, and make rapid decisions … Memories are becoming hyperlinks to information triggered by keywords and URLs. We are becoming ‘persistent paleontologists’ of our own external memories, as our brains are storing the keywords to get back to those memories and not the full memories themselves.”

I’m still not convinced at all by the argument against internalization. (You can read why here, here, and here.) But she is certainly correct about our “becoming ‘persistent paleontologists’ of our own external memories.” And the point was memorably put as well. We are building vast repositories of external memory and revisiting those stores in ways that are historically novel. We’ve yet to register the long-term consequences.

The notion of our adaptability to new information environments was also raised frequently:

  • Cathy Cavanaugh, an associate professor of educational technology at the University of Florida, noted, “Throughout human history, human brains have elastically responded to changes in environments, society, and technology by ‘rewiring’ themselves. This is an evolutionary advantage and a way that human brains are suited to function.”

This may be true enough, but what is missing from these sorts of statements is any discussion of which environments might be better or worse for human beings. To acknowledge that we adapt is to say nothing about whether or not we ought to adapt. Or, if one insists, Borg-like, that we must adapt or die, there is little discussion about whether this adaptation leaves us on the whole better off. In other words, we ought to be asking whether the environment we are asked to adapt to is more or less conducive to human flourishing. If it is not, then all the talk of adaptation is a thinly veiled fatalism.

Some, however, did make strong and enthusiastic claims for the beneficence of the emerging media environment:

  • “The youth of 2020 will enjoy cognitive ability far beyond our estimates today based not only on their ability to embrace ADHD as a tool but also by their ability to share immediately any information with colleagues/friends and/or family, selectively and rapidly. Technology by 2020 will enable the youth to ignore political limitations, including country borders, and especially ignore time and distance as an inhibitor to communications. There will be heads-up displays in automobiles, electronic executive assistants, and cloud-based services they can access worldwide simply by walking near a portal and engaging with the required method such as an encrypted proximity reader (surely it will not be a keyboard). With or without devices on them, they will communicate with ease, waxing philosophic and joking in the same sentence. I have already seen youths of today between 20 and 35 who show all of these abilities, all driven by and/or enabled by the internet and the services/technologies that are collectively tied to and by it.”

This was one of the more techno-utopian predictions in the survey. The notion of “embracing ADHD as a tool” is itself sufficiently jarring to catch one’s attention. One gets the gist of what the respondent is foreseeing — a society in which cognitive values have been radically re-ordered. Where sustained attention is no longer prized, attention deficit begins to seem like a boon. The claims about the irrelevance of geographic and temporal limits are particularly interesting (or disconcerting). They seemingly make a virtue of disembodied rootlessness. The youth of the future will, in this scenario, be temporally and spatially homeless, virtually dispersed. (The material environment of the future imagined here also invites comparison to the dystopian vision of the film Wall-E.)

Needless to say, not all respondents were nearly so sanguine. Most interestingly, many of the youngest respondents were among the most concerned:

  • A number of the survey respondents who are young people in the under-35 age group—the central focus of this research question—shared concerns about changes in human attention and depth of discourse among those who spend most or all of their waking hours under the influence of hyper connectivity.

This resonates with my experience teaching as well. There’s a palpable unease among many of the most connected with the pace, structure, and psychic consequences of the always-on life. They appear to be discovering through experience what is eloquently put by Annette Liska:

  • Annette Liska, an emerging-technologies design expert, observed, “The idea that rapidity is a panacea for improved cognitive, behavioral, and social function is in direct conflict with topical movements that believe time serves as a critical ingredient in the ability to adapt, collaborate, create, gain perspective, and many other necessary (and desirable) qualities of life. Areas focusing on ‘sustainability’ make a strong case in point: slow food, traditional gardening, hands-on mechanical and artistic pursuits, environmental politics, those who eschew Facebook in favor of rich, active social networks in the ‘real’ world.”

One final excerpt:

  • Martin D. Owens, an attorney and author of Internet Gaming Law, wrote, “Just as with J.R.R. Tolkien’s ring of power, the internet grants power to the individual according to that individual’s wisdom and moral stature. Idiots are free to do idiotic things with it; the wise are free to acquire more wisdom. It was ever thus.”

In fact, the ring in Tolkien’s novels is a wholly corrupting force. The “wisdom and moral stature” of the wearer may only forestall its deleterious effects. The wisest avoided using it at all. I won’t go so far as to suggest that the same applies to the Internet, but I certainly couldn’t let Tolkien be appropriated in the service of a misguided view of technological neutrality.

Flânerie and the Dérive, Online and Off

Had I been really ambitious with yesterday’s post, I would have attempted to draw in a little controversy that flared up earlier this week over the practice of flânerie. You may be wondering what flânerie is, or, if you know what it is, you may be wondering why on earth it would spark controversy.

Short answers today. First, flânerie is the practice of being a flâneur. The flâneur is a figure from the city streets of nineteenth century Paris who made his way through the crowded avenues and arcades watching and observing, engaging in what we could call peripatetic social criticism. He was popularized in the work of the poet Charles Baudelaire and the literary critic Walter Benjamin.

The recent mini-controversy revolved around the practice of cyber-flânerie, playing the part of the flâneur online. The debate was kicked off by Evgeny Morozov’s “The Death of the Cyberflâneur” in the NY Times. Morozov drew a spirited response from Dana Goldstein in a blog post, “Flânerie Lives! On Facebook, Sex, and the Cybercity”.

Morozov is a champion of the open, chaotic web of the early Internet. He is a fierce critic of the neat, closed web of Google, Facebook, and apps. Here is the gist of his argument (although, as always, I encourage you to read the whole piece):

“It’s easy to see, then, why cyberflânerie seemed such an appealing notion in the early days of the Web. The idea of exploring cyberspace as virgin territory, not yet colonized by governments and corporations, was romantic; that romanticism was even reflected in the names of early browsers (“Internet Explorer,” “Netscape Navigator”).

Online communities like GeoCities and Tripod were the true digital arcades of that period, trading in the most obscure and the most peculiar, without any sort of hierarchy ranking them by popularity or commercial value. Back then eBay was weirder than most flea markets; strolling through its virtual stands was far more pleasurable than buying any of the items. For a brief moment in the mid-1990s, it did seem that the Internet might trigger an unexpected renaissance of flânerie.

….

Something similar has happened to the Internet. Transcending its original playful identity, it’s no longer a place for strolling — it’s a place for getting things done. Hardly anyone “surfs” the Web anymore. The popularity of the “app paradigm,” whereby dedicated mobile and tablet applications help us accomplish what we want without ever opening the browser or visiting the rest of the Internet, has made cyberflânerie less likely. That so much of today’s online activity revolves around shopping — for virtual presents, for virtual pets, for virtual presents for virtual pets — hasn’t helped either. Strolling through Groupon isn’t as much fun as strolling through an arcade, online or off.”

Further on, Morozov lays much of the blame on Facebook and the quest for frictionless sharing. All told, it seems to me that his critique centers on the loss of anonymity and the lack of serendipity that characterize the Facebook model of the web experience.

For her part, Goldstein, who has done graduate work on nineteenth century flânerie, questioned whether anonymity was essential to the practice. Morozov cited Zygmunt Bauman’s line, “The art that the flâneur masters is that of seeing without being caught looking.” Goldstein, on the other hand, writes, “The historian Della Pollock, contra Morozov and Bauman, describes flânerie as ‘observing well and…being well worth observing’ in turn.”

Yet, even if we did grant Bauman’s characterization of flânerie, Goldstein still believes that much of what we do online qualifies as such:

“Seeing without being caught looking. Is there any better description for so much of what we do online? Admit it: You’re well acquainted with your significant other’s ex’s Facebook page. You’ve dived deep into the search results for the name of the person you’re dating, the job applicant you’re interviewing, the prospective tenant or roommate. On the dating site OkCupid, you can even pay for the privilege of “enhanced anonymous browsing,” in which you can see who checks out your profile, but no one can see which profiles you’ve looked at yourself. On Facebook, one of the most common spam bots promises to reveal who’s been looking at your profile. It’s so tempting! People click and the spam spreads, but it’s a trick: Facebook conceals users’ browsing histories from one another.”

I should say that I have no particular expertise in the study of the flâneur beyond a little reading of Benjamin here and there. My hunch, however, is that the tradition of flânerie is wide enough to accommodate the readings of both Morozov and Goldstein. Moreover, I suspect that perhaps a metaphor has been unhelpfully reified. After all, cyber-space (who knew we would be using “cyber” again) and physical space are only metaphorically analogous. That metaphor is suggestive and has generated insightful analysis, but it is a metaphor that can be pushed too far. Cyberflânerie and, what shall we call it, brick-and-mortar flânerie — these two are also only metaphorically analogous. Again, it is a rich and suggestive metaphor, but it will have its limits. Phenomenologically, clicking on a link is not quite the same thing as turning a corner. The way each presents itself to us and acts on us is quite different.

Additionally, to complicate matters, it might also be interesting to draw into the conversation related practices such as the dérive popularized by Guy Debord and the Situationists. That, however, will remain only a passing observation, except to note that intention is of great consequence. Why are we engaging in this practice or that? That seems to me to be the question. Stalking, flânerie, and the dérive may have structural similarities (particularly to the outside observer, the one watching the watcher as it were, but not knowing why the watcher watches), but they are distinguished decisively by their intent. Likewise, online analogs should be distinguished according to their intent.

Although, having just written that, it occurs to me that the dérive analogy does have an interesting dimension to offer. Regardless of our intentions, when we go online we do often find ourselves very far afield from whatever our initial reason for going online might have been. This is something the dérive assumes as an integral part of the practice. Also, while varieties of flânerie involve acting to see and to be seen in debatable proportions, the dérive, in analogy with psychotherapy, is altogether less focused on the seeing and being seen. Here is Debord on the dérive:

“One of the basic situationist practices is the dérive, a technique of rapid passage through varied ambiences. Dérives involve playful-constructive behavior and awareness of psychogeographical effects, and are thus quite different from the classic notions of journey or stroll.

“In a dérive one or more persons during a certain period drop their relations, their work and leisure activities, and all their other usual motives for movement and action, and let themselves be drawn by the attractions of the terrain and the encounters they find there. Chance is a less important factor in this activity than one might think: from a dérive point of view cities have psychogeographical contours, with constant currents, fixed points and vortexes that strongly discourage entry into or exit from certain zones.”

This practice seems to better fit (metaphorically still) the online experience of either the early web or its more recent iteration. The difference, it would seem, lies in the object of study. Flânerie oscillates between social criticism and performance. The dérive takes one’s own psychic state as the object of study insofar as it is revealed by the manner in which we negotiate space, online and off.

Disconnected Varieties of Augmented Experience

In a short blog post, “This is What the Future Looks Like”, former Microsoft executive Linda Stone writes:

“Over the last few weeks, I’ve been noticing that about 1/3 of people walking, crossing streets, or standing on the sidewalk, are ON their cell phones.  In most cases, they are not just talking; they are texting or emailing — attention fully focused on the little screen in front of them.  Tsunami warning?  They’d miss it.

With an iPod, at least as the person listens, they visually attend to where they’re going.  For those walking while texting or sending an email, attention to the world outside of the screen is absent.  The primary intimacy is with the device and its possibilities.”

I suspect that you would be able to offer similar anecdotal evidence. I know that this kind of waking somnambulation characterizes a good part of those making their way about the campus of the very large university I attend.

Stone offered these comments by way of introducing a link to a video that you may have seen making its way around the Internet lately, going viral I believe we call it. The video documents Jake P. Reilly’s 90-day experiment in disconnection. He called it The Amish Project and you can watch it unfold here:

Needless to say, Reilly was very pleased with what he found on the other side of the connected life. Asked in an interview whether this experience changed his life, Reilly had this to say:

“It’s definitely different, but I catch myself doing exactly what I hated. Someone is talking to me and I’m half-listening and reading a text under the table. For me, it’s trying to be more aware of it. It kind of evolved from being about technology to more of just living in the moment. I think that’s what my biggest thing is: There’s not so much chasing for me now. I’m here now, and let’s just enjoy this. You can be comfortable with yourself and not have to go to the crutch of your phone. For me, that’s more what I will take away from this.”

Although not directly addressing Reilly’s experiment, Jason Farman has written a thoughtful piece at The Atlantic that calls into question the link between online connectivity and disconnection from lived experience. In “The Myth of the Disconnected Life”, Farman takes as his foil William Powers’ book, Hamlet’s BlackBerry, which I’ve mentioned here a time or two. In his work, Powers commends the practice of taking Digital Sabbaths. If you’ve been reading this blog since its early days, you may remember my own post on technology Sabbaths from 2010, a post which, incidentally, also cited Linda Stone. It’s a healthy practice and one I don’t implement enough in my own experience.

Farman has good things to say about Powers’ work and technology Sabbaths (in fact, his tone throughout is refreshingly irenic):

“For advocates of the Digital Sabbath, the cellphone is the perfect symbol of the always-on lifestyle that leads to disconnection and distraction. It epitomizes the information overload that accompanies being tethered to digital media. Advocates of Digital Sabbaths note that if you are nose-deep in your smartphone, you are not connecting with the people and places around you in a meaningful way.

Ultimately, the Digital Sabbath is a way to fix lifestyles that have prioritized disconnection and distraction and seeks to replace these skewed priorities with sustained attention on the tangible relationships with those around us.”

Nonetheless, he does find the framing of the issue problematic:

“However, using ‘disconnection’ as a reason to disconnect thoroughly simplifies the complex ways we use our devices while simultaneously fetishizing certain ways of gaining depth. Though the proponents of the Digital Sabbath put forth important ideas about taking breaks from the things that often consume our attention, the reasons they offer typically miss some very significant ways in which our mobile devices are actually fostering a deeper sense of connection to people and places.”

Farman then discusses a variety of mobile apps that in his estimation deepen the experience of place for the smartphone-equipped individual rather than severing them from physical space. His examples include [murmur], Broadcastr, and an app from the Museum of London. The first two of these apps allow users to record and listen to oral histories of the places they find themselves in, and the third allows users to overlay images of the past onto locations throughout London using their smartphones.

In Farman’s view, these kinds of apps provide a deeper experience of place and so trouble the narrative that simplistically opposes digital devices to connection and authentic experience:

“Promoting this kind of deeper context about a place and its community is something these mobile devices are quite good at offering. A person can live in a location for his or her whole life and never be able to know the full history or context of that place; collecting and distributing that knowledge – no matter how banal – is a way to extend our understanding of a place and gain a deeper connection to its meanings.

Meaning is, after all, found in the practice of a place, in the everyday ways we interact with it and describe it. Currently, that lived practice takes place both in the physical and digital worlds, often through the interface of the smartphone screen.”

Finally, Farman’s concluding paragraph nicely sums up the whole:

“Advocates of the Digital Sabbath have the opportunity to put forth an important message about practices that can transform the pace of everyday life, practices that can offer new perspectives on things taken for granted as well as offering people insights on the social norms that are often disrupted by the intrusion of mobile devices. We absolutely need breaks and distance from our routines to gain new points of view and hopefully understand why it might come as a shock to your partner when you answer a work call at the dinner table. Yet, by conflating mobile media with a lack of meaningful connection and a distracted mind, they do a disservice to the wide range of ways we use our devices, many of which develop deep and meaningful relationships to the spaces we move through and the people we connect with.”

My instinct usually aligns me with Stone and Powers in these sorts of discussions. Yet, Farman makes a very sensible point. I’m all for recognizing complexity and resisting dichotomies that blind us to important dimensions of experience. And it is true that debates about technology do tend to gloss over the use to which technologies are actually put by the people who actually use them.

All of this calls to mind the work of Michel de Certeau on two counts. First, de Certeau made much of the use to which consumers put products. In his time, the critical focus had fallen on the products and producers; consumers were tacitly assumed to be passive and docile recipients/victims of the powers of production. De Certeau made it a point, especially in The Practice of Everyday Life, to throw light on the multifarious, and often impertinent, uses to which consumers put products. In many respects, this also reflects the competing approaches of internalists and social constructionists within the history of technology. For the former, the logic of the device dominates the analysis; for the latter, the uses to which devices are put by users. Farman, likewise, is calling us to be attentive to what some users at least are actually doing with their digital technologies.

De Certeau also had a good deal to say about the practice of place, how we experience places and spaces. Some time ago I wrote about one chapter in particular in The Practice of Everyday Life, “Walking in the City”, that explicitly focused on the manner in which memories haunt places. If I may be allowed a little bit of self-plagiarization, let me sum up again the gist of de Certeau’s observations.

Places have a way of absorbing and bearing memories that they then relinquish, bidden or unbidden. The context of walking and moving about spaces leads de Certeau to describe memory as “a sort of anti-museum: it is not localizable.” Where museums gather pieces and artifacts in one location, our memories have dispersed themselves across the landscape; they colonize it. Here a memory by that tree, there a memory in that house. De Certeau develops a notion of a veiled remembered reality that lies beneath the visible experience of space.

Places are made up of “moving layers.” We point, de Certeau says, here and there and say things like, “Here, there used to be a bakery” or “That’s where old lady Dupuis used to live.” We point to a present place only to evoke an absent reality: “the places people live in are like the presences of diverse absences.” Only part of what we point to is there physically; we are pointing as well to the invisible, to what can’t be seen by anyone else, which begins to hint at a certain loneliness that attends memory.

Reality is already augmented. It is freighted with our memories; it comes alive with distant echoes and fleeting images.

Digitally augmented reality functions analogously to what we might call the mentally augmented reality that de Certeau invokes. Digital augmentation also reminds us that places are haunted by memories of what happened there, sometimes to very few, but often to countless many. The digital tools Farman describes bring to light the hauntedness of places. They unveil the ghosts that linger by this place and that.

For me, the first observation that follows is that by contrast with mental augmentation, digital augmentation, as represented by two of the apps Farman describes, is social. In a sense, it appears to lose the loneliness of memory that de Certeau recognized.

De Certeau elaborates on the loneliness of memory when he approvingly cites the following observation: “‘Memories tie us to that place …. It is personal, not interesting to anyone else …’” It is like sharing a dream with another person: its vividness and pain or joy can never be recaptured and represented so as to affect another in the same way you were affected. It is not interesting to anyone else, and so it is with our memories. Others will listen, they will look where you point, but they cannot see what you see.

I wonder, though, if this is not also the case with the stories collected by apps such as [murmur] and Broadcastr. Social media often seeks to make the private of public consequence, but very often it simply isn’t. Farman believes that our understanding of a place and a deeper connection to its meanings is achieved by collecting and distributing knowledge of that place, “no matter how banal.” Perhaps it is that last phrase that gives me pause. What counts as banal is certainly subjective, but that is just the point. The seemingly banal may be deeply meaningful to the one who experienced it, but it strikes me as rather generous to believe that what takes on meaning only within the context of one’s own experience can be rendered meaningful to others, for whom it remains banal and has no place within the narrative of lived experience out of which meaning arises.

The Museum of London app seems to me to be of a different sort because it links us back, from what I can gather, to a more distant past or a past that is, in fact, of public consequence. In this case, the banality is overcome by distance in time. What was a banal reality of early twentieth century life, for example, is now foreign and somewhat exotic — it is no longer banal to us.

Wrapped up in this discussion, it seems to me, is the question of how we come to meaningfully experience place — how a space becomes a place, we might say. Mere space becomes a place as its particularities etch themselves into consciousness. As we walk the space again and again and learn to feel our way around it, for example, or as we haunt it with the ghosts of our own experience.

I would not go so far as to say that digital devices necessarily lead to a disconnected or inauthentic experience of place. I would argue, however, that there is a tendency in that direction. The introduction of a digital device does necessarily introduce a phenomenological rupture in our experience of a place. What we do with that device, of course, matters a great deal as Farman rightly insists. But most of what we do does divide our attentiveness and mindfulness, even when it serves to provide information.

Perhaps I am guilty, as Farman puts it, of “fetishizing certain ways of gaining depth.” But I am taken by de Certeau’s conception of walking as a kind of enunciation that artfully actualizes a multitude of possibilities in much the same way that the act of speaking actualizes the countless possibilities latent in language. Like speaking, then, walking (that is, inhabiting a space) is a language with its own rhetoric. Like rhetoric proper, the art of being in a place depends upon an acute attentiveness to the opportunities offered by the space and a deft, improvised actualization of those possibilities. It is this art of being in a place that constitutes a meaningful and memorable augmentation of reality. Unfortunately, the possibility of unfolding this art is undermined by the manner in which our digital devices ordinarily dissolve and distribute the mindfulness that is its precondition.

10 Things Our Kids Will Never Experience Thanks to the Information Revolution

Early in the life of this blog, I linked to a very useful post by Adam Thierer at Technology Liberation Front mapping out a spectrum of attitudes to the Internet ranging from optimism to pessimism with a pragmatic middle in between. The post helpfully positioned a wide variety of contemporary writers and summarized their positions on the social consequences of the Internet. It remains a great starting point for anyone wanting to get their bearings on the public debate over the Internet and its consequences.

Adam subsequently included a link to my post on technology Sabbaths in his list of resources for further reading, and he has since then, on more than one occasion, been kind enough to mention this blog via Twitter. He’s now writing regularly at Forbes and offers excellent commentary on the legal and regulatory issues related to Internet policy.

On the aforementioned spectrum, I believe that Adam positioned himself on the optimistic side of the pragmatic middle. I’ve generally been content to occupy the more pessimistic side. Precisely because of this propensity, I make a point of reading folks like Adam for balance and perspective. I was not surprised then to read Adam’s recent upbeat article, “10 Things Our Kids Will Never Worry About Thanks to the Information Revolution.”

I trust he won’t mind if I offer a view from just the other side of the pragmatic middle. This will work best if you’ve read his post; so click through, give it a read, and then come back to consider my take below.

So here are my more contrarian/pessimistic assessments. The bold-faced numbered items are Adam’s list of things kids will never worry about thanks to digital technology. My thoughts are below each.

1. Taking a typing class.

I took a typing class in ninth grade and as much as I disliked it at the time, I’m extremely grateful for it now. It made college and grad school much less arduous, and has served me well given the countless professional uses of typing (on a computer, of course). Kids may figure out a rough and ready method of typing on their own, but in my experience, this is not nearly as efficient as mastering traditional typing skills. Unless the PC vanishes, expert typing skills will remain an advantage.

2. Paying bills by writing countless checks.

I too write very few checks and have been using online bill pay for years now. But here’s what’s lost: the sense of money as a limited resource that derives from both the use of cash on hand and the arithmetical practice of balancing a checkbook.

3. Buying an expensive set of encyclopedias

I remember rather fondly when the encyclopedia set arrived at my house; over the years I spent quite some time with it. Yes, I was that kid. Never mind that, this is a point that easily segues into the larger debate about digital media and print. Too much to reduce to a brief note; suffice it to say that digital texts have not exactly been linked to a renaissance of secondary education. The price tag for a set was a bit stiff, though, which is probably why I got a World Book set instead of the gold-standard Britannica.

4. Using a pay phone or racking up a big “long distance” bill

No argument on the big bill, but consider that what has been lost here is the salubrious social instinct that conceived of the enclosed phone booth in the first place.

5. Having to pay someone else to develop photographs

Hard to argue against having to pay less, but consider the psychic consequences of the digital camera. We’re obsessive self-documenters now and have never met a scene that wasn’t a picture waiting to happen. And when was the last time you actually printed out digital photos, anyway? Interestingly, vintage photographic equipment is making a comeback in some circles.

6. Driving to the store to rent a movie

Gone with it are the little rites of passage that children enjoy, like being allowed to walk or ride a bike to the video store for the first time, and the subsequent little adventures that such journeys could bring. Of course, it’s not just about the video store. But the trajectory here suggests that we’ll not need to leave our house for much. I think it was Jane Jacobs who noted the necessary socializing influence of the countless personal and face-to-face micro-encounters attending life in the city. Suburbs diminished their number; the convenience of the Internet has reduced them even further.

7. Buying / storing music, movies, or games on physical media

These same objects also functioned as depositories of memories … when they had their own unique, tangible form. Lost also is the art of giving as a gift the perfect album or film that fits your friend and their circumstances just right. No need; an iTunes gift card will do.

8. Having to endlessly search to find unique content

Adam tells us exactly what is lost: “I will never forget the day in the early 1980s when, after a long search, I finally found a rare Led Zeppelin B-Side (“Hey Hey What Can I Do”) on a “45” in a dusty bin at a small record store. It was like winning the lottery!”

9. Sending letters

Guess I’m a nostalgic old-timer. But seriously, tell me who wouldn’t get a thrill from receiving a letter from a friend. Lost is the expectation and the joy of waiting for and receiving a letter. Lost too is the relationship to time entailed by the practice of letter writing and the patience it cultivated.

10. Being without the Internet & instant, ubiquitous connectivity

Lost is solitude, introspection, uninterrupted time with others. But, in fairness, this does bring with it that unique blend of anxiety and obsessiveness that the expectation of being able to instantly communicate engenders when, for whatever reason, it is not immediately successful.

Admittedly, this is hardly intended to be a rigorous sociological analysis of digital culture. The “Never” in the title is hyperbolic, of course. Many of these losses are not total and they are balanced by certain gains. But as I wrote my more pessimistic rejoinders, I did notice a pattern: the tendency to collapse the distance between desire and its fulfillment. We do this either by reducing the distance in time or else the distance in effort. Make something effortless and instant and you simultaneously make it ephemeral and trivial. The consequence is the diminishment of the satisfaction and joy that attends the fulfillment.

If this is true and this pattern holds, then what our kids may never know, or at least know less of, thanks to the information revolution, is abiding and memorable joy and the satisfactions that attend delayed gratification and effort expended toward a desired end.

_______________________________________________________

Be sure to read Adam’s response, “Information Revolutions and Cultural/Economic Tradeoffs.”