“You know, like when you realize you left your phone at home …”

The discipline of anthropology cut its teeth on the study of cultures that were deemed “primitive” and exotic by the standards of nineteenth-century Western, industrialized society. North American and European nations were themselves undergoing tremendous transformations wrought by the advent of groundbreaking new technologies — the steam engine, railroad, and telegraph, to name just three. These three alone dramatically reordered the realms of industry, transportation, and communication. Altogether they had the effect of ratcheting up the perceived pace of cultural evolution. Meanwhile, the anthropologists studied societies in which change, when it could be perceived at all, appeared to proceed at a glacial pace. Age-old ritual and tradition structured the practice of everyday life, and a widely known body of stories ordered belief and behavior.

“All that is solid melts into air, and all that is holy is profaned …” — so wrote Marx and Engels in 1848. The line evocatively captures the erosive consequences of modernity. The structures of traditional society, then recently made an object of study by the anthropologists, were simultaneously passing out of existence in the “modern” world.

I draw this contrast to point out that our own experience of rapid and disorienting change has a history. However out of sorts we may feel as we pass through what may justly be called the digital revolution, it probably does not quite compare with the sense of displacement engendered by the technological revolutions of the late nineteenth and early twentieth centuries. I still tend to think that the passage from no electricity to near-ubiquitous electrification is more transformative than the passage from no Internet to ubiquitous Internet. (But I could be persuaded otherwise.)

So when, in “You’ve Already Forgotten Yesterday’s Internet,” Philip Bump notes that the Internet is “a stunningly effective accelerant” that has rendered knowledge a “blur,” he is identifying the present position and velocity of a trajectory set in motion long ago. Of course, if Bump is right, and I think he is certainly in the ballpark so far as his diagnosis is concerned, then this history is irrelevant, since no one really remembers it anyway, at least not for long.

Bump begins his brief post by making a joke out of the suggestion that he was going to talk about Herodotus. Who talks about Herodotus? Who even knows who Herodotus was? The joke may ultimately be on us, but Bump is right. The stories that populated the Western imagination for centuries have been largely forgotten. Indeed, as Bump suggests, we can barely keep the last several months in mind, much less the distant past:

“The web creates new shared points of reference every hour, every minute. The growth is exponential, staggering. Online conversation has made reference to things before World War II exotic — and World War II only makes the cut because of Hitler.

“Yesterday morning, an advisor to Mitt Romney made a comment about the Etch-A-Sketch. By mid-afternoon, both of his rivals spoke before audiences with an Etch-A-Sketch in hand. The Democratic National Committee had an ad on the topic the same day. The point of reference was born, spread — and became trite — within hours.”

Bump’s piece is itself over a week old, and I’m probably committing some sort of blogging sin by commenting on it at this point. But I’ll risk offending the digital gods of time and forgetting because he’s neatly captured the feel of Internet culture. This brings us back to the origins of anthropology and the very idea of culture. Whatever we might mean by culture now, it has very little to do with the structures of traditional, “solid” societies that first filled the term with meaning. Our culture, however we might define it, is no longer characterized by the persistence of the past into the present.

I should clarify: our culture is no longer characterized by the acknowledged, normative persistence of the past into the present. By this clarification I’m trying to distinguish between the sense in which the past persists whether we know it or like it, and the sense in which the past persists because it is intentionally brought to bear on the present. The persistence of the past in the former sense is, as far as I can tell, an unavoidable feature of our being time-bound creatures. The latter, however, is a contingent condition that obtained in pre-modern societies to a greater degree, but no longer characterizes modern (or post-modern, if you prefer) society to the same extent.

Notably, our culture no longer trades on a stock of shared stories about the past. Instead (beware, massive generalizations ahead) we have moved into a cultural economy of shared experience. Actually, that’s not quite right either. It’s not so much shared experience as it is a shared existential sensibility: affect.

I am reminded of David Foster Wallace’s comments on what literature can do:

“There’s maybe thirteen things, of which who even knows which ones we can talk about.  But one of them has to do with the sense of, the sense of capturing, capturing what the world feels like to us, in the sort of way that I think that a reader can tell ‘Another sensibility like mine exists.’  Something else feels this way to someone else.  So that the reader feels less lonely.”

Wallace goes on to describe the work of avant-garde or experimental literature as “the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.”

When the objective content of culture, the stories for example, is marginalized for whatever myriad reasons, there still remains the existential level of lived experience, which then becomes the object of analysis and comment. Talk about “what it feels like to be alive” now does the work that shared stories accomplished in older cultural configurations. We’re all meta now because our focus has shifted to our own experience.

Consider the JFK assassination as a point of transition. It may be the first event about which people began to ask and talk in terms of where they were when it transpired. The story becomes about where I was when I heard the news. This is an indicator of a profound cultural shift. The event itself fades into the background as the personal experience of the event moves forward. The objectivity of the event becomes less important than the subjective experience. Perhaps this exemplifies a general societal trend. We may not exchange classical or biblical allusions in our everyday talk, but we can trade accounts of our anxiety and nostalgia that will ring broadly true to others.

We don’t all know the same stories, but we know what it feels like to be alive in a time when information washes over us indiscriminately. The story we share is now about how we can’t believe this or that event is already a year past. It feels as if it were just yesterday, or it feels as if it were much longer ago. In either case, what we feel is that we don’t have a grip on the passage of time or the events carried on the flood. Or we share stories about the anxiety that gripped us when we realized we had left our phone at home. This story resonates. That experience becomes our new form of allusion. It is not an allusion to literature or history; it is an allusion to shared existential angst.

The Wisdom of Gandalf for the Information Age

Tolkien is in the air again. In December of this year, the eagerly awaited first part of Peter Jackson’s The Hobbit will be released. The trailers for the film have kindled a great deal of excitement, and the film promises to be a delight for fans of one of the most beloved stories ever written. By some estimates it is the fourth best-selling book ever. Ahead of it are A Tale of Two Cities in the top spot, The Little Prince, and Tolkien’s own The Lord of the Rings trilogy.

I happily count myself among the Tolkien devotees, and I’ve declared this my very own Year of Tolkien. Basically this means that when I’m not reading something I must read, I’ll be reading through Tolkien’s works and a book or two about Tolkien.

Reading The Fellowship of the Ring several days ago, I was captivated once again by an exchange between the wizard Gandalf and the hobbit Frodo. If you’re familiar with the story, then the following needs no introduction. If you are not, you really ought to be, but here’s what you need to know to make sense of this passage. In The Hobbit, the main character, Bilbo, passes on an opportunity to kill a pitiable but sinister creature known as Gollum. The creature had been corrupted by the Ring of Power, which had been forged by an evil being known as Sauron. It would take too long to explain more, but in the following exchange Gandalf is retelling those events to Bilbo’s nephew Frodo.

Frodo has just proclaimed, “What a pity that Bilbo did not stab that vile creature, when he had a chance!”

Here is the ensuing exchange:

‘Pity? It was Pity that stayed his hand. Pity, and Mercy: not to strike without need …

‘I am sorry,’ said Frodo. ‘But I am frightened; and I do not feel any pity for Gollum.’

‘You have not seen him,’ Gandalf broke in.

‘No, and I don’t want to,’ said Frodo. ‘I can’t understand you. Do you mean to say that you, and the Elves, have let him live on after all those horrible deeds? Now at any rate he is as bad as an Orc, and just an enemy. He deserves death.’

‘Deserves it! I daresay he does. Many that live deserve death. And some that die deserve life. Can you give it to them? Then do not be too eager to deal out death in judgement. For even the very wise cannot see all ends. I have not much hope that Gollum can be cured before he dies, but there is a chance of it. And he is bound up with the fate of the Ring. My heart tells me that he has some part to play yet, for good or ill, before the end; and when that comes, the pity of Bilbo may rule the fate of many — yours not least …’

This is, in my estimation, among the most evocative portions of a body of writing replete with wise and often haunting passages. Among the many things that could be said about this exchange and the many applications one could draw, I will note just one.

Gandalf is renowned for his wisdom; he is, after all, a great wizard. But in what does his wisdom lie? In this particular instance, it lies not in what he knows but in his awareness of the limits of his knowledge. His wisdom issues from an awareness of his ignorance. “Even the very wise cannot see all ends,” he confesses. He does not have much hope for Gollum, but he simply does not know, and he will neither take nor commend an action that would foreclose the possibility of Gollum’s redemption. Moreover, he does not know what part Gollum may play in the unfolding story. For these reasons, which amount to an acknowledgment of his ignorance, Gandalf’s actions and judgments are tempered and measured.

These days we are enthralled by the information at our command. We are awestruck by what we know. The data available to us constitutes an embarrassment of riches. And yet one thing we lack: an awareness of how much we nevertheless do not know. We have forgotten our ignorance and we are judging and acting without the compassion or wisdom of Gandalf because we lack his acute awareness of the limitations of his knowledge.

We do not have to search very far at all to find those who will rush to judgment and act out of a profound arrogance. I will let you supply the examples. They are too many to list in any case. More than likely we need not look further than ourselves.

I have more than once cited T. S. Eliot’s lament from “Choruses from ‘The Rock’”:

“Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?”

Our problem is that we tend to think of the passage from information to knowledge and on to wisdom as a series of aggregations: accumulate enough information and we pass to knowledge; accumulate enough knowledge and we pass to wisdom. The truth is that we pass to wisdom not by the aggregation of information or knowledge, both of which are available as never before; we pass to wisdom by remembering what we do not know. And this, in an age of information, seems to be the one thing we cannot keep in mind.

The Slippery Slope Is Greased By Risk Aversion

In a short post, Walter Russell Mead links to five stories under the heading of “Big Brother Goes Hi-Tech.” Click over for links to stories about face scanners in Japan, phone-monitoring in Iran, “infrared antiriot cameras” in China, and computer chips embedded in the uniforms of school children in Brazil.

The stories from Japan, China, and Iran may be the most significant (and disconcerting) in terms of the reach of the technologies in question, but it was the item from Brazil that caught my attention, perhaps because it involved school children, and we are generally more troubled by problematic developments that directly affect children. The story to which Mead linked appeared in the NY Times, and it amounts to little more than a brief note. Here is the whole of it:

“Grade-school students in a northeastern Brazilian city are using uniforms embedded with computer chips that alert parents if they are cutting classes, the city’s education secretary, Coriolano Moraes, said Thursday. Twenty-thousand students in 25 of Vitória da Conquista’s 213 public schools started using T-shirts with chips this week, Mr. Moraes said. By 2013, all of the city’s 43,000 public school students will be using them, he added. The chips send a text message to the cellphones of parents when their children enter the school or alert the parents if their children have not arrived 20 minutes after classes have begun. The city government invested $670,000 in the project, Mr. Moraes said.”

So what to make of this? It differs from the technologies being deployed in China, Japan, and Iran in that it is being implemented in the light of day.

[Curious side note: I misspelled “implemented” in the sentence above and it was auto-corrected to read “implanted”. Perhaps the computers know something we don’t!]

On the face of it, there is nothing secretive about this program, and I would be surprised if there were not some kind of opt-out provision for parents. Also, from this short notice it is unclear whether the augmented T-shirts can be tracked or whether they simply interact with a sensor on school premises and are inactive outside school grounds. If the technology could be used to track a child’s location outside of school, it would be more problematic.
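If the more limited reading is right, the logic of such a system is simple enough to sketch. What follows is a minimal, purely hypothetical illustration in Python of the sensor-on-premises model; the Times note says nothing about how the system is actually built, so every name, number, and function below is invented.

```python
from datetime import datetime, timedelta

CLASS_START = datetime(2012, 3, 26, 7, 30)  # invented schedule
ABSENCE_WINDOW = timedelta(minutes=20)      # "20 minutes after classes have begun"

def send_sms(phone: str, message: str) -> None:
    # Stand-in for whatever SMS gateway such a system would actually use.
    print(f"SMS to {phone}: {message}")

class AttendanceMonitor:
    def __init__(self, students: dict[str, str]):
        # Maps a uniform chip ID to a parent's phone number.
        self.students = students
        self.checked_in: set[str] = set()

    def on_chip_read(self, chip_id: str, now: datetime) -> None:
        # Triggered when the gate sensor reads a chip; in this model the
        # chip does nothing off school grounds.
        if chip_id in self.students and chip_id not in self.checked_in:
            self.checked_in.add(chip_id)
            send_sms(self.students[chip_id],
                     f"Your child arrived at school at {now:%H:%M}.")

    def check_absences(self, now: datetime) -> None:
        # Run once the absence window has elapsed; alert parents of no-shows.
        if now < CLASS_START + ABSENCE_WINDOW:
            return
        for chip_id, phone in self.students.items():
            if chip_id not in self.checked_in:
                send_sms(phone, "Your child has not arrived 20 minutes "
                                "after the start of classes.")

# Example run with a single (fictional) student:
monitor = AttendanceMonitor({"chip-001": "+55-77-0000-0000"})
monitor.on_chip_read("chip-001", datetime(2012, 3, 26, 7, 25))
monitor.check_absences(datetime(2012, 3, 26, 7, 50))
```

The point of the sketch is simply that, in this model, all of the intelligence sits in the sensor at the school gate; the chip itself tracks nothing.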

Or perhaps it might be more attractive. The same impulse that would sanction these anti-truancy T-shirts, taken further along the path of its own logic, would also seem to sanction technology that tracks a child’s location at all times. It is all about safety and security; at least, that is how it would be presented and justified. It would be the ultimate safeguard against kidnapping. It would also solve, or greatly mitigate, the problem of children wandering off on their own and finding themselves lost. Of course, clothing can be removed from one’s person, which to my mind opens up all kinds of flaws in the Brazilian program. Really, how long will it take clever teenagers to figure out all sorts of ways to circumvent this technology?

Recalling the auto-correct hint, then, it would seem that the answer to this technology’s obvious design flaw would be to embed the chips subcutaneously. We already do it with our pets. Wouldn’t it be far more tragic to lose a child than to lose a pet?

Now, seriously, how outlandish does this sound at the techno-social juncture in which we find ourselves? Think about it. Is it just me, or does it not seem as if we are past the point where we would be shocked by the possibility of implanted chips? I’m sure there is a wide spectrum of opinion on such matters, but the enthusiasts are not exactly on the fringes.

Consider the dynamic that Thomas de Zengotita has labeled “Justin’s Helmet Principle.” Sure, Justin looks ridiculous riding down the street with his training wheels on, more pads than a lineman, and a helmet that makes him look like Marvin the Martian, but do I want the burden of not decking Justin out in this baroque assemblage of safety equipment, only to have him fall and seriously injure himself? No, probably not. So on goes the safety crap.

Did we sense that there was something a little off when we started sending our first graders off to school with cell phones, just a fleeting moment of incongruity perhaps? Maybe. Did we dare risk not giving them the cell phone and having them get lost, or worse, without a way of getting help? Nope. So there goes Johnny with the cell phone.

And in the future we might add: did we find it disconcerting when we first started implanting chips in our children? Definitely. But did we want to risk having them be kidnapped or lost and not being able to find them? No, of course not.

The slippery slope is greased by the oil of risk aversion.

Thoughts?

The Protest Will Be Pervasively Documented

The picture below was taken Wednesday night by Ta-Nehisi Coates, a blogger at The Atlantic. It depicts one slice of the Million Hoodie March for Trayvon Martin that took place in New York City. It is a parable of our age.

Image by Ta-Nehisi Coates (click image for source)

To begin with, as Coates put it in a previous tweet, “Crazy just a week ago this was just a few bloggers reporters and activists.” In the age of digital/social media, things move quickly. A great deal, of course, has already been written about social media and protest, or even revolution, particularly with reference to the Arab Spring. But that is not what I want to focus on here.

What struck me in this image was the double-mirror-like effect of a photograph of people photographing an event. In short, we live in an age of pervasive documentation. What difference does this make? That is the question.

What difference does it make for the marchers? What difference does it make for the spectators present? What difference does it make for those who experience (if that’s the right word) the event through its pervasive documentation? Thorough answers to those questions would probably yield book-length discussions. In the space of a blog post, I’ll offer a few tentative considerations that suggest themselves to me.

First, of course, the march becomes a performance. Naturally, all such protests and marches have always been performative; that is their nature. So perhaps it is best to say that they are now performative in a different key. Or maybe all such actions were, in the past, symbolic actions, and now they are symbolic actions and something more. What is that something more, though? I’m not sure; I’m not a student of protest movements. But I would venture to say that the solidity of the record yielded by pervasive documentation foregrounds the individual dimension of the action. The gaze of the camera hails the participant in their potentially distinguishable individuality.

In other words, if the march were only being witnessed and not documented, the experience of participating would feel more collective. The self is subordinated to the crowd in its collective symbolic gesture. When the symbolic action is being pervasively documented, the self comes to the fore, because my image, my face, becomes potentially distinguishable once the documented record becomes (more) permanent and public.

This is, I should make clear, entirely speculative on my part.

Secondly, for the witnesses present at the scene, those seen documenting the event in Coates’ photograph, the act of documentation changes the experience of witnessing. The witness becomes a documenter, and the two are not equivalent. The consequence, it would seem, is that the event is objectified by the process of documentation. It is no longer an event in whose thrall I am; it is now an object that I seek to capture. This has the effect, to some degree, of stripping the symbolic action of its power. The witness is in some sense in the event, while the documenter stands outside of it. I would also suggest that the documenting stance is in some sense a rationalist, or rationalizing, stance. It creates a different mode of experience: removed, quasi-analytic, detached.

At the same time, however, it may also yield a heightened sense of participation of a different kind. To tweet a picture, for example, or post one to Facebook adds to the totality of the event. The witness has not only witnessed, but also acted in a way that expands the reach of the event. But the action is complex because in a sense the focus is no longer necessarily on the event and its symbolic power, but rather on the witness/documenter’s presence at the event.

Perhaps we could speak of such symbolic actions before pervasive documentation as enacted (rather than performed), and of such actions in an age of pervasive documentation as both enacted and performed. The former locks the participants and witnesses into a mutually constituted field of experience. The latter disintegrates the field into its individual parts, making the participants more conscious of themselves as actors in a spectacle. This is further complicated and heightened when the participants are self-documenting their own participation, which, while not captured in this particular photograph as far as I can tell, undoubtedly happened. In the latter case, the witnesses are likewise distanced from the event as individuals documenting and publicizing it.

The parting question, then, is still: what difference does all of this make? What difference does it make for the power and influence of such events?