Agency and Embodiment

If you’ve been following this blog over the last few months, you may justly be wondering if there is any unifying thread to what I post. The answer is: sort of. Clearly the vast majority involves “technology” in the broadest sense, but I would say that there are a few more specific unifying threads (in my mind at least). One of those threads will (hopefully) tie together the posts on embodiment, place, and the world’s fairs. How neatly I’m able to tie that thread remains to be seen.

My reading of Carrie Noland’s Agency and Embodiment contributes to this particular thread and I’ll be posting a few excerpts with minimal comment in the coming days (tumblr style). In this post, I’m picking out some portions from the Introduction that will give you a feel for her project and the approach she takes to it. I’m particularly sympathetic to the manner in which she forges a third way through certain perennially intractable oppositions.

One last note before the excerpts. There is also significant rhetorical variety among the sources that I cite on this blog. They range from straightforward, clear journalistic prose to more obscure, academic theoretical writing. Noland’s text veers toward the latter, but for someone dealing with theory in the French phenomenological tradition, she writes with surprising clarity. That said, it will not be everyone’s cup of tea.

Here’s Noland’s statement of her thesis:

“If bodily motility is, as Henri Bergson once claimed, the single most important filtering device in the subject’s negotiations with the external world, then a theory of agency that places movement center stage is essential to understanding how human beings are embodied within — and impress themselves on — their worlds.

The hypothesis I advance in this book is that kinesthetic experience, produced by acts of embodied gesturing, places pressure on the conditioning a body receives, encouraging variations in performance that account for larger innovations in cultural practice that cannot otherwise be explained.”

She adds:

“In these pages I will speak of ‘variations in performance’ and not only instances of ‘resistance,’ in order to avoid the agonistic overtones of Michel Foucault’s highly influential but largely binary account of power, which reduces the field of cultural practices to techniques of ‘strict subjection.'”

In other words, the experience of having, or perhaps better, being a body creates the conditions for the possibility of agency within the fields that operate to constrain and form our subjectivity.

More from Noland:

“Kinesthetic sensations are a particular kind of affect belonging both to the body that precedes our subjectivity (narrowly construed) and the contingent, cumulative subjectivity our body allows us to build over time. Because these sensations are also preserved as memories, they help constitute the ’embodied history of the subject,’ a history stored in gestural ‘I can’s’ that determines in large part how that embodiment will continue to unfold. Kinesthesia allows us to correct recursively, refine, and experiment with the practices we have learned. The knowledge obtained through kinesthesia is thus constitutive of — not tangential to — the process of individuation.”

“You know, like when you realize you left your phone at home …”

The discipline of anthropology cut its teeth on the study of cultures that were deemed “primitive” and exotic by the standards of nineteenth-century Western industrialized society. North American and European nations were themselves undergoing tremendous transformations wrought by the advent of groundbreaking new technologies — the steam engine, railroad, and telegraph, to name just three. These three alone dramatically reordered the realms of industry, transportation, and communication. Altogether they had the effect of ratcheting up the perceived pace of cultural evolution. Meanwhile, the anthropologists studied societies in which change, when it could be perceived at all, appeared to proceed at a glacial pace. Age-old ritual and tradition structured the practice of everyday life, and a widely known body of stories ordered belief and behavior.

“All that is solid melts into air, and all that is holy is profaned …” — so wrote Marx and Engels in 1848. The line evocatively captures the erosive consequences of modernity. The structures of traditional society, then recently made an object of study by the anthropologists, were simultaneously passing out of existence in the “modern” world.

I draw this contrast to point out that our own experience of rapid and disorienting change has a history. However out of sorts we may feel as we pass through what may be justly called the digital revolution, it probably does not quite compare with the sense of displacement engendered by the technological revolutions of the late nineteenth and early twentieth centuries. I still tend to think that the passage from no electricity to near-ubiquitous electrification is more transformative than the passage from no Internet to ubiquitous Internet. (But I could be persuaded otherwise.)

So when, in “You’ve Already Forgotten Yesterday’s Internet,” Philip Bump notes that the Internet is “a stunningly effective accelerant” that has rendered knowledge a “blur,” he is identifying the present position and velocity of a trajectory set in motion long ago. Of course, if Bump is right, and I think he is certainly in the ballpark so far as his diagnosis is concerned, then this history is irrelevant since no one really remembers it anyway, at least not for long.

Bump begins his brief post by making a joke out of the suggestion that he was going to talk about Herodotus. Who talks about Herodotus? Who even knows who Herodotus was? The joke may ultimately be on us, but Bump is right. The stories that populated the Western imagination for centuries have been largely forgotten. Indeed, as Bump suggests, we can barely keep the last several months in mind, much less the distant past:

“The web creates new shared points of reference every hour, every minute. The growth is exponential, staggering. Online conversation has made reference to things before World War II exotic — and World War II only makes the cut because of Hitler.

Yesterday morning, an advisor to Mitt Romney made a comment about the Etch-A-Sketch. By mid-afternoon, both of his rivals spoke before audiences with an Etch-A-Sketch in hand. The Democratic National Committee had an ad on the topic the same day. The point of reference was born, spread — and became trite — within hours.”

Bump’s piece is itself over a week old, and I’m probably committing some sort of blogging sin by commenting on it at this point. But I’ll risk offending the digital gods of time and forgetting because he’s neatly captured the feel of Internet culture. This brings us back to the origins of anthropology and the very idea of culture. Whatever we might mean by culture now, it has very little to do with the structures of traditional, “solid” societies that first filled the term with meaning. Our culture, however we might define it, is no longer characterized by the persistence of the past into the present.

I should clarify: our culture is no longer characterized by the acknowledged, normative persistence of the past into the present. By this clarification I’m trying to distinguish between the sense in which the past persists whether we know it or like it, and the sense in which the past persists because it is intentionally brought to bear on the present. The persistence of the past in the former sense is, as far as I can tell, an unavoidable feature of our being time-bound creatures. The latter, however, is a contingent condition that obtained in pre-modern societies to a greater degree, but no longer characterizes modern (or post-modern, if you prefer) society to the same extent.

Notably, our culture no longer trades on a stock of shared stories about the past. Instead (beware, massive generalizations ahead) we have moved into a cultural economy of shared experience. Actually, that’s not quite right either. It’s not so much shared experience as it is a shared existential sensibility — affect.

I am reminded of David Foster Wallace’s comments on what literature can do:

“There’s maybe thirteen things, of which who even knows which ones we can talk about.  But one of them has to do with the sense of, the sense of capturing, capturing what the world feels like to us, in the sort of way that I think that a reader can tell ‘Another sensibility like mine exists.’  Something else feels this way to someone else.  So that the reader feels less lonely.”

Wallace goes on to describe the work of avant-garde or experimental literature as “the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.”

When the objective content of culture, the stories for example, is marginalized for whatever myriad reasons, there still remains the existential level of lived experience, which then becomes the object of analysis and comment. Talk about “what it feels like to be alive” now does the work shared stories accomplished in older cultural configurations. We’re all meta now because our focus has shifted to our own experience.

Consider the JFK assassination as a point of transition. It may be the first event that prompted people to ask and talk about where they were when the event transpired. The story becomes about where I was when I heard the news. This is an indicator of a profound cultural shift. The event itself fades into the background as the personal experience of the event moves forward. The objectivity of the event becomes less important than the subjective experience. Perhaps this exemplifies a general societal trend. We may not exchange classical or biblical allusions in our everyday talk, but we can trade accounts of our anxiety and nostalgia that will ring broadly true to others.

We don’t all know the same stories, but we know what it feels like to be alive in a time when information washes over us indiscriminately. The story we share is now about how we can’t believe this or that event is already a year past. It feels as if it were just yesterday, or it feels as if it were much longer ago. In either case, what we feel is that we don’t have a grip on the passage of time or the events carried on the flood. Or we share stories about the anxiety that gripped us when we realized we had left our phone at home. This story resonates. That experience becomes our new form of allusion. It is not an allusion to literature or history; it is an allusion to shared existential angst.

The Wisdom of Gandalf for the Information Age

Tolkien is in the air again. In December of this year, the eagerly awaited first part of Peter Jackson’s The Hobbit will be released. The trailers for the film have kindled a great deal of excitement, and the film promises to be a delight for fans of one of the most beloved stories ever written. By some estimates it is the fourth best-selling book ever. Ahead of it are A Tale of Two Cities in the top spot, The Little Prince, and Tolkien’s own The Lord of the Rings trilogy.

I happily count myself among the Tolkien devotees, and I’ve declared this my very own Year of Tolkien. Basically this means that when I’m not reading something I must read, I’ll be reading through Tolkien’s works and a book or two about Tolkien.

Reading The Fellowship of the Ring several days ago, I was captivated once again by an exchange between the wizard Gandalf and the hobbit Frodo. If you’re familiar with the story, then the following needs no introduction. If you are not, you really ought to be, but here’s what you need to know to make sense of this passage. In The Hobbit, the main character, Bilbo, passes on an opportunity to kill a pitiable but sinister creature known as Gollum. The creature had been corrupted by the Ring of Power, which had been forged by an evil being known as Sauron. It would take too long to explain more, but in the following exchange Gandalf is retelling those events to Bilbo’s nephew Frodo.

Frodo has just proclaimed, “What a pity that Bilbo did not stab that vile creature, when he had a chance!”

Here is the ensuing exchange:

‘Pity? It was Pity that stayed his hand. Pity, and Mercy: not to strike without need …

‘I am sorry,’ said Frodo. ‘But I am frightened; and I do not feel any pity for Gollum.’

‘You have not seen him,’ Gandalf broke in.

‘No, and I don’t want to,’ said Frodo. ‘I can’t understand you. Do you mean to say that you, and the Elves, have let him live on after all those horrible deeds? Now at any rate he is as bad as an Orc, and just an enemy. He deserves death.’

‘Deserves it! I daresay he does. Many that live deserve death. And some that die deserve life. Can you give it to them? Then do not be too eager to deal out death in judgement. For even the very wise cannot see all ends. I have not much hope that Gollum can be cured before he dies, but there is a chance of it. And he is bound up with the fate of the Ring. My heart tells me that he has some part to play yet, for good or ill, before the end; and when that comes, the pity of Bilbo may rule the fate of many — yours not least …’

This is, in my estimation, among the most evocative portions of a body of writing replete with wise and often haunting passages. Among the many things that could be said about this exchange and the many applications one could draw, I will note just one.

Gandalf is renowned for his wisdom; he is, after all, a great wizard. But in what does his wisdom lie? In this particular instance, it lies not in what he knows; rather, it lies in his awareness of the limits of his knowledge. His wisdom issues from an awareness of his ignorance. “Even the very wise cannot see all ends,” he confesses. He does not have much hope for Gollum, but he simply does not know, and he will not act, nor commend an action, that would foreclose the possibility of Gollum’s redemption. Moreover, he does not know what part Gollum may play in the unfolding story. For these reasons, which amount to an acknowledgement of his ignorance, Gandalf’s actions and judgments are tempered and measured.

These days we are enthralled by the information at our command. We are awestruck by what we know. The data available to us constitutes an embarrassment of riches. And yet one thing we lack: an awareness of how much we nevertheless do not know. We have forgotten our ignorance and we are judging and acting without the compassion or wisdom of Gandalf because we lack his acute awareness of the limitations of his knowledge.

We do not have to search very far at all to find those who will rush to judgment and act out of a profound arrogance. I will let you supply the examples. They are too many to list in any case. More than likely we need not look further than ourselves.

I have more than once cited T. S. Eliot’s lament from “Choruses from ‘The Rock'”:

“Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?”

Our problem is that we tend to think of the passage from information to knowledge and on to wisdom as a series of aggregations. We accumulate enough information and we pass to knowledge and we accumulate enough knowledge and we pass to wisdom. The truth is that we pass to wisdom not by the aggregation of information or knowledge, both of which are available as never before; we pass to wisdom by remembering what we do not know. And this, in an age of information, seems to be the one thing we cannot keep in mind.

The Slippery Slope Is Greased By Risk Aversion

In a short post, Walter Russell Mead links to five stories under the heading of “Big Brother Goes Hi-Tech.” Click over for links to stories about face scanners in Japan, phone-monitoring in Iran, “infrared antiriot cameras” in China, and computer chips embedded in the uniforms of school children in Brazil.

The stories from Japan, China, and Iran may be the most significant (and disconcerting) in terms of the scope of the reach of the technologies in question, but it was the item from Brazil that caught my attention. Perhaps that is because it involves school children, and we are generally more troubled by problematic developments that directly affect children. The story to which Mead linked appeared in the NY Times and amounts to little more than a brief note. Here is the whole of it:

“Grade-school students in a northeastern Brazilian city are using uniforms embedded with computer chips that alert parents if they are cutting classes, the city’s education secretary, Coriolano Moraes, said Thursday. Twenty-thousand students in 25 of Vitória da Conquista’s 213 public schools started using T-shirts with chips this week, Mr. Moraes said. By 2013, all of the city’s 43,000 public school students will be using them, he added. The chips send a text message to the cellphones of parents when their children enter the school or alert the parents if their children have not arrived 20 minutes after classes have begun. The city government invested $670,000 in the project, Mr. Moraes said.”

So what to make of this? It differs from the technologies being deployed in China, Japan, and Iran in that it is being implemented in the light of day.

[Curious side note: I misspelled “implemented” in the sentence above and it was auto-corrected to read “implanted”. Perhaps the computers know something we don’t!]

On the face of it, there is nothing secretive about this program, and I would be surprised if there were not some kind of opt-out provision for parents. Also, from this short notice it is unclear whether the augmented T-shirts can be tracked or whether they simply interact with a sensor on school premises and are inactive outside of school grounds. If the technology could be used to track the child’s location outside of school, it would be more problematic.

Or perhaps it might be more attractive. The same impulse that would sanction these anti-truancy T-shirts, taken further along the path of its own logic, would also seem to sanction technology that tracks a child’s location at all times. It is all about safety and security, at least that is how it would be presented and justified. It would be the ultimate safeguard against kidnapping. It would also solve, or greatly mitigate, the problem of children wandering off on their own and finding themselves lost. Of course, clothing can be removed from one’s person, which to my mind opens up all kinds of flaws in the Brazilian program. How long will it take clever teenagers to figure out all sorts of ways to circumvent this technology, really?

Recalling the auto-correct hint, then, it would seem that the answer to this technology’s obvious design flaw would be to embed the chips subcutaneously. We already do it with our pets. Wouldn’t it be far more tragic to lose a child than to lose a pet?

Now, seriously, how outlandish does this sound at the techno-social juncture in which we find ourselves? Think about it. Is it just me, or does it not seem as if we are past the point where we would be shocked by the possibility of implanted chips? I’m sure there is a wide spectrum of opinion on such matters, but the enthusiasts are not exactly on the fringes.

Consider the dynamic that Thomas de Zengotita has labeled “Justin’s Helmet Principle.” Sure, Justin looks ridiculous riding down the street with his training wheels on, more pads than a lineman, and a helmet that makes him look like Marvin the Martian, but do I want the burden of not decking out Justin in this baroque assemblage of safety equipment, having him fall, and seriously injure himself? No, probably not. So on goes the safety crap.

Did we sense that there was something a little off when we started sending our first graders off to school with cell phones, just a fleeting moment of incongruity perhaps? Maybe. Did we dare risk not giving them the cell phone and have them get lost or worse without a way of getting help? Nope. So there goes Johnny with the cell phone.

And in the future we might add: did we think it disconcerting when we first started implanting chips in our children? Definitely, but did we want to risk having them be kidnapped or lost and not be able to find them? No, of course not.

The slippery slope is greased by the oil of risk aversion.

Thoughts?