Teaching What It Feels Like To Be Alive

… it’s the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.

That is how David Foster Wallace, in Although of Course You End Up Becoming Yourself, contrasted traditional literature, with its coherent narrative and satisfying sense of closure, with experimental or avant-garde literature, which typically exhibits neither.  I’ve been thinking about that contrast since I posted the passage a few weeks ago.  Writing that is experienced as a relief from what it feels like to be alive and writing that reflects what it feels like to be alive — I’m wondering if that same distinction could also be usefully applied to teaching.  Can teaching, in the same way, reflect what it feels like to be alive, rather than be a relief from it?

Literature and teaching are both components of the ongoing, ramshackle project we call our education.   When I am most hopeful about what a teacher can do, I see it as not unlike what a very good book might also accomplish.  We might describe it as the opening up of new and multiple vistas into both the world and ourselves.  A good book offers a challenging engagement with reality, rather than the mere escapism that some literature proffers instead.  To borrow a line from Bridge to Terabithia, good teaching, likewise, pushes students to see beyond their own secret countries, to see and to feel what lies beyond and within.  Of course, on my less hopeful (read, more curmudgeonly) days, I feel that convincing students that a book can work in that way is itself the necessary task.

What, then, might it mean to teach so as to reflect what it feels like to be alive?

For one thing, it involves feeling; it is affective.  It reaches beyond the transfer of information to the mind, and seeks to move the heart as well.  This matters principally because while we go about the work and play of living we tend to lead with our hearts and not with our minds (for better and/or for worse).

But in order to move the heart, the heart must be susceptible to being moved.  The numbness that always threatens to settle on us, as wave upon wave of stimulation washes over us, gently massaging us into a state of mildly amused indifference to reality, must be overcome.  This numbness itself might be self-protective, but, while self-knowledge has a distinguished place in the history of education, self-preservation seems a less noble aspiration.  Teaching that leads to feeling must find a way to break through this self-protective numbness.  Of course, that numbness is itself part of what it feels like to be alive, but it is the part that must first be encountered, acknowledged, and transcended in order to feel all the rest.

Like the artist in Wallace’s view, the teacher has the license and the responsibility

to sit, clench their fists, and make themselves be excruciatingly aware of the stuff that we’re mostly aware of only on a certain level.  And that if the writer [or teacher] does his job right, what he basically does is remind the reader [or student] of how smart the reader [or student] is.

The teacher, like the writer, must themselves be sensitive to what it feels like to be alive so as to teach to that feeling and help students understand it, understand themselves.  Perhaps it is precisely here that teaching has failed students, in the inability to enter into the student’s world so as to speak meaningfully into it.

The trick, of course, is also to do so without falling into the equivalent of what Wallace calls “shitty avant garde,” literature that tries too hard and ignores the reader in its effort to be profound. Trying too hard to achieve this effect without authenticity is fatal.  Likewise with teaching.  Watching Lean on Me or Dead Poet’s Society one too many times will likely do more harm than good.

Good writing and good teaching are both grounded in a deep respect for the reader and the student, not in an inordinate desire to be inspiring.  This is what finally struck me most forcefully in Wallace’s comments.  His work, his estimation of what literature could do, flowed from a remarkable confidence in the reader.  Perhaps then this is also where good teaching must begin, with an equal respect for and confidence in the student.

Social Media and the Arts of Memory

UPDATE: A complete and updated form of the essay begun here is now available here.

Early in The Art of Memory, Frances Yates pauses to envision a “forgotten social habit” of the classical world.  She invites us to wonder, “Who is that man moving slowly in the lonely building, stopping at intervals with an intent face?”  That man, Yates tells us, is a “rhetoric student forming a set of memory loci.”  The rhetoric student would have been creating the architecture of a mental space into which they would then place vivid images imaginatively designed to recollect the themes or words of a prepared oration.  While delivering the oration, the rhetor would navigate the mental space, coming upon each carefully placed image, which triggered their memory accordingly.  This work of designing mental space and populating it with striking images followed the prescriptions of the artificial memory techniques widely practiced in classical antiquity.

What if, however, we updated Yates’ scene by setting it in the present?  The scene would be instantly recognizable as long as we equipped our imagined person with a cell phone.  The stopping at intervals and the intent face would correspond to any of the multiple uses to which an Internet-enabled smart phone may be put:  reading or sending a text message, downloading songs, taking or sending pictures and video, updating social media profiles, or finding directions with GPS, to name but a few.  What is striking is how often these activities would, like that of the ancient rhetor, involve the work of memory.   Much of what cell phones are increasingly used for has very little to do with making a phone call, after all. In fact, one could argue that the calling feature of phones is becoming largely irrelevant.  Cell phones are more likely to be used to access the Internet, send a text message, take a picture, or film a video.  Given these capabilities cell phones have become prosthetic memory devices; to lose a cell phone would be to induce a state of partial amnesia.  Or, it may be better to say it might induce a fear of future amnesia, since our ability to approach the present as a field of potential memories would be undermined.

Social networking sites (SNS) are of special interest because of the way in which they explicitly trade in memory.  As Joanne Garde-Hansen asks in “MyMemories?:  Personal Digital Archive Fever and Facebook,” “If memory and identity can be seen as coterminous and SNSs involve literally digitising ones’ self into being then what is at stake when memories and identities are practiced on Facebook?”

She goes on to add,

It goes without saying that the allure of the site is in its drawing together in one place memory practices: creating photograph albums, sharing photographs, messaging, joining groups and alumni memberships, making ‘friends’ and staying in touch with family.

It would be fair to acknowledge that SNS such as Facebook traffic in more than the allure of memory practices. Nonetheless, the production, maintenance, and retrieval of memory is integral to the practices deployed on SNSs.

Following Jacques Derrida, Garde-Hansen considers Facebook as an instance of the archiving archive. Thus, she points out, the architecture of a SNS such as Facebook is not neutral with respect to the memories it archives.  As Derrida observed,

… the technical structure of the archiving archive also determines the structure of the archivable content even in its very coming into existence and in its relationship to the future.  The archivization produces as much as it records the event.

Garde-Hansen also draws SNSs into the conflict between database and narrative staged by Manovich in The Language of New Media.  In her view, the most significant aspect of Manovich’s analysis of new media for SNSs is the comparison he draws between the visual organization of new media interfaces and spatial montage.  “Manovich’s emphasis upon spatial narrative is,” according to Garde-Hansen, “extremely pertinent to thinking through the emergence of SNSs and how these sites remediate personal and collective memory.” Framed in this way, memory as spatial montage challenges “the rise and dominance of history,” the “power of the written word” to order the past temporally, and the “twentieth century’s emphasis upon the single, unadulterated image (think cinema).”

Derrida’s insight suggests that given the way the architecture of an archive already determines what can in fact be archived, the future record of the past is already impinging upon the present.  Or, put otherwise, the sorts of memories we are able to generate with social media may already be directing our interactions in the present.  (For an insightful discussion of this point anchored on an analysis of faux-vintage photography see Nathan Jurgenson’s, “The Faux-Vintage Photo.”)  Drawing on Manovich’s database/narrative opposition suggests that the visual/spatial mode of ordering memories on SNSs potentially shifts how meaning is derived from memory and how we construct the self.  We’ll explore both of these considerations a little further in subsequent posts.

Returning to the scene suggested by Yates, however, we may also consider SNSs such as Facebook as instances of new artificial memory spaces constructed to supplement and augment the natural memory.  Already in the artificial memory tradition we have memory rendered spatially and visually in a manner that anticipates the representation and organization of memory in SNSs.  Situating SNSs within the long history of spatial and visual memory also affords us the opportunity to consider SNSs in the context of a complex and rich tradition of philosophical reflection.

What emerges is a history of memory practices that alternate between a Platonic focus on memory as the presence of an absence and an Aristotelian emphasis on memory as the record of the past.  Several thematic threads weave this story together, including the opposition of internal memory to memory supported by inscription, the connection between memory and imagination, memory as the index of desire, the related tensions between space and time and between databases and narratives, and the relationship of memory to identity.  Yet for all the complexity those themes introduce, we will begin our next post in this series with a story.

___________________________________________________

Joanne Garde-Hansen’s article is found in Save As… Digital Memories.

Resisting Disposable Reality

Technology and consumerism coalesce to create disposable reality.  Let’s try that idea on for a moment by drawing together observations made about each by Albert Borgmann and William Cavanaugh respectively.

Writing about technological culture, Borgmann distinguished between devices, characterized by a “commodious,” accessible surface and a hidden, opaque machinery below the surface, on the one hand, and what he calls focal things on the other.  Devices are in turn coupled with consumption and focal things are paired with focal practices.  Focal things and practices, according to Borgmann, “gather our world and radiate significance in ways that contrast with the diversion and distraction afforded by commodities.”  In short, we merely use devices while we engage with focal things.

With those distinctions in mind, Borgmann continues, “Generally, a focal thing is concrete and of commanding presence.”   A commanding presence or reality is later opposed to “a pliable or disposable reality.”  Further on still, Borgmann writes, “Material culture in the advanced industrial democracies spans a spectrum from commanding to disposable reality.  The former reality calls forth a life of engagement that is oriented within the physical and social world.  The latter induces a life of distraction that is isolated from the environment and from other people.”  On that last point, bear in mind that Borgmann is writing in the early 2000s before the onset of social media.  (Although it is debatable whether his point still stands.)

Borgmann then addresses his analysis to human desire by noting that:

To the dissolution of commanding reality corresponds on the human side a peculiar restlessness.  Since every item of cyberpresence can be x-rayed, zoomed into, overlayed, and abandoned for another more promising site, human desire is at every point at once satiated, disappointed, and aroused to be once more gorged, left hungry, and spurred on.

Writing about contemporary consumerism, William T. Cavanaugh observes, “What really characterizes consumer culture is not attachment to things but detachment.  People do not hoard money; they spend it.  People do not cling to things; they discard them and buy other things.”  Furthermore, Cavanaugh adds, “Consumerism is not so much about having more as it is about having something else; that’s why it is not simply buying but shopping that is the heart of consumerism.  Buying brings a temporary halt to the restlessness that typifies consumerism.”

Both Borgmann and Cavanaugh have identified an analogous pattern at the heart of both contemporary technology and the consumerist spirit:  both render reality essentially disposable.  Both also note how this disposable quality yields a restlessness or unsettledness that permeates our experience.  This experience of reality as essentially disposable and its attendant restlessness are characteristic of what sociologist Zygmunt Bauman has termed, “liquid modernity.”

Interestingly, one of the focal things identified by Borgmann is the book with its corresponding focal practice, reading.  While Cavanaugh did not make this observation, it seems to me that the book as object is one of the few commodities that resists his analysis of contemporary consumerism.  That is to say that books tend to be purchased and kept.  There are exceptions, of course.  Many books turn out not to be worth keeping.  We trade some books at used book stores for others.  We also now sometimes sell certain books through services provided by Amazon.com and the like.  Nonetheless, I would venture to say that those who purchase books often do so with an eye to keeping them.  Where we would typically encounter detachment, with the book we find a measure of attachment.  In a sea of technological, consumerist flux the book is a fixed point. It is an object that is engaged rather than merely used, possessed rather than readily disposed of; and perhaps, in modest measure, it tacitly alleviates our restlessness.

Perhaps this then provides one angle of approach to the analysis of electronic books and e-readers.  Consider Matt Henderson’s recent observations regarding his children’s experience of “reading” Al Gore’s Our Choice, “Push Pop Press’s highly-anticipated first interactive book.”  Henderson introduced Our Choice to his two children whom he describes as technologically savvy readers.

I showed them Our Choice, and just observed. They quickly figured out the navigation, and discovered all the interactive features. But… they didn’t read the content. Fascinated, they skipped through the book, hunting for the next interactive element, to see how it works. They didn’t completely watch a single video.

When they finished, I asked them to tell me about the book. They described how they could blow on the screen and see the windmill turn, how they could run their fingers across the interactive map and see colors changing. How they could pinch to open and close images. But they couldn’t recall much of what the book was about. They couldn’t recall the message intended to be communicated in any of the info-graphics (though they could recall, in detail, how they worked.)

Run through Borgmann’s grid, this seems to be an instance of the contrast between a focal thing with its attendant practice and a device with its attendant consumption. The Kindle comes off better in Henderson’s analysis, and in his children’s experience, and this makes sense since the Kindle’s interface lends itself more readily to focused engagement.  And yet, the Kindle fails to provide the physical presence of books we keep, which seems not insignificant as we search for anchors in an environment of manufactured restlessness and disposable realities.  To borrow a line from T. S. Eliot, nostalgia for the book in this case is just our pursuit of a “still point of the turning world.”

____________________________________________________________

Borgmann’s comments are drawn from Power Failure.

Cavanaugh’s comments are drawn from Being Consumed.

Henderson’s comments via Alan Jacobs.

Digital Legacies, Online Souls, and the Burdens of Remembering

Here is a story that came out of Melrose, Massachusetts yesterday:

According to the Boston Globe, no fewer than 11 Melrose varsity athletes were recently identified in illegal possession of alcohol or tobacco in photos which first surfaced on Facebook. The photos were taken from the site by a concerned parent, transferred to a thumb drive and submitted to the school’s administration as proof of inappropriate actions by the student body.

By now these stories are not too surprising; the only surprise may be in the fact that people are still being this careless with social media and incriminating or embarrassing images.  But then we realize that we are still learning to cope with what our media has enabled; to paraphrase W. H. Auden, we clearly cannot understand what we can clearly do.  Writing in New Scientist, Sumit Paul-Choudhury recently observed,

Right now, though, we are living through a truly unique period in human history. We have learned how to create vast digital legacies but we don’t yet know how to tidy them up for our successors. The generations that went before us left no digital trace, and the ones that go after might leave nothing but sanitised “authorised biographies”. We will be defined through piecemeal and haphazard collections of our finest and foulest moments.

Paul-Choudhury’s article, “Digital legacy: The fate of your online soul,” explores how we are learning to manage the collection of digital data which becomes our online legacy, or what Hans-Peter Brondmo has evocatively called our “digital soul.”  Paul-Choudhury is, perhaps, more suited than most to write about this issue since six years ago he created a site to commemorate the life of his wife who passed away after a long battle with cancer.  Then it may have seemed a bit strange to wonder about one’s digital legacy; by now there are conferences and organizations devoted to the matter and plenty of media attention as well.

In his article, Paul-Choudhury argues that we have not quite figured out how to manage the wealth of information that will survive us, and he identifies two principal approaches that have evolved in recent years:  the “preservationists” and the “deletionists.”  The names, I suspect, are self-explanatory.  Some argue that we should seek to save and preserve every last bit of data that we can, and others suggest that the more humane approach involves learning how to allow the possibility of forgetting.

The most prominent name on either side is probably Viktor Mayer-Schonberger (this post, incidentally, is apparently evolving a hyphenated name theme), the author of the 2009 book, Delete: The Virtue of Forgetting in the Digital Age which argued for the wisdom of creating tools that enabled the gradual erasure of online data over time.  He recommends that digital data be constructed so as to “digitally rust” to allow for a graceful forgetting that mimics the ordinary workings of human memory.  In his view, the problem is that we are now remembering too much and this finally becomes unbearable.  It reminds me of Paul Ricoeur’s suggestion that while in the  past learning the arts of memory was a prized skill, we may now need to cultivate an art of forgetting.

As we’ve noted before, however, others have pointed out that digital data is not nearly so resilient and durable as we are led to believe.  In fact, unfathomably large amounts of data are lost each year.  Paul-Choudhury cites Marc Weber of the Computer History Museum who observes that, “Digital records are more like an oral tradition than traditional documents, if you don’t copy them regularly, they simply disappear.” Web sites go dark, files are corrupted, servers fail — all of this does render our digital legacy a bit more fragile than we might imagine.

On the whole, however, it does seem as if most of us are generating much more by way of relatively permanent and accessible memories and remembrances than most of our forebears, except perhaps the most assiduous diarists, would have been capable of producing.  Of course, the contrast with the diarist may be precisely the issue at the heart of the debate between the preservationists and the deletionists.  The diarist filters and orders their memories; much of what we are storing is not quite so filtered and ordered.  Perhaps more importantly, the diary was private.  Our digital memories are mostly public.

On the one hand this enhances the memories.  For example, Paul-Choudhury notes that his photos on Flickr “are surrounded by a rich history of social interactions, from groups to which I belong to comments that friends have left about my photos.  That feels  just as much part of my digital soul as the images themselves.”  On the other hand, it is the public nature of the memories that renders them susceptible to malicious use by others and makes it more difficult to curate our online identity.

While it is intriguing to think about the digital legacy that will survive us, it may be more pressing to think about the significance of these issues for us while we remain in the land of the living.  Writing about the nineteenth century in Present Past, Richard Terdiman identified what he called a “crisis of memory” arising from the temporal dislocations associated with the French Revolution.  Memory or history no longer seemed a continuous flow into which modern people could easily place themselves.  Perhaps we may begin to think about a contemporary crisis of memory that arises not from our dislocation from tradition, but from our dislocation from our memories themselves.  Our memory may be at once too present and not present enough.  Now it is not only that we are unable to integrate our own lives with cultural memory, we may be having a hard time integrating our own memories with our experience of life.

It is true that much of our memory has always to some degree been external to us, but sometimes degrees of difference amount to qualitative change.  We have been busily uploading our memories over the last several years, creating an artificial memory of truly unprecedented scope.  The consequences of this exponential growth of memory are as yet unclear, but it may be that we are creating for ourselves a year of cruel Aprils.

April is the cruelest month, breeding
Lilacs out of the dead land, mixing
Memory and desire, stirring
Dull roots with spring rains.

Studied Responses: Reactions to bin Laden’s Death

Image: CNN Belief Blog

In the moments, hours, and days following the announcement of Osama bin Laden’s death I was repeatedly struck by the amount of attention paid to the manner in which Americans were responding to his death.  Almost immediately I began to pick up notes of concerned introspection about the response (e.g., the jubilant crowds gathered at the White House and Ground Zero), and what ought to be the appropriate response.

This introspection appears to have been most pronounced within religious circles.  At Christianity Today, Sarah Pulliam Bailey gathered together Tweets from a number of evangelical Christian leaders and bloggers addressing the question, “How should Christians respond to Osama bin Laden’s death?”  A sizable comment thread formed below the post.  At the religion and media web site, Get Religion, in a post titled “Churches respond to Osama’s death,” we get another round of links to church leaders writing about the appropriate response to the killing of bin Laden.

The topic, however, was also prominent in the more mainstream media.  NPR, for example, ran a short piece titled “Is It Wrong to Celebrate Bin Laden’s Death” and another piece focused on bin Laden’s death titled “Is Celebrating Death Appropriate?”  In the former story we get the following odd piece of reflection:

Laura Cunningham, a 22-year-old Manhattan reveler — gripping a Budweiser in her hand and sitting atop the shoulders of a friend — was part of the crowd at ground zero in the wee hours Monday. As people around her chanted “U-S-A,” Cunningham was struck by the emotional response. She told New York Observer: “It’s weird to celebrate someone’s death. It’s not exactly what we’re here to celebrate, but it’s wonderful that people are happy.”

I say “odd,” because it is not clear that this young lady knew what or why she was celebrating.  “But it’s wonderful that people are happy”?  What?

The NY Times also ran a story titled, “Celebrating a Death: Ugly, Maybe, but Only Human.”  And, finally, in case you are interested, Noam Chomsky would also like you to know about his reaction to Osama’s death, although I imagine you can guess.  Additionally, at CNN’s Belief Blog, you can read “Survey:  Most Americans say it’s wrong to celebrate bin Laden’s death,” and Stephen Prothero’s reflections on the aforementioned survey.  You get the idea.

So all of this strikes me as rather interesting.  For one thing, I can’t really imagine this sort of self-awareness permeating the responses of previous generations to historical events of this sort.  Of course, this may be because this event is sui generis, although I doubt that is quite right.  It seems rather another instance of the self-reflexiveness and self-reference that has become a characteristic of our society.  I might push this further by noting that this post just adds another layer, another mirror, as I reflect on the reflections.  My usual explanation for this hypertrophied self-awareness is the collapse of taken-for-granted social structures and customs and the correlated rise of the liberated, spontaneous self.  The spontaneous self as it turns out is not that spontaneous; rather it is performed.  Performance is studied and aware of itself; conscious of its every response.  Naturally then, we are asking at the cultural level whether our “spontaneous” celebrations were appropriate.  Did we play this part right?

This posture seems to me to lack a certain degree of integrity, in the sense that our way of being in the world is not integrated; very little comes naturally, and our actions all feel rather artificial.  Perhaps especially at those times when we most wish we could just be fully in the moment, we instead feel a certain anxiety about feeling the right way — are we feeling the way we are supposed to be feeling?  However, the integrated self is also somewhat opaque to itself; it is capable of acting literally without thought, and thus perhaps thoughtlessly.

I’ll resist the temptation to provide a concluding paragraph that wraps things up neatly with a fresh insight.  More of an aspiration than a temptation, I suppose, if the insight just isn’t there.