Social Media and the Arts of Memory

UPDATE: A complete and updated form of the essay begun here is now available here.

Early in The Art of Memory, Frances Yates pauses to envision a “forgotten social habit” of the classical world.  She invites us to wonder, “Who is that man moving slowly in the lonely building, stopping at intervals with an intent face?”  That man, Yates tells us, is a “rhetoric student forming a set of memory loci.”  The rhetoric student would have been creating the architecture of a mental space into which they would then place vivid images imaginatively designed to recollect the themes or words of a prepared oration.  While delivering the oration, the rhetor would navigate the mental space, coming upon each carefully placed image, which triggered their memory accordingly.  This work of designing mental space and populating it with striking images followed the prescriptions of the artificial memory techniques widely practiced in classical antiquity.

What if, however, we updated Yates’ scene by setting it in the present?  The scene would be instantly recognizable as long as we equipped our imagined person with a cell phone.  The stopping at intervals and the intent face would correspond to any of the multiple uses to which an Internet-enabled smart phone may be put:  reading or sending a text message, downloading songs, taking or sending pictures and video, updating social media profiles, or finding directions with GPS, to name but a few.  What is striking is how often these activities would, like that of the ancient rhetor, involve the work of memory.  Much of what cell phones are increasingly used for has very little to do with making a phone call, after all.  In fact, one could argue that the calling feature of phones is becoming largely irrelevant; cell phones are more likely to be used to access the Internet, send a text message, take a picture, or film a video.  Given these capabilities, cell phones have become prosthetic memory devices; to lose a cell phone would be to induce a state of partial amnesia.  Or, it may be better to say it might induce a fear of future amnesia, since our ability to approach the present as a field of potential memories would be undermined.

Social networking sites (SNSs) are of special interest because of the way in which they explicitly trade in memory.  As Joanne Garde-Hansen asks in “MyMemories?:  Personal Digital Archive Fever and Facebook,” “If memory and identity can be seen as coterminous and SNSs involve literally digitising one’s self into being then what is at stake when memories and identities are practiced on Facebook?”

She goes on to add,

It goes without saying that the allure of the site is in its drawing together in one place memory practices: creating photograph albums, sharing photographs, messaging, joining groups and alumni memberships, making ‘friends’ and staying in touch with family.

It would be fair to acknowledge that SNSs such as Facebook traffic in more than the allure of memory practices.  Nonetheless, the production, maintenance, and retrieval of memory are integral to the practices deployed on SNSs.

Following Jacques Derrida, Garde-Hansen considers Facebook as an instance of the archiving archive. Thus, she points out, the architecture of a SNS such as Facebook is not neutral with respect to the memories it archives.  As Derrida observed,

… the technical structure of the archiving archive also determines the structure of the archivable content even in its very coming into existence and in its relationship to the future.  The archivization produces as much as it records the event.

Garde-Hansen also draws SNSs into the conflict between database and narrative staged by Manovich in The Language of New Media.  In her view, the most significant aspect of Manovich’s analysis of new media for SNSs is the comparison he draws between the visual organization of new media interfaces and spatial montage.  “Manovich’s emphasis upon spatial narrative is,” according to Garde-Hansen, “extremely pertinent to thinking through the emergence of SNSs and how these sites remediate personal and collective memory.” Framed in this way, memory as spatial montage challenges “the rise and dominance of history,” the “power of the written word” to order the past temporally, and the “twentieth century’s emphasis upon the single, unadulterated image (think cinema).”

Derrida’s insight suggests that given the way the architecture of an archive already determines what can in fact be archived, the future record of the past is already impinging upon the present.  Or, put otherwise, the sorts of memories we are able to generate with social media may already be directing our interactions in the present.  (For an insightful discussion of this point anchored in an analysis of faux-vintage photography, see Nathan Jurgenson’s “The Faux-Vintage Photo.”)  Drawing on Manovich’s database/narrative opposition suggests that the visual/spatial mode of ordering memories on SNSs potentially shifts how meaning is derived from memory and how we construct the self.  We’ll explore both of these considerations a little further in subsequent posts.

Returning to the scene suggested by Yates, however, we may also consider SNSs such as Facebook as instances of new artificial memory spaces constructed to supplement and augment the natural memory.  Already in the artificial memory tradition we have memory rendered spatially and visually in a manner that anticipates the representation and organization of memory in SNSs.  Situating SNSs within the long history of spatial and visual memory also affords us the opportunity to consider SNSs in the context of a complex and rich tradition of philosophical reflection.

What emerges is a history of memory practices that alternate between a Platonic focus on memory as the presence of an absence and an Aristotelian emphasis on memory as the record of the past.  There are several thematic threads that weave this story together including the opposition of internal memory to memory supported by inscription, the connection between memory and imagination, memory as the index of desire, the related tensions between space and time and databases and narratives, and the relationship of memory to identity.  Yet for all the complexity those themes introduce, we will begin our next post in this series with a story.

___________________________________________________

Joanne Garde-Hansen’s article is found in Save As… Digital Memories.

Resisting Disposable Reality

Technology and consumerism coalesce to create disposable reality.  Let’s try that idea on for a moment by drawing together observations made about each by Albert Borgmann and William Cavanaugh respectively.

Writing about technological culture, Borgmann distinguished between devices, characterized by a “commodious,” accessible surface and a hidden, opaque machinery below the surface, on the one hand, and what he calls focal things on the other.  Devices are in turn coupled with consumption, and focal things are paired with focal practices.  Focal things and practices, according to Borgmann, “gather our world and radiate significance in ways that contrast with the diversion and distraction afforded by commodities.”  In short, we merely use devices while we engage with focal things.

With those distinctions in mind, Borgmann continues, “Generally, a focal thing is concrete and of commanding presence.”  A commanding presence or reality is later opposed to “a pliable or disposable reality.”  Further on still, Borgmann writes, “Material culture in the advanced industrial democracies spans a spectrum from commanding to disposable reality.  The former reality calls forth a life of engagement that is oriented within the physical and social world.  The latter induces a life of distraction that is isolated from the environment and from other people.”  On that last point, bear in mind that Borgmann is writing in the early 2000s, before the onset of social media.  (Although it is debatable whether his point still stands.)

Borgmann then addresses his analysis to human desire by noting that:

To the dissolution of commanding reality corresponds on the human side a peculiar restlessness.  Since every item of cyberpresence can be x-rayed, zoomed into, overlayed, and abandoned for another more promising site, human desire is at every point at once satiated, disappointed, and aroused to be once more gorged, left hungry, and spurred on.

Writing about contemporary consumerism, William T. Cavanaugh observes, “What really characterizes consumer culture is not attachment to things but detachment.  People do not hoard money; they spend it.  People do not cling to things; they discard them and buy other things.”  Furthermore, Cavanaugh adds, “Consumerism is not so much about having more as it is about having something else; that’s why it is not simply buying but shopping that is the heart of consumerism.  Buying brings a temporary halt to the restlessness that typifies consumerism.”

Both Borgmann and Cavanaugh have identified an analogous pattern at the heart of both contemporary technology and the consumerist spirit:  both render reality essentially disposable.  Both also note how this disposable quality yields a restlessness or unsettledness that permeates our experience.  This experience of reality as essentially disposable and its attendant restlessness are characteristic of what sociologist Zygmunt Bauman has termed, “liquid modernity.”

Interestingly, one of the focal things identified by Borgmann is the book, with its corresponding focal practice, reading.  While Cavanaugh did not make this observation, it seems to me that the book as object is one of the few commodities that resists his analysis of contemporary consumerism.  That is to say that books tend to be purchased and kept.  There are exceptions, of course.  Many books turn out not to be worth keeping.  We trade some books at used book stores for others.  We also now sometimes sell certain books through services provided by Amazon.com and the like.  Nonetheless, I would venture to say that those who purchase books often do so with an eye to keeping them.  Where we would typically encounter detachment, with the book we find a measure of attachment.  In a sea of technological, consumerist flux the book is a fixed point.  It is an object that is engaged and not merely used; it is possessed rather than readily disposed of; and perhaps, in modest measure, it tacitly alleviates our restlessness.

Perhaps this then provides one angle of approach to the analysis of electronic books and e-readers.  Consider Matt Henderson’s recent observations regarding his children’s experience of “reading” Al Gore’s Our Choice, “Push Pop Press’s highly-anticipated first interactive book.”  Henderson introduced Our Choice to his two children, whom he describes as technologically savvy readers.

I showed them Our Choice, and just observed. They quickly figured out the navigation, and discovered all the interactive features. But… they didn’t read the content. Fascinated, they skipped through the book, hunting for the next interactive element, to see how it works. They didn’t completely watch a single video.

When they finished, I asked them to tell me about the book. They described how they could blow on the screen and see the windmill turn, how they could run their fingers across the interactive map and see colors changing. How they could pinch to open and close images. But they couldn’t recall much of what the book was about. They couldn’t recall the message intended to be communicated in any of the info-graphics (though they could recall, in detail, how they worked.)

Run through Borgmann’s grid, this seems to be an instance of the contrast between a focal thing with its attendant practice and a device with its attendant consumption.  The Kindle comes off better in Henderson’s analysis, and in his children’s experience, and this makes sense since the Kindle’s interface lends itself more readily to focused engagement.  And yet, the Kindle fails to provide the physical presence of the books we keep, which seems not insignificant as we search for anchors in an environment of manufactured restlessness and disposable realities.  To borrow a line from T. S. Eliot, nostalgia for the book in this case is just our pursuit of a “still point of the turning world.”

____________________________________________________________

Borgmann’s comments are drawn from Power Failure.

Cavanaugh’s comments are drawn from Being Consumed.

Henderson’s comments via Alan Jacobs.

Digital Legacies, Online Souls, and the Burdens of Remembering

Here is a story that came out of Melrose, Massachusetts yesterday:

According to the Boston Globe, no fewer than 11 Melrose varsity athletes were recently identified in illegal possession of alcohol or tobacco in photos which first surfaced on Facebook. The photos were taken from the site by a concerned parent, transferred to a thumb drive and submitted to the school’s administration as proof of inappropriate actions by the student body.

By now these stories are not too surprising; the only surprise may be that people are still being this careless with social media and incriminating or embarrassing images.  But then we realize that we are still learning to cope with what our media has enabled; to paraphrase W. H. Auden, we clearly cannot understand what we can clearly do.  Writing in New Scientist, Sumit Paul-Choudhury recently observed,

Right now, though, we are living through a truly unique period in human history. We have learned how to create vast digital legacies but we don’t yet know how to tidy them up for our successors. The generations that went before us left no digital trace, and the ones that go after might leave nothing but sanitised “authorised biographies”. We will be defined through piecemeal and haphazard collections of our finest and foulest moments.

Paul-Choudhury’s article, “Digital legacy: The fate of your online soul,” explores how we are learning to manage the collection of digital data which becomes our online legacy, or what Hans-Peter Brondmo has evocatively called our “digital soul.”  Paul-Choudhury is, perhaps, more suited than most to write about this issue since six years ago he created a site to commemorate the life of his wife, who passed away after a long battle with cancer.  Then it may have seemed a bit strange to wonder about one’s digital legacy; by now there are conferences and organizations devoted to the matter, and plenty of media attention as well.

In his article, Paul-Choudhury argues that we have not quite figured out how to manage the wealth of information that will survive us, and he identifies two principal approaches that have evolved in recent years:  the “preservationists” and the “deletionists.”  The names, I suspect, are self-explanatory.  Some argue that we should seek to save and preserve every last bit of data that we can, and others suggest that the more humane approach involves learning how to allow the possibility of forgetting.

The most prominent name on either side is probably Viktor Mayer-Schönberger (this post, incidentally, is apparently evolving a hyphenated name theme), the author of the 2009 book Delete: The Virtue of Forgetting in the Digital Age, which argued for the wisdom of creating tools that enable the gradual erasure of online data over time.  He recommends that digital data be constructed so as to “digitally rust,” allowing for a graceful forgetting that mimics the ordinary workings of human memory.  In his view, the problem is that we are now remembering too much, and this finally becomes unbearable.  It reminds me of Paul Ricoeur’s suggestion that while in the past learning the arts of memory was a prized skill, we may now need to cultivate an art of forgetting.
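For readers who wonder what “digital rust” might look like in practice, Mayer-Schonberger’s proposal centers on attaching an expiration date to data at the moment it is stored.  Here is a minimal sketch of that idea in Python; the RustingStore class and its method names are my own invention for illustration, not any real system’s design:

```python
import datetime

# An illustrative sketch of "digital rust": every item is stamped with an
# expiration date when it is stored, and a periodic sweep quietly forgets
# whatever has outlived its allotted span.

class RustingStore:
    def __init__(self):
        self._items = []  # list of (expires_at, payload) pairs

    def put(self, payload, lifespan_days):
        """Store an item along with the date on which it should be forgotten."""
        expires_at = datetime.date.today() + datetime.timedelta(days=lifespan_days)
        self._items.append((expires_at, payload))

    def sweep(self, today=None):
        """Discard anything whose expiration date has passed."""
        today = today or datetime.date.today()
        self._items = [(exp, p) for (exp, p) in self._items if exp > today]

    def recall(self):
        """Return the payloads that have not yet rusted away."""
        return [p for (_, p) in self._items]
```

The design point is that forgetting becomes the default: nothing survives unless its lifespan allows it, which inverts the preservationist assumption that everything is kept unless deliberately deleted.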

As we’ve noted before, however, others have pointed out that digital data is not nearly so resilient and durable as we are led to believe.  In fact, unfathomably large amounts of data are lost each year.  Paul-Choudhury cites Marc Weber of the Computer History Museum, who observes, “Digital records are more like an oral tradition than traditional documents; if you don’t copy them regularly, they simply disappear.”  Web sites go dark, files are corrupted, servers fail — all of this does render our digital legacy a bit more fragile than we might imagine.

On the whole, however, it does seem as if most of us are generating much more by way of relatively permanent and accessible memories and remembrances than most of our forebears, except perhaps the most assiduous diarists, would have been capable of producing.  Of course, the contrast with the diarist may be precisely the issue at the heart of the debate between the preservationists and the deletionists.  The diarist filters and orders their memories; much of what we are storing is not quite so filtered and ordered.  Perhaps more importantly, the diary was private.  Our digital memories are mostly public.

On the one hand this enhances the memories.  For example, Paul-Choudhury notes that his photos on Flickr “are surrounded by a rich history of social interactions, from groups to which I belong to comments that friends have left about my photos.  That feels  just as much part of my digital soul as the images themselves.”  On the other hand, it is the public nature of the memories that renders them susceptible to malicious use by others and makes it more difficult to curate our online identity.

While it is intriguing to think about the digital legacy that will survive us, it may be more pressing to think about the significance of these issues for us while we remain in the land of the living.  Writing about the nineteenth century in Present Past, Richard Terdiman identified what he called a “crisis of memory” arising from the temporal dislocations associated with the French Revolution.  Memory or history no longer seemed a continuous flow into which modern people could easily place themselves.  Perhaps we may begin to think about a contemporary crisis of memory that arises not from our dislocation from tradition, but from our dislocation from our memories themselves.  Our memory may be at once too present and not present enough.  Now it is not only that we are unable to integrate our own lives with cultural memory; we may be having a hard time integrating our own memories with our experience of life.

It is true that much of our memory has always to some degree been external to us, but sometimes degrees of difference amount to qualitative change.  We have been busily uploading our memories over the last several years, creating an artificial memory of truly unprecedented scope.  The consequences of this exponential growth of memory are as yet unclear, but it may be that we are creating for ourselves a year of cruel Aprils.

April is the cruelest month, breeding
Lilacs out of the dead land, mixing
Memory and desire, stirring
Dull roots with spring rains.

Twitter Time

“Twitter relies on people’s desire to be the same.” At least that’s what A. C. Goodall claims in a recent New Statesman article, “Is Twitter the Enemy of Self-Expression?”  This is, it would seem, a rather vague and unsubstantiated claim.  In his brief comments, Alan Jacobs writes that Goodall’s piece amounts to “assertions without evidence.”  Jacobs goes on to argue that it is unhelpful to make sweeping claims about something like Twitter which is “a platform and a medium,” rather than an organized, coherent unit with an integral “character.”  A medium or platform is subject to countless implementations by users, and, as the history of technology has shown, these uses are often surprising and unexpected.

On the whole I’m sympathetic to Jacobs’s comments.  His main point echoes Michel de Certeau’s insistence that we pay close attention to the use that consumers make of products.  In de Certeau’s time, the critical focus had fallen on products and producers; consumers were tacitly assumed to be passive and docile recipients/victims of the powers of production.  De Certeau made it a point, especially in The Practice of Everyday Life, to throw light on the multifarious, and often impertinent, uses to which consumers put products.  Likewise, Jacobs is reminding us that generalizations about a medium can be misleading and unhelpful because users put any medium to widely disparate ends.

This is a fair point.  However (and if there weren’t a “however” I wouldn’t be writing this), I’m a bit of a recalcitrant McLuhanist and tend to think that the medium may have its influence regardless of the uses to which it is put.  And perhaps, I might better label myself an Aristotelian McLuhanist, which is to say that I’m tending toward localizing the impact of a medium in the realm of habit and inclination.  The use of a medium over time creates certain habits of mind and body.  These habits of mind and body together yield, in my own way of using this language, a habituated sensibility.  The difficulty this influence poses to critique is that, precisely because it is habituated, it tends to operate below the level of conscious awareness.

I don’t think the focus on use and the attention to the effects of a medium are necessarily mutually exclusive.  Habits, after all, are formed only through significant and repeated use.  Perhaps they are two axes of a grid on which the impact of technology may be plotted.  In any case, it would help to provide an example.

Consider our experience of time.  It seems that the human experience of time, how we sense and process the passage of time, is not a fixed variable of human nature.  My sense is that we habituate ourselves to a certain experience of time and it is difficult to immediately adjust to another mode.  Consider those rare moments when we find ourselves having nothing to do.  How often do we then report that we were unable to just relax; we had the urge to do something, anything.  We were restless precisely at the moment when we could have taken a rest.   Or, at a wider scale, consider the various ways cultures approach time.  We tend to naturalize the Western habits of precise time keeping and partitioning until we enter another culture which operates by a very different set of attitudes toward time.  It would take something much longer than a blog post to explore this fully, but it would seem plausible that certain technologies — some, like the mechanical clock, very old — mold our experience of time.

Bernard Stiegler has commented along similar lines on the media environment and consequent experience of time fostered by television.  To begin with he notes, going back to the establishment of the first press agency in Paris in 1835 near a new telegraph, that the “value of information as commodity drops precipitously with time …”  He goes on to describe industrial time in the following context:

“…. an event becomes an event — it literally takes place — only in being ‘covered.’  Industrial time is always at least coproduced by the media.  ‘Coverage’ — what is to be covered — is determined by criteria oriented toward producing surplus value.  Mass broadcasting is a machine to produce ready-made ideas, ‘cliches.’  Information must be ‘fresh’ and this explains why the ideal for all news organs is the elimination of delay in transmission time.”

To be sure, more than the logic of the medium is at play here, but it may be difficult and beside the point to parse out the logic of the medium from other factors.

The ability to eliminate the delay between event and transmission that characterized industrial time has been radically democratized by digital media.  We are all operating under these conditions now.  You may vaguely remember, by contrast, the time that elapsed between snapping a picture, getting it developed, and finally showing it to others.  That time has collapsed, not only for large news organizations, but for anyone with an Internet-enabled smart phone.  In the interest of creating catchy labels, perhaps we may call this, not industrial time, but Twitter time.  “Twitter” here is just a synecdoche for the ability to immediately capture and broadcast information, an ability that is now widely available.  My guess is that this capacity, admittedly used in various ways, will affect the sensibility that we label our “experience of time.”

Stiegler continues (with my apologies for subjecting you to the rather dense prose):

“With an effect of the real (of presence) resulting from the coincidence of the event and its seizure and with the real-time or ‘live’ transmission resulting from the coincidence of the event and its reception, a new experience of time, collective as well as individual, emerges.  This new time betokens an exit from the properly historical epoch, insofar as the latter is defined by an essentially deferred time — that is, by a constitutive opposition, posited in principle, between the narrative and that which is narrated.  This is why Pierre Nora can claim that the speed of transmission of analog and digital transmissions promotes ‘the immediate to historical status’:

‘Landing on the moon was the model of the modern event.  Its condition remained live retransmission by Telstar . . . . What is proper to the modern event is that it implies an immediately public scene, always accompanied by the reporter-spectator, who sees events taking place.  This ‘voyeurism’ gives to current events both their specificity with regard to history and their already historical feel as immediately out of the past.’

There is a lot to unpack in all of that.  We are all reporter-spectators now.  Deferred time, time between event and narration, is eclipsed. Everything is immediately “out of the past,” or, at least as I understand it, the whole of the past is collapsed into a moment that is not now.  The earthquake and tsunami in Japan, just two months past, might as well have taken place five years ago.  The killing of bin Laden, likewise, will very soon appear to be buried in the indiscriminate past.

Twitter as a medium, used to the point of fostering a habituated sensibility (but regardless of particularized uses), would seem to accelerate this economy of time and expand its province into private life.  It doesn’t create this economy of time, but it does heighten and reinforce its trajectory.  In fact, the relentless flow of the Twitter “timeline” (not an insignificant designation), or better, our effort to keep up with it and make sense of it, may be an apt metaphor for our overall experience of time.

All of this to say that while a medium or platform can be used variously and flexibly, it is not infinitely malleable; a certain underlying logic is more or less fixed, and this logic has its own consequences.  Of course, none of this necessarily amounts to saying Twitter is “bad,” only to note that its use can have consequences.

Speaking of habit, I’m curious whether anyone felt the urge to click the “1 New Tweet” image.

 

How Not to Study the Internet; the Quantified Self; and Digitally Facilitated Quality Time

Posting has been light of late, but will likely pick up in the coming weeks.  In the meantime, here are some items that have crossed my screen recently that might be of interest to regulars.

First, Alex Williams examines digitally enhanced family time in his NY Times piece, “Quality Time, Redefined.”  Williams explores the digitally gathered family, which inhabits the same physical space while exploring individual digital spaces accessed via laptops, tablets, and smart phones.  Physically present to one another, family members are dispersed in their own activities, occasionally sharing something of interest.  According to James Gleick, who is cited by Williams,

In the near future, he said jokingly, “A new skill that will be taught by relationship counselors will be knowing when and how to interrupt one’s loved ones: Is a particular joke you’ve just read on Twitter worth her yanking out her earbuds?”

Despite the obligatory gesture toward “the critics,” Williams is sanguine.  This arrangement beats the passive absorption of television and the forced and artificial feel of planned quality time around games or meals.  It could be just me, but I can’t help but sense that such reassurances are like a tasty tonic into which a tasteless poison has been surreptitiously slipped.

At Crooked Timber, you can find an interesting post, “Against Studying the Internet,” about what is being studied, or what should be studied, when one studies “the Internet.”  Rather than focusing on the immensely large and notoriously amorphous thing we call the Internet, or even more specific things like social networking platforms, it is recommended that the object of study should be the role of causal mechanisms associated with specific technologies.  What does this mean? Read the post and, if this sort of thing interests you, the long, but substantive, comment thread.

Finally, Gary Wolf is interviewed on the Quantified Self.  Wolf and Kevin Kelly, along with others, have been working on a project to make the immense amount of data collected about you in a digital environment work for you.  According to Wolf, “[Y]our data is not for your boss, your teacher, or your doctor — it’s for you.”  Sounds good.  The most obvious applications are, of course, in health care.  Other potential applications?

  • Facial tracking to improve happiness.
  • Cognition tracking to evaluate effects of simple dietary changes on brain function.
  • Food composition tracking to determine ideal protein/carb meal ratios for athletic performance.
  • Concentration tracking to determine effects of coffee on productivity.
  • Proximity tracking to assist in evaluation of mood swings.
  • Mapping of asthma incidents and correlation with humidity, pollen count, and temperature.
  • Energy use tracking to find opportunities for savings.
  • Gas mileage tracking to figure out if driving style matters.

Sounds less good somehow, but in that difficult-to-articulate way I tried to put some words to in my last post.