Social Media and the Arts of Memory

UPDATE: A complete and updated form of the essay begun here is now available here.

Early in The Art of Memory, Frances Yates pauses to envision a “forgotten social habit” of the classical world.  She invites us to wonder, “Who is that man moving slowly in the lonely building, stopping at intervals with an intent face?”  That man, Yates tells us, is a “rhetoric student forming a set of memory loci.”  The rhetoric student would have been creating the architecture of a mental space into which they would then place vivid images imaginatively designed to recollect the themes or words of a prepared oration.  While delivering the oration, the rhetor would navigate the mental space, coming upon each carefully placed image, which triggered their memory accordingly.  This work of designing mental space and populating the space with striking images followed the prescriptions of the artificial memory techniques widely practiced in classical antiquity.

What if, however, we updated Yates’ scene by setting it in the present?  The scene would be instantly recognizable as long as we equipped our imagined person with a cell phone.  The stopping at intervals and the intent face would correspond to any of the multiple uses to which an Internet-enabled smart phone may be put:  reading or sending a text message, downloading songs, taking or sending pictures and video, updating social media profiles, or finding directions with GPS, to name but a few.  What is striking is how often these activities would, like that of the ancient rhetor, involve the work of memory.  Much of what cell phones are increasingly used for has very little to do with making a phone call, after all.  In fact, one could argue that the calling feature of phones is becoming largely irrelevant.  Cell phones are more likely to be used to access the Internet, send a text message, take a picture, or film a video.  Given these capabilities, cell phones have become prosthetic memory devices; to lose a cell phone would be to induce a state of partial amnesia.  Or, it may be better to say it would induce a fear of future amnesia, since our ability to approach the present as a field of potential memories would be undermined.

Social networking sites (SNS) are of special interest because of the way in which they explicitly trade in memory.  As Joanne Garde-Hansen asks in “MyMemories?:  Personal Digital Archive Fever and Facebook,” “If memory and identity can be seen as coterminous and SNSs involve literally digitising ones’ self into being then what is at stake when memories and identities are practiced on Facebook?”

She goes on to add,

It goes without saying that the allure of the site is in its drawing together in one place memory practices: creating photograph albums, sharing photographs, messaging, joining groups and alumni memberships, making ‘friends’ and staying in touch with family.

It would be fair to acknowledge that SNS such as Facebook traffic in more than the allure of memory practices. Nonetheless, the production, maintenance, and retrieval of memory is integral to the practices deployed on SNSs.

Following Jacques Derrida, Garde-Hansen considers Facebook as an instance of the archiving archive. Thus, she points out, the architecture of a SNS such as Facebook is not neutral with respect to the memories it archives.  As Derrida observed,

… the technical structure of the archiving archive also determines the structure of the archivable content even in its very coming into existence and in its relationship to the future.  The archivization produces as much as it records the event.

Garde-Hansen also draws SNSs into the conflict between database and narrative staged by Lev Manovich in The Language of New Media.  In her view, the most significant aspect of Manovich’s analysis of new media for SNSs is the comparison he draws between the visual organization of new media interfaces and spatial montage.  “Manovich’s emphasis upon spatial narrative is,” according to Garde-Hansen, “extremely pertinent to thinking through the emergence of SNSs and how these sites remediate personal and collective memory.”  Framed in this way, memory as spatial montage challenges “the rise and dominance of history,” the “power of the written word” to order the past temporally, and the “twentieth century’s emphasis upon the single, unadulterated image (think cinema).”

Derrida’s insight suggests that, given the way the architecture of an archive already determines what can in fact be archived, the future record of the past is already impinging upon the present.  Or, put otherwise, the sorts of memories we are able to generate with social media may already be directing our interactions in the present.  (For an insightful discussion of this point anchored in an analysis of faux-vintage photography, see Nathan Jurgenson’s “The Faux-Vintage Photo.”)  Drawing in Manovich’s database/narrative opposition suggests that the visual/spatial mode of ordering memories on SNSs potentially shifts how meaning is derived from memory and how we construct the self.  We’ll explore both of these considerations a little further in subsequent posts.

Returning to the scene suggested by Yates, however, we may also consider SNSs such as Facebook as instances of new artificial memory spaces constructed to supplement and augment the natural memory.  Already in the artificial memory tradition we have memory rendered spatially and visually in a manner that anticipates the representation and organization of memory in SNSs.  Situating SNSs within the long history of spatial and visual memory also affords us the opportunity to consider SNSs in the context of a complex and rich tradition of philosophical reflection.

What emerges is a history of memory practices that alternate between a Platonic focus on memory as the presence of an absence and an Aristotelian emphasis on memory as the record of the past.  There are several thematic threads that weave this story together including the opposition of internal memory to memory supported by inscription, the connection between memory and imagination, memory as the index of desire, the related tensions between space and time and databases and narratives, and the relationship of memory to identity.  Yet for all the complexity those themes introduce, we will begin our next post in this series with a story.

___________________________________________________

Joanne Garde-Hansen’s article is found in Save As… Digital Memories.

Digital Legacies, Online Souls, and the Burdens of Remembering

Here is a story that came out of Melrose, Massachusetts yesterday:

According to the Boston Globe, no fewer than 11 Melrose varsity athletes were recently identified in illegal possession of alcohol or tobacco in photos which first surfaced on Facebook. The photos were taken from the site by a concerned parent, transferred to a thumb drive and submitted to the school’s administration as proof of inappropriate actions by the student body.

By now these stories are not too surprising; the only surprise may be in the fact that people are still being this careless with social media and incriminating or embarrassing images.  But then we realize that we are still learning to cope with what our media has enabled; to paraphrase W. H. Auden, we clearly cannot understand what we can clearly do.  Writing in New Scientist, Sumit Paul-Choudhury recently observed,

Right now, though, we are living through a truly unique period in human history. We have learned how to create vast digital legacies but we don’t yet know how to tidy them up for our successors. The generations that went before us left no digital trace, and the ones that go after might leave nothing but sanitised “authorised biographies”. We will be defined through piecemeal and haphazard collections of our finest and foulest moments.

Paul-Choudhury’s article, “Digital legacy: The fate of your online soul,” explores how we are learning to manage the collection of digital data which becomes our online legacy, or what Hans-Peter Brondmo has evocatively called our “digital soul.”  Paul-Choudhury is, perhaps, more suited than most to write about this issue since six years ago he created a site to commemorate the life of his wife, who passed away after a long battle with cancer.  Then it may have seemed a bit strange to wonder about one’s digital legacy; by now there are conferences and organizations devoted to the matter and plenty of media attention as well.

In his article, Paul-Choudhury argues that we have not quite figured out how to manage the wealth of information that will survive us, and he identifies two principal approaches that have evolved in recent years:  the “preservationists” and the “deletionists.”  The names, I suspect, are self-explanatory.  Some argue that we should seek to save and preserve every last bit of data that we can, and others suggest that the more humane approach involves learning how to allow the possibility of forgetting.

The most prominent name in the debate is probably Viktor Mayer-Schönberger (this post, incidentally, is apparently evolving a hyphenated-name theme), the author of the 2009 book Delete: The Virtue of Forgetting in the Digital Age, which argued for the wisdom of creating tools that enable the gradual erasure of online data over time.  He recommends that digital data be constructed so as to “digitally rust,” allowing for a graceful forgetting that mimics the ordinary workings of human memory.  In his view, the problem is that we are now remembering too much, and this finally becomes unbearable.  It reminds me of Paul Ricoeur’s suggestion that while in the past learning the arts of memory was a prized skill, we may now need to cultivate an art of forgetting.

As we’ve noted before, however, others have pointed out that digital data is not nearly so resilient and durable as we are led to believe.  In fact, unfathomably large amounts of data are lost each year.  Paul-Choudhury cites Marc Weber of the Computer History Museum who observes that, “Digital records are more like an oral tradition than traditional documents, if you don’t copy them regularly, they simply disappear.” Web sites go dark, files are corrupted, servers fail — all of this does render our digital legacy a bit more fragile than we might imagine.

On the whole, however, it does seem as if most of us are generating much more by way of relatively permanent and accessible memories and remembrances than most of our forebears, except perhaps the most assiduous diarists, would have been capable of producing.  Of course, the contrast with the diarist may be precisely the issue at the heart of the debate between the preservationists and the deletionists.  The diarist filters and orders their memories; much of what we are storing is not quite so filtered and ordered.  Perhaps more importantly, the diary was private.  Our digital memories are mostly public.

On the one hand this enhances the memories.  For example, Paul-Choudhury notes that his photos on Flickr “are surrounded by a rich history of social interactions, from groups to which I belong to comments that friends have left about my photos.  That feels  just as much part of my digital soul as the images themselves.”  On the other hand, it is the public nature of the memories that renders them susceptible to malicious use by others and makes it more difficult to curate our online identity.

While it is intriguing to think about the digital legacy that will survive us, it may be more pressing to think about the significance of these issues for us while we remain in the land of the living.  Writing about the nineteenth century in Present Past, Richard Terdiman identified what he called a “crisis of memory” arising from the temporal dislocations associated with the French Revolution.  Memory or history no longer seemed a continuous flow into which modern people could easily place themselves.  Perhaps we may begin to think about a contemporary crisis of memory that arises not from our dislocation from tradition, but from our dislocation from our memories themselves.  Our memory may be at once too present and not present enough.  Now it is not only that we are unable to integrate our own lives with cultural memory; we may be having a hard time integrating our own memories with our experience of life.

It is true that much of our memory has always to some degree been external to us, but sometimes degrees of difference amount to qualitative change.  We have been busily uploading our memories over the last several years, creating an artificial memory of truly unprecedented scope.  The consequences of this exponential growth of memory are as yet unclear, but it may be that we are creating for ourselves a year of cruel Aprils.

April is the cruellest month, breeding
Lilacs out of the dead land, mixing
Memory and desire, stirring
Dull roots with spring rain.

Twitter Time

“Twitter relies on people’s desire to be the same.” At least that’s what A. C. Goodall claims in a recent New Statesman article, “Is Twitter the Enemy of Self-Expression?”  This is, it would seem, a rather vague and unsubstantiated claim.  In his brief comments, Alan Jacobs writes that Goodall’s piece amounts to “assertions without evidence.”  Jacobs goes on to argue that it is unhelpful to make sweeping claims about something like Twitter which is “a platform and a medium,” rather than an organized, coherent unit with an integral “character.”  A medium or platform is subject to countless implementations by users, and, as the history of technology has shown, these uses are often surprising and unexpected.

On the whole I’m sympathetic to Jacobs’s comments.  His main point echoes Michel de Certeau’s insistence that we pay close attention to the use that consumers make of products.  In his time, the critical focus had fallen on the products and producers; consumers were tacitly assumed to be passive and docile recipients/victims of the powers of production.  De Certeau made it a point, especially in The Practice of Everyday Life, to throw light on the multifarious, and often impertinent, uses to which consumers put products.  Likewise, Jacobs is reminding us that generalizations about a medium can be misleading and unhelpful because users put any medium to widely disparate ends.

This is a fair point.  However (and if there weren’t a “however” I wouldn’t be writing this), I’m a bit of a recalcitrant McLuhanist and tend to think that the medium may have its influence regardless of the uses to which it is put.  And perhaps I might better label myself an Aristotelian McLuhanist, which is to say that I tend toward localizing the impact of a medium in the realm of habit and inclination.  The use of a medium over time creates certain habits of mind and body.  These habits of mind and body together yield, in my own way of using this language, a habituated sensibility.  The difficulty this influence poses to critique is that, precisely because it is habituated, it tends to operate below the level of conscious awareness.

I don’t think the focus on use and the attention to the effects of a medium are necessarily mutually exclusive.  Habits, after all, are only formed through significant and repeated use.  Perhaps they are two axes of a grid on which the impact of technology may be plotted.  In any case, it would help to provide an example.

Consider our experience of time.  It seems that the human experience of time, how we sense and process the passage of time, is not a fixed variable of human nature.  My sense is that we habituate ourselves to a certain experience of time and it is difficult to immediately adjust to another mode.  Consider those rare moments when we find ourselves having nothing to do.  How often do we then report that we were unable to just relax; we had the urge to do something, anything.  We were restless precisely at the moment when we could have taken a rest.   Or, at a wider scale, consider the various ways cultures approach time.  We tend to naturalize the Western habits of precise time keeping and partitioning until we enter another culture which operates by a very different set of attitudes toward time.  It would take something much longer than a blog post to explore this fully, but it would seem plausible that certain technologies — some, like the mechanical clock, very old — mold our experience of time.

Bernard Stiegler has commented along similar lines on the media environment and the consequent experience of time fostered by television.  To begin with, he notes, going back to the establishment of the first press agency in Paris in 1835 near a new telegraph, that the “value of information as commodity drops precipitously with time …”  He goes on to describe industrial time in the following context:

“…. an event becomes an event — it literally takes place — only in being ‘covered.’  Industrial time is always at least coproduced by the media.  ‘Coverage’ — what is to be covered — is determined by criteria oriented toward producing surplus value.  Mass broadcasting is a machine to produce ready-made ideas, ‘cliches.’  Information must be ‘fresh’ and this explains why the ideal for all news organs is the elimination of delay in transmission time.”

To be sure, more than the logic of the medium is at play here, but it may be difficult and beside the point to parse out the logic of the medium from other factors.

The ability to eliminate the delay between event and transmission that characterized industrial time has been radically democratized by digital media.  We are all operating under these conditions now.  You may vaguely remember, by contrast, the time that elapsed between snapping a picture, getting it developed, and finally showing it to others.  That time has collapsed, not only for large news organizations, but for anyone with an Internet-enabled smart phone.  In the interest of creating catchy labels, perhaps we may call this not industrial time but Twitter time.  “Twitter” here is just a synecdoche for the ability to immediately capture and broadcast information, an ability that is now widely available.  My guess is that this capacity, admittedly used in various ways, will affect the sensibility that we label our “experience of time.”

Stiegler continues (with my apologies for subjecting you to the rather dense prose):

“With an effect of the real (of presence) resulting from the coincidence of the event and its seizure and with the real-time or ‘live’ transmission resulting from the coincidence of the event and its reception, a new experience of time, collective as well as individual, emerges.  This new time betokens an exit from the properly historical epoch, insofar as the latter is defined by an essentially deferred time — that is, by a constitutive opposition, posited in principle, between the narrative and that which is narrated.  This is why Pierre Nora can claim that the speed of transmission of analog and digital transmissions promotes ‘the immediate to historical status’:

‘Landing on the moon was the model of the modern event.  Its condition remained live retransmission by Telstar . . . . What is proper to the modern event is that it implies an immediately public scene, always accompanied by the reporter-spectator, who sees events taking place.  This ‘voyeurism’ gives to current events both their specificity with regard to history and their already historical feel as immediately out of the past.’

There is a lot to unpack in all of that.  We are all reporter-spectators now.  Deferred time, time between event and narration, is eclipsed. Everything is immediately “out of the past,” or, at least as I understand it, the whole of the past is collapsed into a moment that is not now.  The earthquake and tsunami in Japan, just two months past, might as well have taken place five years ago.  The killing of bin Laden, likewise, will very soon appear to be buried in the indiscriminate past.

Twitter as a medium, used to the point of fostering a habituated sensibility (but regardless of particularized uses), would seem to accelerate this economy of time and expand its province into private life.  It doesn’t create this economy of time, but it does heighten and reinforce its trajectory.  In fact, the relentless flow of the Twitter “timeline” (not an insignificant designation), or better, our effort to keep up with it and make sense of it, may be an apt metaphor for our overall experience of time.

All of this to say that while a medium or platform can be used variously and flexibly, it is not infinitely malleable; a certain underlying logic is more or less fixed and this logic has its own consequences.  Of course, none of this necessarily amounts to saying Twitter is “bad”, only to note that its use can have consequences.

Speaking of habit, I’m curious if anyone felt the urge to click the “1 New Tweet” image?


Twitterfication: More Complicated + Less New = No Interest

Seismic Activity or Media Coverage

On the Media’s recent program, “Turning Away,” focused on the spike in foreign news coverage following the devastation in Japan and the combat in Libya.  That spike, however, plateaued, and now foreign coverage in American journalism is again on the decline.  At least until the next crisis, of course.

This prompted some incisive, if somewhat disconcerting, observations from host Brooke Gladstone and her guests, Mark Jurkowitz and Steve Coll.  Here is Gladstone introducing the program:

BROOKE GLADSTONE: Mark Jurkowitz at the PEW Research Center’s Project for Excellence in Journalism says that a few weeks back Libya and Japan made up more than 40 percent of the news, an extraordinary number. But now, even as fresh horrors rain down on the people of Libya and Japan, the American media look elsewhere for leads.

Perhaps, says Jurkowitz, that’s because events out there have become both more complicated and less new, a lethal combination for coverage . . .

That last line struck me as being regrettably accurate.  More Complicated + Less New = Less Coverage.  And less coverage either reflects or engenders no interest.  I’m fairly certain that this equation has summed up the way American media works for some time now; Kierkegaard had already diagnosed the symptoms in the 19th century.  But I would also speculate that the dynamics of digital/social media have ratcheted up the logic the equation seeks to convey, exponentially perhaps.  Consider it the Twitterfication of the news cycle.  We can’t quite do complicated and sustained very well within the constraints of social media.

The following exchange also provided a helpful schema that rang true: the 12-day disaster editorial cycle.

BROOKE GLADSTONE: Steve Coll covered his fair share of natural disaster and war in his decades as foreign correspondent at The Washington Post, and he found that there is a template for many stories, no matter how harrowing. In his experience, earthquake and disaster coverage, in general, follow a 12-day editorial cycle. He witnessed it while covering an earthquake that killed tens of thousands of people in Iran.

The first few days are spent reporting breaking news and casualties and destruction. Around day five, the late miracle story in which search teams find an improbable survivor amidst the rubble. Day seven brings the interpretation of meaning story, with religious overtones. By day 12, it’s essentially buh-bye for now.

So in your mind run through the catastrophes and crises that have garnered significant media coverage over the last year or so and see if that does not neatly capture the way they were covered.  Wait, having a hard time remembering the catastrophes and crises of the last year?  Were you caught off guard, as I was, when we heard that it had been a year since the BP oil spill in the Gulf?  Vaguely remember something about floods in Australia?  Something happened in Tunisia recently, right?  It seems the logic of our media environment is precisely calibrated to induce forgetfulness.

After Coll expresses some surprise at how quickly we have lost sight of ongoing developments in Japan and Libya, Gladstone asks Coll, “Should we be worried about that?”

Coll is, perhaps justifiably, sardonic in response:

STEVE COLL: Well, we are a global power with military and diplomatic interests and deployments all over the world, and we expend tax dollars and put lives at risk all the time in complicated foreign environments, so yeah, it’s a problem. We ought to be thinking about these places on an empirical basis in greater depth than we sometimes do.

“If they’re going to be rude, I’ll be rude right back”

“Once more unto the breach, dear friends, once more,” or so David Carr seemed to say.  In his latest NY Times op-ed, “Keep Your Thumbs Still When I’m Talking To You,” Carr rallies the troops once more to the cause of civility in a digital age.  The battle has been raging for some time and some might think it already lost, but Carr brings us rumors from deep within the enemy’s citadel suggesting that even there the tide may be turning.

Alright, so the martial metaphor may be a bit overdone, but that is more or less how I experienced Carr’s piece.  On more than a few occasions over the last several months, I’ve written on the same theme and advocated an approach to digital media which preserves the dignity of the human person, particularly the in-the-flesh human persons in front of us.  I’ve done so mainly by seconding the work of Jaron Lanier (here and here), Rochelle Gurstein, and Gary Shteyngart, among others.  But it has been a while since I’ve addressed the matter, and I have to confess, there is a certain complacency that starts to set in.  One begins to question whether it is even worth the effort.  Perversely, one even begins to feel that it would be rude to point out the rudeness of those who will not give another human being their undivided attention or manage to take their calls out of public earshot.  Carr, however, gives the cause of civility one more public shot in the arm.

He begins his piece straightforwardly: “Add one more achievement to the digital revolution: It has made it fashionable to be rude.”  Nothing new here, of course.  Carr’s anecdotal evidence is drawn from his recent experience at South by Southwest Interactive.  There he witnessed all sorts of activities that would neatly fall into the category aptly described in the title of Sherry Turkle’s recent book, Alone Together.  It is the usual sort of thing: people distractedly gazing at their smart phones and tablets whether in the audience, waiting in line, or even participating on a panel.  In this particular piece, my favorite moment of bemused recognition came from Entertainment Weekly’s Anthony Breznican describing what happens after one person excuses themselves to check their phone at the dinner table:

“Instead of continuing with the conversation, we all take out our phones and check them in earnest,” he said. “For a few minutes everybody is typing away. A silence falls over the group and we all engage in a mass thumb-wrestling competition between man and little machine. Then the moment passes, the BlackBerrys and iPhones are reholstered, and we return to being humans again after a brief trance.”

Yet, there were also signs of awakening.  For example, in response to his own presentation, “I’m So Productive, I Never Get Anything Done,” Carr reports that,

The biggest reaction in the session by far came when Anthony De Rosa, a product manager and programmer at Reuters and a big presence on Twitter and Tumblr, said that mobile connectedness has eroded fundamental human courtesies.

“When people are out and they’re among other people they need to just put everything down,” he said. “It’s fine when you’re at home or at work when you’re distracted by things, but we need to give that respect to each other back.”

His words brought sudden and tumultuous applause. It was sort of a moment, given that we were sitting amid some of the most digitally devoted people in the hemisphere. Perhaps somewhere on the way to the merger of the online and offline world, we had all stepped across a line without knowing it.

Perhaps.

Lest we get too encouraged, Carr also tells of the earnest young man that came up to him after the talk to affirm the importance of “actual connection” while “casting sidelong glances at his iPhone while we talked.”  Carr is almost certainly right when he suggests that the young man didn’t even realize what he was doing.  The behavior is more or less habitual and thus just below the level of conscious awareness.

For some, however, the behavior is, in fact, quite conscious.  Carr mentions MG Siegler’s TechCrunch essay entitled “I Will Check My Phone at Dinner and You Will Deal With It.”  “This is the way the world works now,” Siegler brusquely informs us.  “We’re always connected and always on call. And some of us prefer it that way.”  Those are fighting words.  They are also words that almost evoke a wearied resignation on the part of those who, in fact, don’t prefer it that way.  This is the force of the rhetoric of inevitability: hear it and repeat it often enough and you start believing it.

Moreover, Carr notes that

… there is also a specific kind of narcissism that the social Web engenders.  By grooming and updating your various avatars, you are making sure you remain at the popular kid’s table. One of the more seductive data points in real-time media is what people think of you. The metrics of followers and retweets beget a kind of always-on day trading in the unstable currency of the self.

That is a nicely crafted and incisive metaphor right at the end, and, to borrow a line from David Foster Wallace, it all amounts to getting fed “food pellets of praise.”  I would go so far as to speculate that the issue is neurological, and I’m sure someone out there can refer me to a study that suggests a link between our social media interactions and the release of dopamine in the brain, or something along those lines.  (Here’s a start: “All those tweets, apps, updates may drain brain.”)

In any case, the lines are drawn once more.  The martial metaphors are in a sense already suggested by Carr in his closing lines.  Drawing on the observations of William Powers, he writes,

And therein lies the real problem. When someone you are trying to talk to ends up getting busy on a phone, the most natural response is not to scold, but to emulate. It’s mutually assured distraction.

Mutually assured distraction, of course, alludes to the Cold War-era doctrine of mutually assured destruction.  There may be more overlap between the two than even Carr intended; both are forms of madness in their own way.  And perhaps it is time for more aggressive tactics.  De Rosa, cited above, also wrote Carr the following:

I’m fine with people stepping aside to check something, but when I’m standing in front of someone and in the middle of my conversation they whip out their phone, I’ll just stop talking to them and walk away. If they’re going to be rude, I’ll be rude right back.

“Once more unto the breach …”