The Fog of Life: “Google,” Memory, and Thought

Last week a study suggesting that the ability to Google information is making it less likely that we commit information to memory garnered a decent amount of attention and discussion, including a few of my own thoughts in my last post.  In addition to writing a post on the topic, I did something that, for the sake of my sanity, I almost never do: I followed the comment threads on a few websites that had posted articles on the story.  That was an instructive experience, and it has led to a few observations, comments, and questions which I’ll list briefly.

  • Google functions as a synecdoche for the Internet in a way that no other company does. So when questions like “Is Google Making Us Stupid?” or “Is Google Ruining Our Memory?” are posed, what is really meant is more like “Is the Internet Making Us Stupid?”, etc.
  • Of course, “Google” is not an autonomous agent, but it has generated and made plausible a certain rhetoric that rather imprudently dismisses the need to remember.
  • People still get agitated by claims that the Internet is either bad or good for you.  Stories are framed in this way in the media, and discussion assumes this binary form.  Not much by way of nuance.
  • On the specific question of memory in relation to this study and the subsequent discussion, it is never quite clear what sort of memory is in view, although it appears that memory for facts or some variation of that is what most people are assuming in their comments.
  • The computer model of the brain is alive and well in people’s imagination.  How else could we explain the recurring claim that by offloading our memory to “Google” we are “freeing up space” in our memory so that our “processing” runs more efficiently?
  • Does anyone really believe that we, members of present society, are generally in danger of reaching the limits of our capacity for memory?
  • There is a concern that tying up memory on the retention of “trivial” facts will hamper our ability to perform higher order tasks such as critical and creative thinking.
  • “Trivial” is relative.  Phone numbers are often given as an example, but while knowing some obscure detail about the human cardiovascular system might be “trivial” to me, it wouldn’t be so to a cardiovascular surgeon in the midst of an operation.
  • Why are we opposing two forms of knowledge or “intelligence” anyway?  Aren’t most of the people who are able to think critically and creatively about a topic or discipline the same people who have attained a mastery of the details of that same topic or discipline? Isn’t remembering the foundation of knowing, or are not the two at least intimately related?
  • Realizing that total recall of all pertinent facts in most cases is too high a bar, wouldn’t it at least be helpful not to rhetorically oppose facts to thinking?
  • The denigration of memory for facts seems — be warned this is impressionistic — aligned with a slide toward an overarching cloud of vagueness settling over our experience.  Not simply the vagueness by comparison with print-disciplined speech that accompanies a return to orality, but a vagueness, distractedness, or inattentiveness about immediate experience in general.
  • Will we know nothing in particular because we know where to find everything in general?

On that last note, consider Elizabeth Spires’ poem, “A Memory of the Future,” published in The Atlantic, and make of it what you will:

I will say tree, not pine tree.
I will say flower, not forsythia.
I will see birds, many birds,
flying in four directions.

Then rock and cloud will be
lost. Spring will be lost.
And, most terribly,
your name will be lost.

I will revel in a world
no longer particular.
A world made vague,
as if by fog. But not fog.

Vaguely aware,
I will wander at will.
I will wade deeper
into wide water.

You’ll see me, there,
out by the horizon,
an old gray thing,
who finally knows

gray is the most beautiful color.

Offloaded Memory and Its Discontents (or, Why Life Isn’t a Game of Jeopardy)

It is not surprising to learn that we are taking the time to remember less and less, given the ubiquitous presence of the Internet and the consequent ability to “Google it” when you need it, whatever “it” happens to be.  A new study in the journal Science affirms what most of us have already witnessed or experienced.  According to one report on the study:

Columbia University psychologist Betsy Sparrow and her colleagues conducted a series of experiments, in which they found that people are less likely to remember information when they are aware of its availability on online search engines …

“Our brains rely on the Internet for memory in much the same way they rely on the memory of a friend, family member or co-worker. We remember less through knowing information itself than by knowing where the information can be found,” said Sparrow.

If you sense that something of significance is lost in the transition from internal memory to prosthetic memory, you may also find that it takes a little work to pinpoint and articulate that loss.  Nicholas Carr, commenting on the same study, also has reservations, and he puts them this way:

If a fact stored externally were the same as a memory of that fact stored in our mind, then the loss of internal memory wouldn’t much matter. But external storage and biological memory are not the same thing. When we form, or “consolidate,” a personal memory, we also form associations between that memory and other memories that are unique to ourselves and also indispensable to the development of deep, conceptual knowledge. The associations, moreover, continue to change with time, as we learn more and experience more. As Emerson understood, the essence of personal memory is not the discrete facts or experiences we store in our mind but “the cohesion” which ties all those facts and experiences together. What is the self but the unique pattern of that cohesion?

I’ve articulated some questions about off-loaded memory and the formation of identity in the past few months as well, along with thoughts on the relationship between internal memory and creativity.  Here I want to draw on an observation by Owen Barfield in Saving the Appearances.  Barfield is writing about perception when he notes:

I do not perceive any thing with my sense-organs alone, but with a great part of my whole human being. Thus, I may say, loosely, that I ‘hear a thrush singing’. But in strict truth all that I ever merely ‘hear’ — all that I ever hear simply by virtue of having ears — is sound. When I ‘hear a thrush singing’, I am hearing, not with my ears alone, but with all sorts of other things like  mental habits, memory, imagination, feeling and (to the extent at least that the act of attention involves it) will.  Of a man who merely heard in the first sense, it could meaningfully be said that ‘having ears’ (i. e. not being deaf) ‘he heard not’.

Barfield reminds us that our perception of reality is never merely a function of our senses.  Our perception, which in some respects is to say our interpretation of reality into meaningful experience, is grounded in, among other things, our memory, and certainly not our offloaded memory.  Offloaded memory is not ready-to-hand for our minds to use in their remarkable work of meaning-making.  Perception in this sense is impoverished by our willingness to offload what we might otherwise have internalized.

All of this leads me to ask, What assumptions are at play that make it immediately plausible for so many to believe that we can move from internalized memory to externalized memory without remainder?  It would seem, at least, that the ground was prepared by an earlier reduction of knowledge to information or data.  Only when we view knowledge as the mere aggregation of discrete bits of data can we believe that it makes little difference whether that data is stored in the mind or in a database.

We seem to be approaching knowledge as if life were a game of Jeopardy which is played well by merely being able to access trivial knowledge at random.  What is lost is the associational dimension of knowledge which constructs meaning and understanding by relating one thing to another and not merely by aggregating data.  This form of knowledge, which we might call metaphorical or analogical, allows us to experience life with the ability to “understand in light of”, to perceive through a rich store of knowledge and experience that allows us to see and make connections which richly texture and layer our experience of reality.

Augustine famously described memory as a vast storehouse in which there are treasures innumerable. It is our experience of life that is enriched by drawing on the treasures deposited into the storehouse of memory.

____________________________________________________

Update:  Also see “The Extended Mind — How Google Affects Our Memories” and Jonah Lehrer’s post on the study.

Place and Image, Death and Memory

The sociability of social networking sites such as Facebook is built upon an archive of memories.  Facebook trades in memory in at least two ways.  On the one hand, and perhaps especially for older users, Facebook is a platform that facilitates the search for memories.  Old friends and old flames can be found on Facebook.  Reconnecting with a high school buddy activates a surge of interconnected memories that lead to other long forgotten memories and so on.  On the other hand, and perhaps especially for younger users, Facebook also renders present experience already a depository of potential memories.  The future past impinges upon the present.  Our experience is conducted as a search for memories yet to be formed which will be archived on Facebook.  In a sense, then, we hunt for memories past and, paradoxically, for memories future.

This post is the second in a series situating Facebook within the memory theater/arts of memory tradition.  In the first post I set the stage by describing how social networking sites and internet enabled smart phones have constituted experience as a field of potential memories.  I also suggested that how we store and access our memories makes a difference.  The cultural and personal significance of memory is not a static category of human nature. Memory and its significance evolve over time, often in response to changing technologies.  So the question, then, is something like this:  What difference does it make, personally and culturally, that Facebook has become such a prominent mode of memory? 

In order to explore that question, I’ll delve briefly into the history of the art of memory, a set of memory practices with which, I believe, Facebook shares interesting similarities.  But as promised at the end of the last post, we’ll start with a story.

Spatiality, images, and death have long been woven together in the complex history of remembering.  Each appears prominently in the founding myth of what Frances Yates has called the “art of memory” as recounted by Cicero in his De oratore. According to the story, the poet Simonides of Ceos was contracted by Scopas, a Thessalian nobleman, to compose a poem in his honor.  To the nobleman’s chagrin, Simonides devoted half of his oration to the praise of the gods Castor and Pollux.  Feeling himself cheated out of half of the honor, Scopas brusquely paid Simonides only half the agreed upon fee and told him to seek the rest from the twin gods.  Not long afterward that same evening, Simonides was summoned from the banqueting table by news that two young men were calling for him at the door.  Simonides sought the two callers, but found no one.  While he was out of the house, however, the roof caved in killing all of those gathered around the table including Scopas. As Yates puts it, “The invisible callers, Castor and Pollux, had handsomely paid for their share in the panegyric by drawing Simonides away from the banquet just before the crash.”

Cicero

The bodies of the victims were so disfigured by the manner of death that they were rendered unidentifiable even by family and friends.  Simonides, however, found that he was able to recall where each person was seated around the table and in this way he identified each body.  This led Simonides to the realization that place and image were the keys to memory, and in this case, also a means of preserving identity through the calamity of death.  In Cicero’s words,

[Simonides] inferred that persons desiring to train [their memory] must select places and form mental images of the things they wish to remember and store those images in the places, so that the order of the places will preserve the order of the things, and the images of the things will denote the things themselves, and we shall employ the places and images respectively as a wax writing-tablet and the letters written on it.

Cicero is one of three classical sources on the principles of artificial memory that evolved in the ancient world as a component of rhetorical training.  The other two sources are Quintilian’s Institutio oratoria and the anonymous Ad Herennium.  It is through the Ad Herennium, mistakenly attributed to Cicero, that the art of memory migrates into Medieval culture where it is eventually assimilated into the field of ethics.  Cicero’s allusion to the wax-writing table, however, reminds us that discussion of memory in the ancient world was not limited to the rhetorical schools.  Memory as a block of wax upon which we make impressions is a metaphor attributed to Socrates in Plato’s Theaetetus where it appears as a gift of Mnemosyne, the mother of the muses:

Imagine, then, for the sake of argument, that our minds contain a block of wax, which in this or that individual may be larger or smaller, and composed of wax that is comparatively pure or muddy, and harder in some, softer in others, and sometimes of just the right consistency.

Let us call it the gift of the Muses’ mother, Memory, and say that whenever we wish to remember something we see or hear or conceive in our own minds, we hold this wax under the perceptions or ideas and imprint them on it as we might stamp the impressions of a seal ring.  Whatever is so imprinted we remember and know so long as the image remains; whatever is rubbed out or has not succeeded in leaving an impression we have forgotten and do not know.

Plato and Aristotle in Raphael’s “School of Athens”

The Platonic understanding of memory was grounded in a metaphysic and epistemology which located the ability to apprehend truth in an act of recollection.  Plato believed that the highest forms of knowledge were not derived from sense experience, but were first apprehended by the soul in a pre-existent state and remain imprinted deep in a person’s memory.  Truth consists in matching the sensible experience of physical reality to the imprint of eternal Forms or Ideas whose images or imprints reside in memory.  Consequently the chief aim of education is the remembering of these Ideas and this aim is principally attained through “dialectical enquiry,” a process, modeled by Plato’s dialogs, by which a student may arrive at a remembering of the Ideas.

At this point, we should notice that the anteriority, or “pastness,” of the knowledge in question is, strictly speaking, incidental.  What is important is the presence of the absent Idea or Form.  It is to evoke the presence of this absence that remembering is deployed.  It is the presence of eternal Ideas that secures the apprehension of truth, goodness, or beauty in the present.  Locating the memory within the span of time past does not bear upon its value which rests in its being possessed as a model against which to measure experience.

Paul Ricoeur, in Memory, History, Forgetting, begins his consideration of the heritage of Greek reflections on memory with the following observation:

Socratic philosophy bequeathed to us two rival and complementary topoi on this subject, one Platonic, the other Aristotelian.  The first, centered on the theme of the eikōn [image], speaks of the present representation of an absent thing; it argues implicitly for enclosing the problematic of memory within that of imagination.  The second, centered on the theme of the representation of a thing formerly perceived, acquired, or learned, argues for including the problematic of the image within that of remembering.

As he goes on to note, from these two framings of the problematic of memory “we can never completely extricate ourselves.”

Reflecting for just a moment on the nature of our own memories it is not difficult to see why this might be the case.  If we remember our mother, for example, we may do so either by contemplating some idealized image of her in our mind’s eye or else by recollecting a moment from our shared past.  In both cases we may be said to be remembering our mother, but the memories differ along the Platonic/Aristotelian divide suggested by Ricoeur.  In the former case I remember her in a way that seeks her presence without reference to time past; in the latter, I remember her in a way that situates her chronologically in the past.

At this point, I’m sure it seems that we’ve wandered a bit from the art of memory and farther still from social networking sites.  There is a method to this madness, but demonstrating that will have to wait for the next post.  Already, I am pushing the limits of acceptable blog post length.

Looking forward to the next post in this series, then, here are the tasks that remain:

  • Exploring memory as an index of desire.
  • Setting the art of memory tradition, and Facebook, within Ricoeur’s schema.
  • Asking what difference all of this makes.

Social Media and the Arts of Memory

UPDATE: A complete and updated form of the essay begun here is now available here.

Early in The Art of Memory, Frances Yates pauses to envision a “forgotten social habit” of the classical world.  She invites us to wonder, “Who is that man moving slowly in the lonely building, stopping at intervals with an intent face?”  That man, Yates tells us, is a “rhetoric student forming a set of memory loci.”  The rhetoric student would have been creating the architecture of a mental space into which they would then place vivid images imaginatively designed to recollect the themes or words of a prepared oration.  While delivering the oration, the rhetor would navigate the mental space coming upon each carefully placed image which triggered their memory accordingly.  This work of designing mental space and populating the space with striking images followed the prescriptions of the artificial memory techniques widely practiced in classical antiquity.
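
For readers who think better in diagrams than in Thessalian banquet halls, the practice Yates describes can be sketched as a simple data structure: an ordered sequence of places, each holding one vivid image, walked in order to recover the points of the oration in order.  This is only an illustrative sketch of the classical technique; the loci and images below are my own invention, not drawn from any ancient source.

```python
# A minimal sketch of the method of loci: an ordered walk through
# imagined places, each holding a striking image that stands for a
# point in the speech. The order of places preserves the order of
# things, as Cicero has Simonides infer.

from collections import OrderedDict

def build_memory_palace(loci, images):
    """Pair each locus (place) with the image stored there,
    preserving the order of the walk."""
    if len(loci) != len(images):
        raise ValueError("each locus holds exactly one image")
    return OrderedDict(zip(loci, images))

def recite(palace):
    """Walk the palace in order, yielding each stored image in turn."""
    for locus, image in palace.items():
        yield f"At the {locus}, I see {image}."

# Hypothetical example: three points of an oration, three places.
palace = build_memory_palace(
    ["entrance", "atrium", "peristyle"],
    ["a blazing anchor (naval policy)",
     "a wounded lion (courage in war)",
     "golden scales (justice)"],
)
for line in recite(palace):
    print(line)
```

The point of the sketch is the invariant Cicero names: because the places are traversed in a fixed order, recalling the walk recalls the sequence of points, while each image does the work of denoting the thing itself.
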

What if, however, we updated Yates’ scene by setting it in the present?  The scene would be instantly recognizable as long as we equipped our imagined person with a cell phone.  The stopping at intervals and the intent face would correspond to any of the multiple uses to which an Internet-enabled smart phone may be put:  reading or sending a text message, downloading songs, taking or sending pictures and video, updating social media profiles, or finding directions with GPS, to name but a few.  What is striking is how often these activities would, like that of the ancient rhetor, involve the work of memory.  Much of what cell phones are increasingly used for has very little to do with making a phone call, after all. In fact, one could argue that the calling feature of phones is becoming largely irrelevant.  Cell phones are more likely to be used to access the Internet, send a text message, take a picture, or film a video.  Given these capabilities, cell phones have become prosthetic memory devices; to lose a cell phone would be to induce a state of partial amnesia.  Or it may be better to say it would induce a fear of future amnesia, since our ability to approach the present as a field of potential memories would be undermined.

Social networking sites (SNS) are of special interest because of the way in which they explicitly trade in memory.  As Joanne Garde-Hansen asks in “MyMemories?:  Personal Digital Archive Fever and Facebook,” “If memory and identity can be seen as coterminous and SNSs involve literally digitising ones’ self into being then what is at stake when memories and identities are practiced on Facebook?”

She goes on to add,

It goes without saying that the allure of the site is in its drawing together in one place memory practices: creating photograph albums, sharing photographs, messaging, joining groups and alumni memberships, making ‘friends’ and staying in touch with family.

It would be fair to acknowledge that SNS such as Facebook traffic in more than the allure of memory practices. Nonetheless, the production, maintenance, and retrieval of memory is integral to the practices deployed on SNSs.

Following Jacques Derrida, Garde-Hansen considers Facebook as an instance of the archiving archive. Thus, she points out, the architecture of a SNS such as Facebook is not neutral with respect to the memories it archives.  As Derrida observed,

… the technical structure of the archiving archive also determines the structure of the archivable content even in its very coming into existence and in its relationship to the future.  The archivization produces as much as it records the event.

Garde-Hansen also draws SNSs into the conflict between database and narrative staged by Manovich in The Language of New Media.  In her view, the most significant aspect of Manovich’s analysis of new media for SNSs is the comparison he draws between the visual organization of new media interfaces and spatial montage.  “Manovich’s emphasis upon spatial narrative is,” according to Garde-Hansen, “extremely pertinent to thinking through the emergence of SNSs and how these sites remediate personal and collective memory.” Framed in this way, memory as spatial montage challenges “the rise and dominance of history,” the “power of the written word” to order the past temporally, and the “twentieth century’s emphasis upon the single, unadulterated image (think cinema).”

Derrida’s insight suggests that given the way the architecture of an archive already determines what can in fact be archived, the future record of the past is already impinging upon the present.  Or, put otherwise, the sorts of memories we are able to generate with social media may already be directing our interactions in the present.  (For an insightful discussion of this point anchored on an analysis of faux-vintage photography, see Nathan Jurgenson’s “The Faux-Vintage Photo.”)  Drawing on Manovich’s database/narrative opposition suggests that the visual/spatial mode of ordering memories on SNSs potentially shifts how meaning is derived from memory and how we construct the self.  We’ll explore both of these considerations a little further in subsequent posts.

Returning to the scene suggested by Yates, however, we may also consider SNSs such as Facebook as instances of new artificial memory spaces constructed to supplement and augment the natural memory.  Already in the artificial memory tradition we have memory rendered spatially and visually in a manner that anticipates the representation and organization of memory in SNSs.  Situating SNSs within the long history of spatial and visual memory also affords us the opportunity to consider SNSs in the context of a complex and rich tradition of philosophical reflection.

What emerges is a history of memory practices that alternate between a Platonic focus on memory as the presence of an absence and an Aristotelian emphasis on memory as the record of the past.  There are several thematic threads that weave this story together including the opposition of internal memory to memory supported by inscription, the connection between memory and imagination, memory as the index of desire, the related tensions between space and time and databases and narratives, and the relationship of memory to identity.  Yet for all the complexity those themes introduce, we will begin our next post in this series with a story.

___________________________________________________

Joanne Garde-Hansen’s article is found in Save As… Digital Memories.

Digital Legacies, Online Souls, and the Burdens of Remembering

Here is a story that came out of Melrose, Massachusetts yesterday:

According to the Boston Globe, no fewer than 11 Melrose varsity athletes were recently identified in illegal possession of alcohol or tobacco in photos which first surfaced on Facebook. The photos were taken from the site by a concerned parent, transferred to a thumb drive and submitted to the school’s administration as proof of inappropriate actions by the student body.

By now these stories are not too surprising; the only surprise may be in the fact that people are still being this careless with social media and incriminating or embarrassing images.  But then we realize that we are still learning to cope with what our media has enabled; to paraphrase W. H. Auden, we clearly cannot understand what we can clearly do.  Writing in New Scientist, Sumit Paul-Choudhury recently observed,

Right now, though, we are living through a truly unique period in human history. We have learned how to create vast digital legacies but we don’t yet know how to tidy them up for our successors. The generations that went before us left no digital trace, and the ones that go after might leave nothing but sanitised “authorised biographies”. We will be defined through piecemeal and haphazard collections of our finest and foulest moments.

Paul-Choudhury’s article, “Digital legacy: The fate of your online soul,” explores how we are learning to manage the collection of digital data which becomes our online legacy, or what Hans-Peter Brondmo has evocatively called our “digital soul.”  Paul-Choudhury is, perhaps, more suited than most to write about this issue since six years ago he created a site to commemorate the life of his wife, who passed away after a long battle with cancer.  Then it may have seemed a bit strange to wonder about one’s digital legacy; by now there are conferences and organizations devoted to the matter and plenty of media attention as well.

In his article, Paul-Choudhury argues that we have not quite figured out how to manage the wealth of information that will survive us, and he identifies two principal approaches that have evolved in recent years:  the “preservationists” and the “deletionists.”  The names, I suspect, are self-explanatory.  Some argue that we should seek to save and preserve every last bit of data that we can, and others suggest that the more humane approach involves learning how to allow the possibility of forgetting.

The most prominent name on either side is probably Viktor Mayer-Schonberger (this post, incidentally, is apparently evolving a hyphenated name theme), the author of the 2009 book Delete: The Virtue of Forgetting in the Digital Age, which argued for the wisdom of creating tools that enable the gradual erasure of online data over time.  He recommends that digital data be constructed so as to “digitally rust” to allow for a graceful forgetting that mimics the ordinary workings of human memory.  In his view, the problem is that we are now remembering too much, and this finally becomes unbearable.  It reminds me of Paul Ricoeur’s suggestion that while in the past learning the arts of memory was a prized skill, we may now need to cultivate an art of forgetting.
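
To make the “digital rust” proposal concrete, here is a minimal sketch of one way such a tool might work: each item is given an expiration date at the moment of creation, and a periodic sweep quietly forgets whatever has lapsed.  The class name, the fixed-lifespan policy, and the API below are my own illustration of the general idea, not an implementation drawn from Mayer-Schonberger’s book.

```python
# A hedged sketch of "digital rust": data that carries its own
# expiry date and is gracefully forgotten once that date passes,
# loosely mimicking the decay of unrehearsed human memory.

from datetime import datetime, timedelta

class RustingArchive:
    def __init__(self):
        # Each entry is a (content, expires_at) pair.
        self._items = []

    def remember(self, content, lifespan_days, now=None):
        """Store content with an expiry date chosen at creation time."""
        now = now or datetime.now()
        expires_at = now + timedelta(days=lifespan_days)
        self._items.append((content, expires_at))

    def sweep(self, now=None):
        """Forget anything whose expiry date has passed."""
        now = now or datetime.now()
        self._items = [(c, t) for (c, t) in self._items if t > now]

    def recall(self):
        """Return everything the archive still remembers."""
        return [c for (c, _) in self._items]
```

One could imagine richer decay policies (lifespans extended each time an item is revisited, say, so that rehearsed memories persist), which would bring the sketch closer to the way human remembering actually favors what we return to.
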

As we’ve noted before, however, others have pointed out that digital data is not nearly so resilient and durable as we are led to believe.  In fact, unfathomably large amounts of data are lost each year.  Paul-Choudhury cites Marc Weber of the Computer History Museum who observes that, “Digital records are more like an oral tradition than traditional documents, if you don’t copy them regularly, they simply disappear.” Web sites go dark, files are corrupted, servers fail — all of this does render our digital legacy a bit more fragile than we might imagine.

On the whole, however, it does seem as if most of us are generating much more by way of relatively permanent and accessible memories and remembrances than most of our forebears, except perhaps the most assiduous diarists, would have been capable of producing.  Of course, the contrast with the diarist may be precisely the issue at the heart of the debate between the preservationists and the deletionists.  The diarist filters and orders their memories; much of what we are storing is not quite so filtered and ordered.  Perhaps more importantly, the diary was private.  Our digital memories are mostly public.

On the one hand, this enhances the memories.  For example, Paul-Choudhury notes that his photos on Flickr “are surrounded by a rich history of social interactions, from groups to which I belong to comments that friends have left about my photos.  That feels just as much part of my digital soul as the images themselves.”  On the other hand, it is the public nature of the memories that renders them susceptible to malicious use by others and makes it more difficult to curate our online identity.

While it is intriguing to think about the digital legacy that will survive us, it may be more pressing to think about the significance of these issues for us while we remain in the land of the living.  Writing about the nineteenth century in Present Past, Richard Terdiman identified what he called a “crisis of memory” arising from the temporal dislocations associated with the French Revolution.  Memory or history no longer seemed a continuous flow into which modern people could easily place themselves.  Perhaps we may begin to think about a contemporary crisis of memory that arises not from our dislocation from tradition, but from our dislocation from our memories themselves.  Our memory may be at once too present and not present enough.  Now it is not only that we are unable to integrate our own lives with cultural memory; we may be having a hard time integrating our own memories with our experience of life.

It is true that much of our memory has always to some degree been external to us, but sometimes degrees of difference amount to qualitative change.  We have been busily uploading our memories over the last several years, creating an artificial memory of truly unprecedented scope.  The consequences of this exponential growth of memory are as yet unclear, but it may be that we are creating for ourselves a year of cruel Aprils.

April is the cruelest month, breeding
Lilacs out of the dead land, mixing
Memory and desire, stirring
Dull roots with spring rains.