“Life without memory is no life at all”

Not too long ago I found myself unable to recall an element of a story from my past.  It was a story I have narrated many times since it occurred nearly 15 years ago.  The event was not insignificant, and what I could no longer remember were my own words.  I could picture the scene.  I could feel what I said.  The words, however, seemed slurred, as if they were on a tape that was being played too slowly.

What a curious thing memory is.  There is so much of each day that we do not remember.  But then there are these episodes that we can revisit repeatedly; many of them, in my case anyhow, so very random, of so little significance.  Yet they stick, they linger, they creep into consciousness for no obvious reason.  And then there are those memories that are like so many beads we string together on the narrative thread of our emplotted lives.  Even these, it seems, are not as durable as we might hope.

Anthony Doerr opens Memory Wall with this reflection from Luis Buñuel’s autobiography, My Last Sigh:

You have to begin to lose your memory, if only in bits and pieces, to realize that memory is what makes our lives.  Life without memory is no life at all, just as an intelligence without the possibility of expression is not really an intelligence.  Our memory is our coherence, our reason, our feeling, even our action.  Without it, we are nothing.

Pattern Recognition: The Genius of our Time?

What counts for genius in our times?  Is it the same as what has always counted for genius?  Or, are there shifting criteria that reflect the priorities and affordances of a particular age?

Mary Carruthers  opens The Book of Memory, her study of  memory in medieval culture, with a contrast between Thomas Aquinas and Albert Einstein.  Both were regarded as the outstanding intellects of their era; each elicited enthusiastic, wonder-struck praise from his contemporaries.  Carruthers cites a letter by an associate of each man as typical of the praise that each received.  Summing up both she writes:

Of Einstein: ingenuity, intricate reasoning, originality, imagination, essentially new ideas coupled with the notion that to achieve truth one must err of necessity, deep devotion to and understanding of physics, obstinacy, vital force, single-minded concentration, solitude.  Of Thomas Aquinas: subtlety and brilliance of intellect, original discoveries coupled with deep understanding of Scripture, memory, nothing forgotten and knowledge ever-increasing, special grace, inward recourse, single-minded concentration, intense recollection, solitude.

Carruthers goes on to note how similar the lists of qualities are “in terms of what they needed for their compositional activity (activity of thought), the social isolation required by each individual, and what is perceived to be the remarkable subtlety, originality, and understanding of the product of such reasoning.”  The difference, appropriate to the object of Carruthers’ study, lies in the relationship between memory and the imagination.

Carruthers is eager to remind us that “human beings did not suddenly acquire imagination and intuition with Coleridge, having previously been poor clods.”  But there is a difference in the way these qualities were understood:

The difference is that whereas now geniuses are said to have creative imagination which they express in intricate reasoning and original discovery, in earlier times they were said to have richly retentive memories, which they expressed in intricate reasoning and original discovery.

This latter perspective, the earlier medieval perspective, is not too far removed from the connections between memory and creativity drawn by Jim Holt based on the experiences of the French mathematician Henri Poincaré.  We might also note that the changing status of memory within the ecology of genius is owed at least in part to the evolution of technologies which supplement the memory.  Aquinas, working in a culture for which books were still relatively scarce, would have needed a remarkably retentive memory to continue working with the knowledge he acquired through reading.  This becomes less of a priority for post-Gutenberg society.

Mostly, however, Carruthers’ comparison suggested to me the question of what might count for genius in our own time.  We are not nearly so far removed from Einstein as Einstein was from Aquinas, but a good deal has changed nonetheless which makes the question at least plausible.  I suspect that, as was the case between Aquinas and Einstein, there will be a good deal of continuity, a kind of base-line of genius perhaps.  But that baseline makes the shifts in emphasis all the more telling.

I don’t have a particular model for contemporary genius in mind, so this is entirely speculative, but I wonder if today, or in coming years, we might not transfer some of the wonder previously elicited by memory and imagination to something like rapid pattern recognition.  I realize there is significant overlap within these categories.  Just as memory and imagination are related in important ways, so pattern recognition is implicit in both and has always been an important ability.  So again, it is a matter of emphasis.  But it seems to me that the ability to rapidly recognize, or even generate, meaningful patterns from an undifferentiated flow of information may be the characteristic of intelligence most suited to our times.

In Aquinas’ day the emphasis was on the memory needed to retain knowledge that was relatively scarce.  In Einstein’s time the emphasis was on the ability to jump out of established patterns of thought generated by abundant, but sometimes static, knowledge.  In our day we are overwhelmed by a torrent of easily available and ever-shifting information; we won’t quite say knowledge.  Under these conditions memory loses its pride of place, as does perhaps imagination.  However, the ability to draw together disparate pieces of information, or to connect seemingly unrelated points of data into a meaningful pattern that we might count as knowledge, now becomes a dimension of human intelligence that may inspire comparable awe and admiration from a culture drowning in noise.

Perhaps an analogy to wrap up:  Think of the constellations as instances of pattern recognition.  Lots of data points against the night sky drawn into patterns that are meaningful, useful, and beautiful to human beings.  For Aquinas the stars of knowledge might appear but for a moment, and to recognize the pattern he had to hold their location in memory as he learned and remembered the location of other stars.  For Einstein many more stars had appeared, and they remained steadily in his intellectual field of vision; seeing new patterns where old ones had been established was his challenge.

Today we might say that the night sky is not only full to overflowing, but the configuration is constantly shifting.  Our task is not necessarily to remember the location of a few fading stars, nor is it to see new patterns in a fuller but steady field.  It is to constantly recognize new and possibly temporary patterns in a full and flickering field of information.  Those who are able to do this most effectively may garner the kind of respect earlier reserved for the memory of Aquinas and the imagination of Einstein.

For a different, but I think related take on a new form of thinking for our age that draws on the imagery of constellations I encourage you to take a look at this thread at Snark Market.

Cell Phone Memories

Much of what we use our cell phones for has very little to do with making a phone call.  In fact, one could argue that the calling feature of our phones is becoming largely irrelevant.  Our cell phones are more likely to be used to access the Internet, send a text message, or take a picture.  Our cell phones have also become memory devices.  Most of us have taken a picture of something we want to remember, a trivial thing perhaps, like the name of a book we want to buy later.  The picture is a mental note, except that it is not in the brain.  We set alarms to remind us of meetings, we’ve long since stopped remembering phone numbers, we text directions and reminders to ourselves, we record the baby’s first words, and the list goes on.  Our cell phones have become an integral part of our memory; to lose them is to find ourselves in a state of partial amnesia.

In a 2007 study, 180 students at London South Bank University between the ages of 19 and 41 were asked to express in one word how they felt when they were without their cell phones.  The responses, reported by Anna Reading in “Memobilia:  The Mobile Phone and the Emergence of Wearable Memories,” included:  uncomfortable, isolated, lost, lonely, disconnected, unsafe, insecure, unguarded, naked, and without time.   This language suggested to Reading that cell phones more or less functioned as an extension of the self and their absence was experienced as the “loss of part of the ‘me’ or part of themselves.”

This, however, was only one side of the story.  Other respondents also used the words free, more private, and peaceful.  This suggested that cell phones also had the effect of generating a panoptic claustrophobia, or a sense of being always available/never alone.  Taken as a whole the study suggests a rather ambivalent relationship with the access and availability cell phones enable.

As the title of her article implies, however, Reading’s focus is on the cell phone as a memory device, and one that is wearable, portable, and social.  The cell phone’s wearability renders it an extension of the self carried unobtrusively on the body.  Its portability constitutes almost any environment as a field of memories waiting to be captured.  Finally, its sociability (my word for her “meme-like qualities,” essentially its connectivity) allows for the instant publication of memories to selected others or to a more indiscriminate audience via the Internet, particularly social media sites.

This last quality, sociability, blurs the traditional boundary between private and public memory and creates what José van Dijck has termed “mediated memory.”  Mediated memory is simultaneously individual and collective.  Every image or video captured, say, or every note taken, is ready to be publicized or shared.  We can’t really imagine the sensibility that led to Roland Barthes’ refusal to include a picture of his mother as a young girl in his book about photography, Camera Lucida.

The second quality, portability, has the interesting effect of making memory something hunted and taken, so to speak, rather than something that is spontaneously generated.  This same move, however, creates a certain detachment from immediate (unmediated) experience and, one could argue, a certain artificiality as well.  This is not much different from the effect of the camera, especially the digital camera.  We can all remember being on vacation and thinking everything we saw needed to be captured with a photograph, so much so that we didn’t experience the vacation so much as we documented it.  In such a case my memories are not of time past, but very narrowly of the images I captured.  We don’t always carry a digital camera, however; we always have our cell phones.

Cell phones are by now more or less a taken for granted feature of contemporary life.  They’ve almost blended into the unnoticed and unremarkable background of experience.  It is from this position of ubiquity and transparency that any technology is most likely to have a significant effect on the shape of daily life and our own experience of reality.

_________________________________________________________

Reading’s article can be found in Save As… Digital Memories.

Social Memory, Social Order

“Concerning social memory in particular, we may note that images of the past commonly legitimate a present social order.  It is an implicit rule that participants in any social order must presuppose a shared memory.  To the extent that their memories of a society’s past diverge, to that extent its members can share neither experiences nor assumptions.  The effect is seen perhaps most obviously when communication across generations is impeded by different sets of memories.  Across generations, different sets of memories, frequently in the shape of implicit background narratives, will encounter each other; so that, although physically present to one another in a particular setting, the different generations may remain mentally and emotionally insulated, the memories of one generation locked irretrievably, as it were, in the brains and bodies of that generation …

… images of the past and recollected knowledge of the past … are conveyed and sustained by (more or less ritual) performances …

I believe, furthermore, that the solution to the question posed above — how is the memory of groups conveyed and sustained? — involves bringing these two things (recollection and bodies) together …

If there is such a thing as social memory … we are likely to find it in commemorative ceremonies; but commemorative ceremonies prove to be commemorative only in so far as they are performative; performativity cannot be thought without a concept of habit; and habit cannot be thought without a notion of bodily automatisms.”

— Paul Connerton, How Societies Remember, 3-5.

Connerton’s observations, further developed throughout the rest of the book, raise interesting questions about the kind of social order that the personalization and digitization of memory yields.  If Connerton is correct in his claim that a social order rests upon shared memory and that this memory is fundamentally embodied in a quasi-liturgical mode, what becomes of the social order when the memories we most obviously sustain are strictly personal and digitized?

As Connerton also notes in his introduction, this is not merely a technical question, it is also a political question.  If social order hinges on social memory, then, to paraphrase Alasdair MacIntyre, it is worth asking, “Whose memory, which order?”

Social Media, Social Memory: Remembering with Facebook

I’m a casual Facebook user.  I have a profile, I’ve got friends, I occasionally check in.  I rarely post anything other than links to this blog (shameless self-promotion), I don’t post status updates, I haven’t uploaded a picture in over a year.  Clearly, I’m not heavily invested and I’ve posted more than a few critical remarks about Facebook’s hegemony and its consequences on this blog.  But recently I’ve been thinking about Facebook in relationship to memory, memory being a recurring theme of late.

For example, in light of the research article I summarized yesterday, “Is Memory in the Brain? Remembering as Social Behavior”, one could view Facebook as a form of social remembering.  Rather than reminiscing in person, we have asynchronous reminiscences with friends, past and present, often centered on posted photographs.  We might even view tagging photos as a kind of social remembering, a collective curating of shared memories.  Often a very old photograph will be posted by one of a circle of friends, the other friends will be tagged, and a round of reminiscing will follow in the comments.

One other analogy that I’ve been toying with is Facebook as memory theater.  I’ve mentioned memory theaters in at least a couple of past posts (here and here); the basic idea is that one constructs an imagined space in the mind (similar to the work of the Architect in the film Inception, only you’re awake) and then populates the space with images that stand in for certain ideas, people, words, or whatever else you want to remember.  The theory is that we remember images and places better than we do abstract ideas or concepts.  During the Renaissance these mental constructs were sometimes externalized in built structures that housed all sorts of artifacts visually representing the store of human knowledge.

Perhaps it is a stretch, but Facebook seems to function in some respects as a kind of externalized memory theater.  Instead of storing speeches or knowledge of the world, it is used to store autobiographical memory.  The architecture of the application is the constructed space, and profiles are like the images kept in its places, each profile carrying with it by association a trove of particular memories.  Most people report the reconnection with an old friend from childhood as one of the joys of Facebook.  While certainly some of these reconnections lead to renewed and sustained contact, most, I imagine, do not.  We exchange a message or two, we look over their life as it is now, and then we don’t really keep in touch any better than we used to.  But the memories have been activated, and now their profile takes its place in our memory theater, happily recalling those same memories whenever we like.  The profile is not the friend, of course; it simply becomes a placeholder for a particular set of memories.

Facebook taps into more than one aspect of our psychology.  I have often explained its appeal as a function of our desire to be noticed, to receive attention; and surely this is part of the mix.  Lately Facebook’s role in the political sphere has been receiving a good deal of attention.  But it may be that its trade in our memories gives Facebook its uncanny persistence.  Increasingly we hear people taking issue with Facebook’s privacy protocols or otherwise complaining about the pressures of always-on social media.  Not too long ago, I noted the grumblings over Facebook’s bid to become the ambient background of the Internet and Zuckerberg’s disingenuous push for online “integrity.”  Recent studies have also drawn attention to the potentially negative effect of Facebook on psychological well-being, particularly for women.  But for all of this, most people struggle to kill their accounts permanently.  Like a bad high school romance, we break up with Facebook, only to flirt and make up, and then break up again.

This begins to make sense when we realize that Facebook has become a prosthetic of our memory.  But it is not just a prosthetic of memory in general (a simple list on a scrap of paper is that much); it is a prosthetic of our autobiographical memory.  It’s a part of our identity, and it is very difficult to kill off a part of one’s self.

One last thought, only a suggestive one at that: it is one thing to artificially condition one’s memory to store up vast amounts of information about the world or large chunks of poetry; it is quite another to artificially store up one’s autobiographical memory.  Our technology has made the storage of memory cheap and easy, but there is something to be said for forgetting.  The artificial extension of autobiographical memory involves us in some of the more complex regions of human psychology and personality.  We enter into the realm of mourning, catharsis, obsession, fantasy, and more.  We might consider as well that healing and forgetting very often go hand in hand.  In any case, we have a good deal to contemplate.

“They tell, and here is the enigma, that those consulting the oracle of Trophonios in Boeotia found there two springs and were supposed to drink from each, from the spring of memory and from the spring of forgetting.”

— Jacques Derrida, Memoires for Paul de Man

__________________________________________________________

If this post was of interest, you may also want to consider Social Media and the Arts of Memory.