How to Think About Memory and Technology

I suppose it is the case that we derive some pleasure from imagining ourselves to be part of a beleaguered but noble minority. This may explain why a techno-enthusiast finds it necessary to attack dystopian science fiction on the grounds that it is making us all fear technology, while I find that very notion ludicrous.

Likewise, Salma Noreen closes her discussion of the internet’s effect on memory with the following counsel: “Rather than worrying about what we have lost, perhaps we need to focus on what we have gained.” I find that a curious note on which to close because I tend to think that we are not sufficiently concerned about what we have lost or what we may be losing as we steam full speed ahead into our technological futures. But perhaps I also am not immune to the consolations of belonging to an imagined beleaguered community of my own.

So which is it? Are we a society of techno-skeptics with brave, intrepid techno-enthusiasts on the fringes stiffening our resolve to embrace the happy technological future that can be ours for the taking? Or are we a society of techno-enthusiasts for whom the warnings of the few techno-skeptics are nothing more than a distant echo from an ever-receding past?

I suspect the latter is closer to the truth, but you can tell me how things look from where you’re standing.

My main concern is to look more closely at Noreen’s discussion of memory, which is a topic of abiding interest to me. “What anthropologists distinguish as ‘cultures,’” Ivan Illich wrote, “the historian of mental spaces might distinguish as different ‘memories.'” And I rather think he was right. Along similar lines, and in the early 1970s, George Steiner lamented, “The catastrophic decline of memorization in our own modern education and adult resources is one of the crucial, though as yet little understood, symptoms of an afterculture.” We’ll come back to more of what Steiner had to say a bit further on, but first let’s consider Noreen’s article.

She mentions two studies as a foil to her eventual conclusion. The first suggests that “the internet is leading to ‘digital amnesia’, where individuals are no longer able to retain information as a result of storing information on a digital device,” and the second that “relying on digital devices to remember information is impairing our own memory systems.”

“But,” Noreen counsels her readers, “before we mourn this apparent loss of memory, more recent studies suggest that we may be adapting.” And in what, exactly, does this adaptation consist? Noreen summarizes it this way: “Technology has changed the way we organise information so that we only remember details which are no longer available, and prioritise the location of information over the content itself.”

This conclusion seems to me banal, which is not to say that it is incorrect. It amounts to saying that we will not remember what we do not believe we need to remember and that, when we have outsourced our memory, we will take some care to learn how we might access it in the future.

Of course, when the Google myth dominates a society, will we believe that there is anything at all that we ought to commit to memory? The Google myth in this case is the belief that every conceivable bit of knowledge that we could ever possibly desire is just a Google search away.

The sort of analysis Noreen offers, which is not uncommon, is based on an assumption we should examine more closely and also leaves a critical consideration unaddressed.

The assumption is that there are no distinctions within the category of memory. All memories are assumed to be discrete facts of the sort which one would need to know in order to do well on Jeopardy. But this assumption ignores the diversity of what we call memories and the diversity of functions to which memory is put. Here is how I framed the matter some years back:

All of this leads me to ask, What assumptions are at play that make it immediately plausible for so many to believe that we can move from internalized memory to externalized memory without remainder? It would seem, at least, that the ground was prepared by an earlier reduction of knowledge to information or data. Only when we view knowledge as the mere aggregation of discrete bits of data can we then believe that it makes little difference whether that data is stored in the mind or in a database.

We seem to be approaching knowledge as if life were a game of Jeopardy, which is played well by merely being able to access trivial knowledge at random. What is lost is the associational dimension of knowledge, which constructs meaning and understanding by relating one thing to another and not merely by aggregating data. This form of knowledge, which we might call metaphorical or analogical, allows us to experience life with the ability to “understand in light of,” to perceive through a rich store of knowledge and experience that allows us to see and make connections that richly texture and layer our experience of reality.

But this understanding of memory seems largely absent from the sorts of studies that are frequently cited in discussions of offloaded or outsourced memory. I’ll add another relevant consideration I’ve previously articulated: a silent equivocation slips into these discussions. The notion of memory we tend to assume is our current understanding of memory, derived by comparison to computer memory, which is essentially storage.

Having first identified a computer’s storage capacity as “memory,” a metaphor dependent upon the human capacity we call “memory,” we have now come to reverse the direction of the metaphor by understanding human “memory” in light of a computer’s storage capacity. In other words, we’ve reduced our understanding of memory to mere storage of information. And now we read all discussions of memory in light of this reductive understanding.

As for the unaddressed critical consideration, if we grant that we must all outsource or externalize some of our memory, and that it may even be admittedly advantageous to do so, how do we make qualitative judgments about the memory that we can outsource to our benefit and the memory we should on principle internalize (if we even allow for the latter possibility)?

Here we might take a cue from the religious practices of Jews, Christians, and Muslims, who have long made the memorization of Scripture a central component of their respective forms of piety. Here’s a bit more from Steiner commenting on what can be known about early modern literacy:

Scriptural and, in a wider sense, religious literacy ran strong, particularly in Protestant lands. The Authorized Version and Luther’s Bible carried in their wake a rich tradition of symbolic, allusive, and syntactic awareness. Absorbed in childhood, the Book of Common Prayer, the Lutheran hymnal and psalmody cannot but have marked a broad compass of mental life with their exact, stylized articulateness and music of thought. Habits of communication and schooling, moreover, sprang directly from the concentration of memory. So much was learned and known by heart — a term beautifully apposite to the organic, inward presentness of meaning and spoken being within the individual spirit.

Learned by heart–a beautifully apt phrase, indeed. Interestingly, this is an aspect of religious practice that, while remaining relatively consistent across the transition from oral to literate society, appears to be succumbing to the pressures of the Google myth, at least among Protestants. If I have an app that lets me instantly access any passage of my sacred text, in any of a hundred different translations, why would I bother to memorize any of it?

The answer, of course, best and perhaps only learned by personal experience, is that there is a qualitative difference between the “organic, inward presentness of meaning” that Steiner describes and merely knowing that I know how to find a text if I were inclined to find it. But the Google myth, and the studies that examine it, seem to know nothing of that qualitative difference, or, at least, they choose to bracket it.

I should note in passing that much of what I have recently written about attention is also relevant here. Distraction is the natural state of someone who has no goal that might otherwise command or direct their attention. Likewise, forgetfulness is the natural state of someone who has no compelling reason to commit something to memory. At the heart of both states may be the liberated individual will yielded by modernity. Distraction and forgetfulness seem both to stem from a refusal to acknowledge an order of knowing that is outside of and independent of the solitary self. To discipline our attention and to learn something by heart is, in no small measure, to submit the self to something beyond its own whims and prerogatives.

So, then, we might say that one of the enduring consequences of new forms of externalized memory is not only that they alter the quantity of what is committed to memory but that they also reconfigure the meaning and value that we assign both to the work of remembering and to what is remembered. In this way we begin to see why Illich believed that changing memories amounted to changing cultures. This is also why we should consider that Plato’s Socrates was on to something more than critics give him credit for when he criticized writing for how it would affect memory, which was for Plato much more than merely the ability to recall discrete bits of data.

This last point brings me, finally, to an excellent discussion of these matters by John Danaher. Danaher is always clear and meticulous in his writing and I commend his blog, Philosophical Disquisitions, to you. In this post, he explores the externalization of memory via a discussion of a helpful distinction offered by David Krakauer of the Santa Fe Institute. Here is Danaher’s summary of the distinction between two different types of cognitive artifacts, or artifacts we think with:

Complementary Cognitive Artifacts: These are artifacts that complement human intelligence in such a way that their use amplifies and improves our ability to perform cognitive tasks and once the user has mastered the physical artifact they can use a virtual/mental equivalent to perform the same cognitive task at a similar level of skill, e.g. an abacus.

Competitive Cognitive Artifacts: These are artifacts that amplify and improve our abilities to perform cognitive tasks when we have use of the artifact but when we take away the artifact we are no better (and possibly worse) at performing the cognitive task than we were before.

Danaher critically interacts with Krakauer’s distinction, but finds it useful. It is useful because, like Albert Borgmann’s work, it offers to us concepts and categories by which we might begin to evaluate the sorts of trade-offs we must make when deciding what technologies we will use and how.

Also of interest is Danaher’s discussion of cognitive ecology. Invoking earlier work by Donald Norman, Danaher explains that “competitive cognitive artifacts don’t just replace or undermine one cognitive task. They change the cognitive ecology, i.e. the social and physical environment in which we must perform cognitive tasks.” His critical consideration of the concept of cognitive ecology brings him around to the wonderful work Evan Selinger has been doing on the problem of technological outsourcing, work that I’ve cited here on more than a few occasions. I commend to you Danaher’s post for both its content and its method. It will be more useful to you than the vast majority of commentary you might otherwise encounter on this subject.

I’ll leave you with the following observation by the filmmaker Luis Bunuel: “Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing.” Let us take some care and give some thought, then, to how our tools shape our remembering.

 

An Addendum: On Memory and Loss

Despite just writing an overlong post on memory, I find myself thinking about it still.

I’m intrigued by how the past, in an age of pervasive documentation, never quite achieves the condition of being past. Our social media streams constitute the past as an always accessible substratum of the present. The past emanates like low-level radiation from this substratum, not always detectable but always having its effect.

I don’t mean by this what Faulkner meant when he said, “The past is never dead. It’s not even past.” Of course, the past is always operative in the present. Instead, I’m thinking of the past in a more limited, individual sense. My past history, as it were, or my past selves.

When I think about my life before the advent of pervasive documentation, I experience it as something irretrievably lost. Memory acts as a temporary bridge to that world, but the gap it bridges is deep and uncrossable. Indeed, if memory is like a bridge, it is a bridge that at best spans the gap only far enough to bring the other side into hazy view. But this yields for me the experience of having lost aspects of myself; it grants the realization that I am not now who I was then, at least not entirely. Sometimes I am tempted to bridge the gap altogether, to recover something of what has been lost, but failure in the attempt always confirms the futility of the project.

But does this development or, hopefully, growth partly depend on a natural forgetting of the past self, that is, on the opening up of those chasms dug by loss? I suppose what I am asking can be put this way: Is forgetting, or allowing parts of ourselves to be lost to the past, an essential aspect of personal development?

And, if that proves to be the case, then how does our present economy of persistent remembering, driven by the desire to “save everything,” affect this dynamic?

Google Photos and the Ideal of Passive Pervasive Documentation

I’ve been thinking, recently, about the past and how we remember it. That this year marks the 20th anniversary of my high school graduation accounts for some of my reflective reminiscing. Flipping through my senior yearbook, I was surprised by what I didn’t remember. Seemingly memorable events alluded to by friends in their notes and more than one of the items I myself listed as “Best Memories” have altogether faded into oblivion. “I will never forget when …” is an apparently rash vow to make.

But my mind has not been entirely washed by Lethe’s waters. Memories, assorted and varied, do persist. Many of these are sustained and summoned by stuff, much of it useless, that I’ve saved for what we derisively call sentimental reasons. My wife and I are now in the business of unsentimentally trashing as much of this stuff as possible to make room for our first child. But it can be hard parting with the detritus of our lives because it is often the only tenuous link joining who we were to who we now are. It feels as if you risk losing a part of yourself forever by throwing away that last delicate link.

“Life without memory,” Luis Bunuel tells us, “is no life at all.” “Our memory,” he adds, “is our coherence, our reason, our feeling, even our action. Without it, we are nothing.” Perhaps this accounts for why tech criticism was born in a debate about memory. In the Phaedrus, Plato’s Socrates tells a cautionary tale about the invention of writing in which writing is framed as a technology that undermines the mind’s power to remember. What we can write down, we will no longer know for ourselves–or so Socrates worried. He was, of course, right. But, as we all know, this was an incomplete assessment of writing. Writing did weaken memory in the way Plato feared, but it did much else besides. It would not be the last time critics contemplated the effects of a new technology on memory.

I’ve not written nearly as much about memory as I once did, but it continues to be an area of deep interest. That interest was recently renewed not only by personal circumstances but also by the rollout of Google Photos, a new photo storage app with cutting edge sorting and searching capabilities. According to Steven Levy, Google hopes that it will be received as a “visual equivalent to Gmail.” On the surface, this is just another digital tool designed to store and manipulate data. But the data in question is, in this case, intimately tied up with our experience and how we remember it. It is yet another tool designed to store and manipulate memory.

When Levy asked Bradley Horowitz, the Google executive in charge of Photos, what problem Google Photos solves, Horowitz replied:

“We have a proliferation of devices and storage and bandwidth, to the point where every single moment of our life can be saved and recorded. But you don’t get a second life with which to curate, review, and appreciate the first life. You almost need a second vacation to go through the pictures of the safari on your first vacation. That’s the problem we’re trying to fix — to automate the process so that users can be in the moment. We also want to bring all of the power of computer vision and machine learning to improve those photos, create derivative works, to make suggestions…to really be your assistant.”

It shouldn’t be too surprising that the solution to the problem of pervasive documentation enabled by technology is a new technology that allows you to continue documenting with even greater abandon. Like so many technological fixes to technological problems, it’s just a way of doubling down on the problem. Nor is it surprising that he also suggested, without a hint of irony, that this would help users “be in the moment.”

But here is the most important part of the whole interview, emphasis mine:

“[…] so part of Google photos is to create a safe space for your photos and remove any stigma associated with saving everything. For instance, I use my phone to take pictures of receipts, and pictures of signs that I want to remember and things like that. These can potentially pollute my photo stream. We make it so that things like that recede into the background, so there’s no cognitive burden to actually saving everything.”

Replace saving with remembering and the potential significance of a tool like Google Photos becomes easier to apprehend. Horowitz is here confirming that users will need to upload their photos to Google’s Cloud if they want to take advantage of Google Photos’ most impressive features. He anticipates that there will be questions about privacy and security, hence the mention of safety. But the really important issue here is this business about saving everything.

I’m not entirely sure what to make of the stigma Horowitz is talking about, but the cognitive burden of “saving everything” is presumably the burden of sorting and searching. How do you find the one picture you’re looking for when you’ve saved thousands of pictures across a variety of platforms and drives? How do you begin to organize all of these pictures in any kind of meaningful way? Enter Google Photos and its uncanny ability to identify faces and group pictures into three basic categories–People, Places, and Things–as well as a variety of sub-categories such as “food,” “beach,” or “cars.” Now you don’t need that second life to curate your photos. Google does it for you. Now we may document our lives to our heart’s content without a second thought about whether or not we’ll ever go back to curate our unwieldy hoard of images.

I’ve argued elsewhere that we’ve entered an age of memory abundance, and the abundance of memories makes us indifferent to them. When memory is scarce, we treasure it and care deeply about preserving it. When we generate a surfeit of memory, our ability to care about it diminishes proportionately. We can no longer relate to how Roland Barthes treasured his mother’s photograph; we are more like Andy Warhol, who obsessively recorded all of his interactions and never once listened to the recordings. Plato was, after all, even closer to the mark than we realized. New technologies of memory reconfigure the affections as well as the intellect. But is it possible that Google Photos will prove this judgement premature? Has Google figured out how we may have our memory cake and eat it too?

I think not, and there’s a historical precedent that will explain why.

Ivan Illich, in his brilliant study of medieval reading and the evolution of the book, In the Vineyard of the Text, noted how emerging textual technologies reconfigured how readers related to what they read. It is a complex, multifaceted argument and I won’t do justice to it here, but the heart of it is summed up in the title of Illich’s closing chapter, “From Book to Text.” After explaining what Illich meant by that formulation, I’m going to suggest that we consider an analogous development: from photograph to image.

Like photography, writing is, as Plato understood, a mnemonic technology. The book or codex is only one form the technology has taken, but it is arguably the most important form owing to its storage capacity and portability. Contrast the book to, for instance, a carved stone tablet or a scroll and you’ll immediately recognize the brilliance of the design. But the matter of sorting and searching remained a significant problem until the twelfth century. It is then that new features appeared to improve the book’s accessibility and user-friendliness, among them chapter titles, pagination, and the alphabetized index. Now one could access particular passages without having to read the whole work or, more to the point, without having to memorize the passages or their location in the book (illuminated manuscripts were designed to aid with the latter).

My word choice in describing the evolution of the book above was, of course, calculated to make us see the book as a technology and also to make certain parallels to the case of digital photography more obvious. But what was the end result of all of this innovation? What did Illich mean by saying that the book became a text?

Borrowing a phrase Katherine Hayles deployed to describe a much later development, I’d say that Illich is getting at one example of how information lost its body. In other words, prior to these developments it was harder to imagine the text of a book as a free-floating reality that could be easily lifted and presented in a different format. The ideas, if you will, and the material that conveyed them–the message and medium–were intimately bound together; one could hardly imagine the two existing independently. This had everything to do with the embodied dimensions of the reading experience and the scarcity of books. Because there was no easy way to dip in and out of a book to look for a particular fragment and because one would likely encounter but one copy of a particular work, the work was experienced as a whole that lived within the particular pages of the book one held in hand.

Until then, the book had been read reverentially as a window on the world; it yielded what Illich termed monastic reading. The text was later, after the technical innovations of the twelfth century, read as a window on the mind of the author; it yielded scholastic reading. We might also characterize these as devotional reading and academic reading, respectively. Illich summed it up this way:

“The text could now be seen as something distinct from the book. It was an object that could be visualized even with closed eyes [….] The page lost the quality of soil in which words are rooted. The new text was a figment on the face of the book that lifted off into autonomous existence [….] Only its shadow appeared on the page of this or that concrete book. As a result, the book was no longer the window onto nature or god; it was no longer the transparent optical device through which a reader gains access to creatures or the transcendent.”

Illich had, a few pages earlier, put the matter more evocatively: “Modern reading, especially of the academic and professional type, is an activity performed by commuters or tourists; it is no longer that of pedestrians and pilgrims.”

I recount Illich’s argument because it illuminates the changes we are witnessing with regard to photography. Illich demonstrated two relevant principles. First, that small technical developments can have significant and lasting consequences for the experience and meaning of media. The move from analog to digital photography should naturally be granted priority of place, but subsequent developments such as those in face recognition software and automated categorization should not be underestimated. Second, that improvements in what we might today call retrieval and accessibility can generate an order of abstraction and detachment from the concrete embodiment of media. And this matters because the concrete embodiment, the book as opposed to the text, yields kinds and degrees of engagement that are unique to it.

Let me try to put the matter more directly and simultaneously apply it to the case of photography. Improving accessibility meant that readers could approach the physical book as the mere repository of mental constructs, which could be poached and gleaned at whim. Consequently, the book was something to be used to gain access to the text, which now appeared for the first time as an abstract reality; it ceased to be itself a unique and precious window on the world and its affective power was compromised.

Now, just as the book yielded to the text, so the photograph yields to the image. Imagine a 19th-century woman gazing lovingly at a photograph of her son. The woman does not conceive of the photograph as one instantiation of the image of her son. Today, however, we who hardly ever hold photographs anymore can hardly help thinking in terms of images, which may be displayed on any of a number of different platforms, not to mention manipulated at whim. The image is an order of abstraction removed from the photograph, and it would be hard to imagine someone treasuring it in the same way that we might treasure an old photograph. Perhaps a thought experiment will drive this home. Try to imagine the emotional distance between the act of tearing up a photograph and deleting an image.

Now let’s come back to the problem Google Photos is intended to solve. Will automated sorting and categorization, along with the ability to search, succeed in making our documentation more meaningful? Moreover, will it overcome the problems associated with memory abundance? Doubtful. Instead, the tools will facilitate further abstraction and detachment. They are designed to encourage the production of even more documentary data and to further diminish our involvement in its production and storage. Consequently, we will continue to care less, not more, about particular images.

Of course, this hardly means the tools are useless or that images are meaningless. I’m certain that face recognition software, for instance, can and will be put to all sorts of uses, benign and otherwise, and that the reams of data users feed Google Photos will only help to improve and refine the software. And it is also true that images can be made use of in ways that photographs never could. But perhaps that is the point. A photograph we might cherish; we tend to make use of images. Unlike the useless stuff around which my memories accumulate and that I struggle to throw away, images are all use-value, and we don’t think twice about deleting them when they have no use.

Finally, Google’s answer to the problem of documentation, that it takes us out of the moment as it were, is to encourage such pervasive and continual documentation that it is no longer experienced as a stepping out of the moment at all. The goal appears to be a state of continual, passive documentation in which the distinction between experience and documentation blurs until the two are indistinguishable. The problem is not so much solved as it is altogether transcended. To experience life will be to document it. In so doing we are generating a second life, a phantom life that abides in the Cloud.

And perhaps we may, without stretching the bounds of plausibility too far, reconsider that rather ethereal, heavenly metaphor–the Cloud. As we generate this phantom life, this double of ourselves constituted by data, are we thereby hoping, half-consciously, to evade or at least cope with the unremitting passage of time and, ultimately, our mortality?

Eternal Sunshine of the Spotless Digital Archive

The future is stubbornly resistant to prediction, but we try anyway. I’ve been thinking lately about memory, technologies that mediate our memories, and the future of the past.

The one glaring fact — and I think it is more or less incontrovertible — is this: Digital technology has made it possible to capture and store vast amounts of data.

Much of this data, for the average person, involves the documentation of daily life. This documentation is often photographic or audio-visual.

What difference will this make? Recently, I suggested that in an age of memory abundance, memory will be devalued. There will be too much of it and it will be out there somewhere — in a hard drive, on my phone/computer, or in the cloud. As we confidently and routinely outsource our remembering to digital devices and archives, we will grow relatively indifferent to personal memories. (Although I don’t think indifferent is the best word — perhaps unattached.)

This too seems to me incontrovertible. It is the overlooked truth in Plato’s much-maligned critique of writing. Externalized memory is only figuratively related to internalized memory.

But I was assuming the permanence of these digital memories. What if our digital archives prove to be impermanent? What if in the coming years and decades we realize that our digital memories are gradually fading into oblivion?

Consider the following from Bruce Sterling: “Actually it’s mostly the past’s things that will outlive us. Things that have already successfully lived a long time, such as the Pyramids, are likely to stay around longer than 99.9% of our things. It might be a bit startling to realize that it’s mostly our paper that will survive us as data, while a lot of our electronics will succumb to erasure, loss, and bit rot.”

It might turn out that Snapchat is a premonition. What then?

Scenario A: Digital memory decay is a technical problem that is eventually solved; trajectory of memory abundance and consequent indifference plays out.

Scenario B: Digital memory decay remains a persistent problem.

Scenario B1: We devote ourselves to rituals of digital memory preservation. Therapy first referred to the care of the gods. We think of it as care for the self, sometimes involving the recollection of repressed memories. Perhaps in the future these senses of the word will mutate into therapy understood as the care of our digital memories.

Scenario B2: By the time the problem of digital memory decay is recognized as a threat, we no longer care. Memory, we decide, is a burden. Mutually reinforcing decay and indifference then yield a creeping amnesia of long term memory. Eternal sunshine indeed.

Scenario B3: We reconsider our digital dependence and reintegrate analog and internalized forms of memory into our ecology of remembrance.

Scenario C: All of this is wrong.

In truth, I can hardly imagine a serious indifference to personal memory. But then again, I’m sure those who lived in societies whose cultural forms were devoted to tribal remembrance could hardly imagine serious indifference to the memory of the tribe. They probably couldn’t imagine someone caring much about their individual history; it was likely an incoherent concept. Thinking about the future involves the thinking of that which we can’t quite imagine, or is it the imagining of that which we can’t quite think? In any case, it’s not really about the future anyway. It’s about trying to make some sense of forces now at work and trying to reckon with the long reach of the past, which, remembered or not, will continue to make itself felt in the present.

From Memory Scarcity to Memory Abundance

The most famous section in arguably the most famous book about photography, Roland Barthes’ Camera Lucida, dwells on a photograph of Barthes’ recently deceased mother taken in a winter garden when she was a little girl. On this picture, Barthes hung his meditative reflections on death and photography. The image evoked both the “that-has-been” reality of the subject, and the haunting “this-will-die” realization. That one photograph of his mother is also the only image discussed by Barthes that was not reproduced in Camera Lucida. It was too personal. It conveyed something true about his mother, but only to him.

But what if Barthes had not a few, but hundreds or even thousands of images of his mother?

I’ve long thought that what was most consequential about social media was their status as prosthetic memories. A site like Facebook, for example, is a massive archive of externalized memories preserved as texts and images. For this reason, it seemed to me, it would be unbearably hard to abandon such sites, particularly for those who had come of age with and through them. These archives bore too precious a record of the past to be simply deleted with a few clicks. I made this argument as late as last night.

But now I’ve realized that I had not fully appreciated the most important dynamic at play. I was operating with assumptions that were formed during an age of relative memory scarcity, but digital photography and sites like Facebook have brought us to an age of memory abundance. The paradoxical consequence of this development will be the progressive devaluing of such memories and the severing of the past’s hold on the present. Gigabytes and terabytes of digital memories will not make us care more about those memories; they will make us care less.

We’ve seen the pattern before. Oral societies, which had few and relatively inefficient technologies of remembrance at their disposal, lived to remember. Their cultural lives were devoted to ritual and liturgical acts of communal remembering. The introduction of writing, a comparably wondrous technology of remembrance, gradually released the individual from the burdens of cultural remembrance. Memory that could be outsourced, as we say, or offloaded could also be effectively forgotten by the individual, who was free to remember their own history. And it is to this task that subsequent developments in the technology of remembrance have been put. The emergence of cheap paper coupled with rising rates of literacy gave us the diary and the boxes of letters. Photography and film were also put to the task of documenting our lives. But until recently, these technologies were subject to important constraints. The recording devices were bulky and cumbersome, and they were limited in capacity by the number of exposures on a roll of film and the length of ribbon in a tape. There were also important practical constraints on storage and access. Digital technologies have burst through these constraints, and they have not yet reached their potential.

Now we carry relatively unobtrusive devices of practically unlimited recording capacity, and these are easily linked to archives that are likewise virtually unlimited in their capacity to store and organize these memories. If we cast our vision into the not altogether distant nor fantastical future, we can anticipate individuals engaging with the world through devices (e.g., Google Glass) that will both augment the physical world by layering it with information and generate a near continuous audio-visual record of our experience.

Compared to these present and soon-to-be technologies, the 35mm camera that was at my disposal through the ’80s and ’90s seems primitive. On a spectrum indicating the capacity to document and archive memories, I was then closer to my pre-modern predecessors than to the generation that will succeed me.

Roland Barthes’ near-mystical veneration of his mother’s photograph, touching as it appears to those of us who lived in the age of memory scarcity, will seem quixotic and quaint to those who have known only memory abundance. Barthes will seem to them as those medievals who venerated the physical book seem to us. They will be as indifferent to the photograph, and the past it encodes, as we are to the cheap paperback.

It may seem, as it did to me, that social media revived the significance of the past by reconnecting us with friends we would have mostly forgotten and reconstituting habits of social remembering. I’d even expressed concerns that social media might allow the past to overwhelm the present rendering recollection rather than suppression traumatic. But this has only been an effect of novelty upon that transitional generation who had lived without the technology and upon whom it appeared in medias res. For those who have known only the affordances of memory abundance, there will be no reconnection with long forgotten classmates or nostalgic reminiscences around a rare photograph of their youth capturing some trivial, unremembered moment. It will all be documented and archived, but it will mean not a thing.

It will be Barthes’ contemporary, Andy Warhol, who will appear as one of us. In his biography of Warhol, Victor Bockris writes,

Indeed, Andy’s desire to record everything around him had become a mania. As John Perrault, the art critic, wrote in a profile of Warhol in Vogue: “His portable tape recorder, housed in a black briefcase, is his latest self-protection device. The microphone is pointed at anyone who approaches, turning the situation into a theater work. He records hours of tape every day but just files the reels away and never listens to them.”

Andy Warhol’s performance art will be our ordinary experience, and it is that last line that we should note — “… and he never listens to them.”

Reconsider Plato’s infamous critique of writing. Critics charge Plato with shortsightedness because he failed to see just how much writing would in fact allow us to remember. But from a different perspective, Plato was right. The efficient and durable externalization of memory would make us personally indifferent to remembrance. As the external archive grows, our personal involvement with the memory it stores shrinks in proportion.

Give me a few precious photographs, a few minutes of grainy film and I will treasure them and hold them dear. Give me one terabyte of images and films and I will care not at all.

In the future, we will float in the present untethered from the past and propelled listlessly onward by the perpetual stream of documentary detritus we will emit.

Sergey Brin, co-founder of Google appear