Offloaded Memory and Its Discontents (or, Why Life Isn’t a Game of Jeopardy)

Given the ubiquitous presence of the Internet and the consequent ability to “Google it” whenever you need it, whatever “it” happens to be, it is not surprising to learn that we are taking the time to remember less and less. A new study in the journal Science affirms what most of us have already witnessed or experienced. According to one report on the study:

Columbia University psychologist Betsy Sparrow and her colleagues conducted a series of experiments, in which they found that people are less likely to remember information when they are aware of its availability on online search engines …

“Our brains rely on the Internet for memory in much the same way they rely on the memory of a friend, family member or co-worker. We remember less through knowing information itself than by knowing where the information can be found,” said Sparrow.

If you sense that something of significance is lost in the transition from internal memory to prosthetic memory, you may also find that it takes a little work to pinpoint and articulate that loss. Nicholas Carr, commenting on the same study, also has reservations, and he puts them this way:

If a fact stored externally were the same as a memory of that fact stored in our mind, then the loss of internal memory wouldn’t much matter. But external storage and biological memory are not the same thing. When we form, or “consolidate,” a personal memory, we also form associations between that memory and other memories that are unique to ourselves and also indispensable to the development of deep, conceptual knowledge. The associations, moreover, continue to change with time, as we learn more and experience more. As Emerson understood, the essence of personal memory is not the discrete facts or experiences we store in our mind but “the cohesion” which ties all those facts and experiences together. What is the self but the unique pattern of that cohesion?

I’ve articulated some questions about offloaded memory and the formation of identity in the past few months as well, along with thoughts on the relationship between internal memory and creativity. Here I want to draw on an observation by Owen Barfield in Saving the Appearances. Barfield is writing about perception when he notes:

I do not perceive any thing with my sense-organs alone, but with a great part of my whole human being. Thus, I may say, loosely, that I ‘hear a thrush singing’. But in strict truth all that I ever merely ‘hear’ — all that I ever hear simply by virtue of having ears — is sound. When I ‘hear a thrush singing’, I am hearing, not with my ears alone, but with all sorts of other things like  mental habits, memory, imagination, feeling and (to the extent at least that the act of attention involves it) will.  Of a man who merely heard in the first sense, it could meaningfully be said that ‘having ears’ (i. e. not being deaf) ‘he heard not’.

Barfield reminds us that our perception of reality is never merely a function of our senses. Our perception, which in some respects is to say our interpretation of reality into meaningful experience, is grounded in, among other things, our memory, and certainly not our offloaded memory. Offloaded memory is not ready-to-hand for the mind to use in its remarkable work of meaning-making. Perception in this sense is impoverished by our willingness to offload what we might otherwise have internalized.

All of this leads me to ask, What assumptions are at play that make it immediately plausible for so many to believe that we can move from internalized memory to externalized memory without remainder? It would seem, at least, that the ground was prepared by an earlier reduction of knowledge to information or data. Only when we view knowledge as the mere aggregation of discrete bits of data can we then believe that it makes little difference whether that data is stored in the mind or in a database.

We seem to be approaching knowledge as if life were a game of Jeopardy, which is played well merely by being able to recall trivia on demand. What is lost is the associational dimension of knowledge, which constructs meaning and understanding by relating one thing to another, not merely by aggregating data. This form of knowledge, which we might call metaphorical or analogical, allows us to “understand in light of,” to perceive through a rich store of knowledge and experience that lets us see and make the connections which richly texture and layer our experience of reality.

Augustine famously described memory as a vast storehouse in which there are treasures innumerable. It is our experience of life that is enriched by drawing on the treasures deposited into the storehouse of memory.


Update: Also see “The Extended Mind — How Google Affects Our Memories” and Jonah Lehrer’s post on the study.

There Can Be Only One: Google+ Takes On Facebook

You are most likely not one of the favored few who have been invited to take Google’s new social networking platform out for a spin, and neither am I, but now we get a glimpse of what Google has been up to. When it does go live, Google+ will open a new front in the battle against Facebook, one that appears more promising than the ill-fated Google Buzz.

The Google+ experience is in large measure reminiscent of Facebook with at least one major exception: Circles. Facebook’s glaring weakness is its insistence that you indiscriminately present the same persona to every one of your “friends,” a list which may include your best friend from childhood, your ex-girlfriend, your boss, your co-workers, your grandmother, and that kid who lived down the street when you were growing up. Amusingly, Zuckerberg turns moral philosopher on this point and declares that maintaining more than one online identity signals a lack of integrity. Google+ more sensibly assumes that not all human relationships are created equal and that our social media experience ought to acknowledge that reality. It allows you to create Circles into which you can drag and drop names from your contact list. Whenever you post a picture or a link or a comment, you may designate which Circles will be able to see what you have posted. In other words, it lets you effectively manage the presentation of your self to your multifaceted social media audience. Google+ appears to have thus solved Facebook’s George Costanza problem: colliding worlds.

Facebook has gestured in this direction with Groups and Friend lists, but the experience remains awkward, perhaps because it is at odds with the logic at the core of Facebook’s DNA. Google+, having taken note of the rumblings of discontent with Facebook’s at times cavalier attitude toward privacy, also allows users to permanently delete their information from Google’s servers and otherwise presents a more privacy-friendly front.

Even with these features aimed at exposing Facebook’s weaknesses, and despite recent news about chinks in Facebook’s armor, Google+ is not expected to challenge Facebook’s social media supremacy. Inertia is the main obstacle to the success of Google+. Many users have committed an immense amount of data to their Facebook profiles, and Facebook has worked hard to integrate itself into the whole online experience of its users. Additionally, Facebook has more or less become a memory archive for many of its users, and we don’t easily part with our memory. Most significantly, perhaps, Google+ starts from a position of relative weakness as far as social media platforms are concerned — it has few users. Most people will, for a long time to come, more readily find those they know on Facebook.

That said, Facebook’s early success against Myspace was predicated on a certain exclusivity. It may be that an early disadvantage — relatively few members — will present itself as an important advantage in the eyes of enough people to generate momentum for Google+. It is also hard to tell how many would-be social media users have been kept at bay by Facebook’s shortcomings and will now venture into social media waters given the refinements offered by Google+. Casual Facebook users may also find it relatively painless to make a move.

Hard to tell from here, as is most of the future, but I wouldn’t be too surprised if Google+ significantly eroded Facebook’s base. Despite my Highlander-esque title, the most likely outcome may be that both platforms coexist by appealing to different sets of sensibilities, priorities, and expectations.