The Fog of Life: “Google,” Memory, and Thought

Last week a study suggesting that the ability to Google information is making it less likely that we commit information to memory garnered a decent amount of attention and discussion, including a few of my own thoughts in my last post.  In addition to writing a post on the topic, I did something I almost never do for the sake of sanity: I followed the comment threads on a few websites that had posted articles on the story.  That was an instructive experience, and it has led to a few observations, comments, and questions, which I’ll list briefly.

  • Google functions as a synecdoche for the Internet in a way that no other company does. So when questions like “Is Google Making Us Stupid?” or “Is Google Ruining Our Memory?” are posed, what is really meant is more like “Is the Internet Making Us Stupid?”, etc.
  • Of course, “Google” is not an autonomous agent, but it has generated and made plausible a certain rhetoric that rather imprudently dismisses the need to remember.
  • People still get agitated by claims that the Internet is either bad or good for you.  Stories are framed in this way in the media, and discussion assumes this binary form.  Not much by way of nuance.
  • On the specific question of memory in relation to this study and the subsequent discussion, it is never quite clear what sort of memory is in view, although it appears that memory for facts or some variation of that is what most people are assuming in their comments.
  • The computer model of the brain is alive and well in people’s imagination.  How else could we explain the recurring claim that by offloading our memory to “Google” we are “freeing up space” so that our “processing” runs more efficiently?
  • Does anyone really believe that we, members of present society, are generally in danger of reaching the limits of our capacity for memory?
  • There is a concern that tying up memory on the retention of “trivial” facts will hamper our ability to perform higher order tasks such as critical and creative thinking.
  • “Trivial” is relative.  Phone numbers are often given as an example, but while knowing some obscure detail about the human cardiovascular system might be “trivial” to me, it wouldn’t be so to a cardiovascular surgeon in the midst of an operation.
  • Why are we opposing two forms of knowledge or “intelligence” anyway?  Aren’t most of the people who are able to think critically and creatively about a topic or discipline the same people who have attained a mastery of the details of that same topic or discipline? Isn’t remembering the foundation of knowing, or are not the two at least intimately related?
  • Realizing that total recall of all pertinent facts in most cases is too high a bar, wouldn’t it at least be helpful not to rhetorically oppose facts to thinking?
  • The denigration of memory for facts seems — be warned, this is impressionistic — aligned with a slide toward an overarching cloud of vagueness settling over our experience.  Not simply the vagueness, relative to print-disciplined speech, that accompanies a return to orality, but a vagueness, distractedness, or inattentiveness about immediate experience in general.
  • Will we know nothing in particular because we know where to find everything in general?

On that last note, consider Elizabeth Spires’ poem, “A Memory of the Future,” published in The Atlantic, and make of it what you will:

I will say tree, not pine tree.
I will say flower, not forsythia.
I will see birds, many birds,
flying in four directions.

Then rock and cloud will be
lost. Spring will be lost.
And, most terribly,
your name will be lost.

I will revel in a world
no longer particular.
A world made vague,
as if by fog. But not fog.

Vaguely aware,
I will wander at will.
I will wade deeper
into wide water.

You’ll see me, there,
out by the horizon,
an old gray thing,
who finally knows

gray is the most beautiful color.

Offloaded Memory and Its Discontents (or, Why Life Isn’t a Game of Jeopardy)

It is not surprising to learn that we are taking the time to remember less and less given the ubiquitous presence of the Internet and the consequent ability to “Google it” when we need it, whatever “it” happens to be.  A new study in the journal Science affirms what most of us have already witnessed or experienced.  According to one report on the study:

Columbia University psychologist Betsy Sparrow and her colleagues conducted a series of experiments, in which they found that people are less likely to remember information when they are aware of its availability on online search engines …

“Our brains rely on the Internet for memory in much the same way they rely on the memory of a friend, family member or co-worker. We remember less through knowing information itself than by knowing where the information can be found,” said Sparrow.

If you sense that something of significance is lost in the transition from internal memory to prosthetic memory, you may also find that it takes a little work to pinpoint and articulate that loss.  Nicholas Carr, commenting on the same study, also has reservations, and he puts them this way:

If a fact stored externally were the same as a memory of that fact stored in our mind, then the loss of internal memory wouldn’t much matter. But external storage and biological memory are not the same thing. When we form, or “consolidate,” a personal memory, we also form associations between that memory and other memories that are unique to ourselves and also indispensable to the development of deep, conceptual knowledge. The associations, moreover, continue to change with time, as we learn more and experience more. As Emerson understood, the essence of personal memory is not the discrete facts or experiences we store in our mind but “the cohesion” which ties all those facts and experiences together. What is the self but the unique pattern of that cohesion?

I’ve articulated some questions about off-loaded memory and the formation of identity in the past few months as well, along with thoughts on the relationship between internal memory and creativity.  Here I want to draw on an observation by Owen Barfield in Saving the Appearances.  Barfield is writing about perception when he notes:

I do not perceive any thing with my sense-organs alone, but with a great part of my whole human being. Thus, I may say, loosely, that I ‘hear a thrush singing’. But in strict truth all that I ever merely ‘hear’ — all that I ever hear simply by virtue of having ears — is sound. When I ‘hear a thrush singing’, I am hearing, not with my ears alone, but with all sorts of other things like mental habits, memory, imagination, feeling and (to the extent at least that the act of attention involves it) will.  Of a man who merely heard in the first sense, it could meaningfully be said that ‘having ears’ (i. e. not being deaf) ‘he heard not’.

Barfield reminds us that our perception of reality is never merely a function of our senses.  Our perception, which in some respects is to say our interpretation of reality into meaningful experience, is grounded in, among other things, our memory, and certainly not our offloaded memory.  Offloaded memory is not ready-to-hand for the mind to use in its remarkable work of meaning-making.  Perception in this sense is impoverished by our willingness to offload what we might otherwise have internalized.

All of this leads me to ask, What assumptions are at play that make it immediately plausible for so many to believe that we can move from internalized memory to externalized memory without remainder?  It would seem, at least, that the ground was prepared by an earlier reduction of knowledge to information or data.  Only when we view knowledge as the mere aggregation of discrete bits of data can we believe that it makes little difference whether that data is stored in the mind or in a database.

We seem to be approaching knowledge as if life were a game of Jeopardy, which is played well merely by being able to access trivial facts at random.  What is lost is the associational dimension of knowledge, which constructs meaning and understanding by relating one thing to another rather than merely aggregating data.  This form of knowledge, which we might call metaphorical or analogical, allows us to experience life with the ability to “understand in light of,” to perceive through a rich store of knowledge and experience, and so to see and make connections which richly texture and layer our experience of reality.

Augustine famously described memory as a vast storehouse in which there are treasures innumerable. It is our experience of life that is enriched by drawing on the treasures deposited into the storehouse of memory.

____________________________________________________

Update:  Also see “The Extended Mind — How Google Affects Our Memories” and Jonah Lehrer’s post on the study.

Virtual Communities, Anonymity, and the Risks of Self-Disclosure

In the beginning Cold War-era engineers created the Internet and then utopian, hippy enthusiasts rescued the Internet and gave it to the world. Regrettably, spammers, venture capitalists, marketers, and corporations entered into the garden of digital delights and a communitarian paradise devolved into a virtual mall.

This, more or less, is the storyline of Evgeny Morozov’s very brief history of the Internet, “Two Decades of the Web: A Utopia No More” in Prospect. The author of The Net Delusion: The Dark Side of Internet Freedom, in case it wasn’t clear from the two titles, is not exactly optimistic about the direction the Internet has taken. The plot of his short overview of Internet history moves from early promise to eventual decline; it is a tale of utopian hopes disciplined by unfortunate realities.

Morozov’s history takes the idea of “virtual community” as its theme and features a set of Internet “cheerleaders” – Stewart Brand, Kevin Kelly, John Perry Barlow, and the Wired crowd – as its tragic protagonists. Tragic because their high-minded, lofty ideals were undercut, in Morozov’s telling, by an accompanying naiveté that left Paradise unguarded against the corporate snakes. “These men,” according to Morozov, “emphasised the importance of community and shared experiences; they viewed humans as essentially good, influenced by rational deliberation, and tending towards co-operation. Anti-Hobbesian at heart, they viewed the state and its institutions as an obstacle to be overcome—and what better way to transcend them than via cyberspace?”

This anarchist/libertarian proclivity exposed the online community to dangers trivial and grave:

Perhaps the mismatch between digital ideals and reality can be ascribed to the naivety of the technology pundits. But the real problem was that the internet’s early visionaries never translated their aspirations for a shared cyberspace into a set of concrete principles on which online regulation could be constructed. It’s as if they wanted to build an exemplary city on the hill, but never bothered to spell out how to keep it exemplary once it started growing.

The law of entropy took over from there. The allusion to a city on a hill recalls the Puritan experiment in pious self-government that never quite managed to pass on its vision to the next generation. Pursuing the analogy, Morozov fills the role of the preachers who evolved the jeremiad – a genre of sermon identified by historian Perry Miller that denounced the community’s departure from its founding principles and called for repentance and renewal. Morozov’s commentary fits the genre neatly, and that is not at all to detract from the value of his critique.

The connection to the Puritans is worth pursuing even further, but for just a moment we have to see beyond the typical tropes with which they are associated, those that lend color to the term puritanical. The Puritans loom large in most tellings of early American history, and their influence has long been both celebrated and lamented. In one respect, though, they are out of step with the subsequent evolution of American culture. The rugged individualism that came to dominate the American psyche would have been an unwelcome anomaly within the deeply communitarian ethos of the early Puritan settlers. John Winthrop’s sermon aboard the Arbella, in which we find the “city on a hill” imagery, is fundamentally a communitarian tract urging self-restraint, self-sacrifice, mercy, justice, and generosity for the good of the community. Shared sacrifice, shared risk, shared resources, shared lives – out of such was community forged.

With this in mind we can begin to sound out a tension within Morozov’s account. In his view, personalization and the loss of privacy are among the chief temptations that ultimately lead to the fall of the digital community. So for example, he notes with displeasure: “The logical end of this ever-increasing personalisation is of each user having his or her own online experience. This is a far cry from the early vision of the internet as a communal space.” And further on we read that, “For many internet users, empowerment was an illusion. They may think they enjoy free access to cool services, but in reality, they are paying for that access with their privacy.”

Yet certain constructions of privacy and an aversion to personalization sit uneasily alongside a communitarian ideal. Pressed to their extremes, privacy and depersonalization converge in anonymity, and it was, in fact, the anonymity of the early Internet that thrilled theorists with the possibilities for experimentation with identity and its construction. Unfortunately, the early digital communitarian ideal was tied to this vision of privacy/anonymity, and you are not likely to have anything like a community in any strong sense on those terms, at least not in a way that answers to the human social impulse. Social media has provided something like an experience of community precisely because it has been tied to personalization (all of its attendant problems notwithstanding).

Maybe the real problem with the early Internet was its commitment to anonymity. Personalization, not regulation, may have curbed the sorts of behaviors and developments that Morozov laments. The unbridled pursuit of self-interest, which is always the enemy of community, virtual or otherwise, is abetted by the lack of accountability engendered by anonymity. This is a lesson at least as old as the Ring of Gyges story told by Plato. Community, and the fully human life it enables, is on the other hand built upon the risk of self-disclosure. A point not lost on Hannah Arendt when she noted that “To live an entirely private life means above all to be deprived of things essential to a truly human life: to be deprived of the reality that comes from being seen and heard by others.” And, she later adds, “Action without a name, a ‘who’ attached to it, is meaningless …”

Admittedly, community is an amorphous and abstract concept, and the pursuit of its virtual analogue may be finally incoherent. What’s more, anonymity is in certain circumstances clearly desirable, and commodified personalization is undoubtedly problematic. Finally, there are important and necessary forms of privacy which must be protected. With these qualifications noted, it remains the case that any kind of online community, if it is to serve as a mediating structure that allows for social interaction on a scale somewhere between the anonymity of total seclusion and that of mass society, needs to be built on some degree of measured self-disclosure and consistency of identity.

This entails all sorts of risks and even sacrifices, which means that what we may need is less jeremiad and more Winthrop-esque vision-setting.

_________________________________________________________

Article first published as Virtual Communities, Anonymity, and the Risks of Self-Disclosure on Blogcritics.

The Triumph of the Social

“Social” is big, in case you missed it.

The most noticeable and significant development in our media environment over the last decade or so has been the emergence of social elements across digital media platforms and the subsequent migration of social media into traditional media fields.  We might, to borrow a phrase, call this the triumph of the social.  Whatever we call it, this development marks a significant departure from the trend toward individualism that characterized the modern era.

Modernity, according to the standard storyline, was characterized by the individual’s liberation from the constraints of place, tradition, institutions, and, to some degree, biology.  The Renaissance, the Reformation, the Enlightenment — each is an episode in the rise of the individual.  Protestantism, democracy, capitalism — each features the individual and his soul, his rights, his property prominently.  From this angle, postmodernity is, in fact, hyper-modernity; it is not a break with the trajectory of modernity with respect to identity, it is its consummation.   The individual is liberated even from the notion of the persistent or essential self.  Identity, according to the usual suspect theorists, is constructed all the way down (except, of course, that it is not).

I can reasonably follow this story through to the early Internet age, but then something changes. Social media reasserts the social self.  This could be read as a further development of the individualist trajectory – the liberated individual is simply given a larger stage from which to pursue the project of creating their identity free of constraints.  If so, then what we might analyze is the new mode of identity construction.  For example, we might note that we now perform our identity by sharing.  The “Like” button becomes the instrument of identity construction.  The bands, television shows, websites, products, news stories, movies, clothes, cars, companies, causes, etc. that we “Like” signal to our social network who we “are”.

Whatever truth there may be to that, and there is some, there is one other way to read the rise of the social.  In a recent blog post, sociologist Peter Berger offered some reflections on the July 2011 issue of The Annals of the American Academy of Political and Social Science devoted to the topic of “Patrimonial Power in the Modern World.”  In Berger’s summary, patrimonial power “is power on the basis of kinship and other patron-client relationships. It is the most common form of political authority in traditional societies before the rise of centralized states and empires. Such authority is exercised by way of personal loyalties rather than formal rules. The tribal chief is the prototypical leader in patrimonial regimes.”  Patrimonial power, in other words, is grounded in personal ties and relationships.

Modernity, on the other hand, is characterized by other forms of power and authority.  Berger continues:

The counter-type is the bureaucrat … In a patrimonial system one trusts the chief because he belongs to one’s tribe and embodies its tradition. In what Weber called a “legal-rational” system one trusts the bureaucrat because he occupies an office established by proper procedures; indeed one trusts these procedures rather than the particular individual they have placed in the office.

Bureaucracy, however, not only abstracts power, it abstracts the personal.  In a bureaucracy the person is reduced to a number or an algorithm.  So one of the ironies of modernity is that the rise of the individual was accompanied by the rise of institutions that de-faced the individual.

What’s more, the rise of the individual throughout the modern period was coupled with the simultaneous rise of modern notions of privacy.  The extreme end of the privacy spectrum is complete anonymity, and here too the individual is de-faced, left without personal connection.

Just to be clear, this move toward individualism and privacy was not all bad.  In many respects it was a healthy corrective.  But the pendulum may have swung too far, and Berger does a nice job of explaining where the problem lies:

Robert Musil, in his great novel The Man Without Qualities, recounts an incident when Ulrich, the central character, is arrested and processed in a police station. He experiences what he calls a “statistical disaggregation” of his person. He is reduced to the minimal characteristics relevant to the police investigation, while all the characteristics essential to his self-esteem are ignored. In one way or another, we experience something like this depersonalization in many situations. We are abstract objects of the juridical system in court, abstract patients in a hospital,  abstract consumers in the marketplace. Everything we cherish most about ourselves is strictly irrelevant—our intellectual achievements, our sense of humor or capacity for affection, not to mention the prerogatives of age. In such situations we instinctively reach out for “tribal” connections—for someone who knows who we are, with whom we share an ethnic or religious identity, or even someone who laughs at a joke we tell: in sum, someone who recognizes us in a personal way.

Berger goes on to recall the term he coined with Richard Neuhaus, “mediating structures,” to describe neighborhood, family, church, and voluntary associations.  These mediating structures buffered the individual from the impersonal, bureaucratic power of the state, but these structures have themselves been severely compromised, leaving the individual isolated and disconnected.  This state of affairs, along with the presumption that we are indeed political, which is to say social, animals, helps explain, in part at least, the triumph of the social.  Social media functions as a mediating structure, a realm in which we are addressed by name and find our individual self publicly acknowledged.

This is not the whole story, of course. Social media in its own way also reduces us to numbers or algorithms, and it cannot provide all that traditional mediating structures, at their best, are able to offer.  There are also temptations to narcissism and worse.  But, the risks notwithstanding, social media owes its success to the way it addresses a fundamental dimension of being human.

______________________________________________________

Related post:  The (Un)Naturalness of Privacy.

There Can Be Only One: Google+ Takes On Facebook

You are most likely not one of the favored few who have been invited to take Google’s new social networking platform out for a spin, and neither am I, but now we get a glimpse of what Google has been up to.  When it does go live, Google+ will open a new front in the battle against Facebook, and one that appears more promising than the ill-fated Google Buzz.

The Google+ experience is in large measure reminiscent of Facebook with at least one major exception:  Circles.   Facebook’s glaring weakness is its insistence that you indiscriminately present the same persona to every one of your “friends,” a list which may include your best friend from childhood, your ex-girlfriend, your boss, your co-workers, your grandmother, and that kid that lived down the street when you were growing up.  Amusingly, Zuckerberg turns moral philosopher on this point and declares that maintaining more than one online identity signals a lack of integrity.  Google+ more sensibly assumes that not all human relationships are created equal and that our social media experience ought to acknowledge that reality.  It allows you to create Circles into which you can drag and drop names from your contact list.  Whenever you post a picture or a link or a comment, you may designate which Circles will be able to see what you have posted. In other words, it lets you effectively manage the presentation of your self to your multifaceted social media audience.  Google+ appears to have thus solved Facebook’s George Costanza problem:  colliding worlds.
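
For readers inclined to think in these terms, the access model that Circles implies is simple enough to sketch in a few lines of code.  The following Python sketch is purely illustrative, with hypothetical names and structure of my own devising rather than anything drawn from Google’s actual implementation: contacts are sorted into named circles, each post is shared with a designated set of circles, and a contact sees only the posts shared with at least one circle to which he or she belongs.  Facebook’s flat list of “friends,” by contrast, amounts to a single circle containing everyone.

```python
# Illustrative sketch of the access model Circles suggests.  The names and
# structure here are hypothetical, not Google's actual implementation.

class CirclesProfile:
    def __init__(self):
        self.circles = {}  # circle name -> set of contact names
        self.posts = []    # list of (content, set of circle names)

    def add_to_circle(self, circle, contact):
        """'Drag and drop' a contact into a named circle."""
        self.circles.setdefault(circle, set()).add(contact)

    def share(self, content, circles):
        """Post content visible only to the designated circles."""
        self.posts.append((content, set(circles)))

    def visible_to(self, contact):
        """Return the posts a contact can see: those shared with at least
        one circle the contact belongs to."""
        member_of = {name for name, members in self.circles.items()
                     if contact in members}
        return [content for content, audience in self.posts
                if audience & member_of]


profile = CirclesProfile()
profile.add_to_circle("Family", "Grandma")
profile.add_to_circle("Coworkers", "Boss")
profile.share("Vacation photos", circles=["Family"])
profile.share("A link worth reading", circles=["Family", "Coworkers"])

print(profile.visible_to("Grandma"))  # sees both posts
print(profile.visible_to("Boss"))     # sees only the link
```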

Facebook has gestured in this direction with Groups and Friend lists, but this remains an awkward experience, perhaps because it is at odds with the logic at the core of Facebook’s DNA.  Google+, having taken note of the rumblings of discontent with Facebook’s at times cavalier attitude toward privacy,  also allows users to permanently delete their information from Google’s servers and otherwise presents a more privacy-friendly front.

Even with these features aimed at exposing Facebook’s weaknesses, and despite recent news about chinks in Facebook’s armor, Google+ is not expected to challenge Facebook’s social media supremacy.  Inertia is the main obstacle to the success of Google+.  Many users have committed an immense amount of data to their Facebook profiles, and Facebook has worked hard to integrate itself into the whole online experience of its users.  Additionally, Facebook has more or less become a memory archive for many of its users, and we don’t easily part with our memories.  Most significantly, perhaps, Google+ starts from a position of relative weakness as far as social media platforms are concerned — it has few users.  Most people will, for a long time to come, more readily find those they know on Facebook.

That said, Facebook’s early success against Myspace was predicated on a certain exclusivity.  It may be that an early disadvantage — relatively few members — will present itself as an important advantage in the eyes of enough people to generate momentum for Google+.  It is also hard to tell how many would-be social media users have been kept at bay by Facebook’s shortcomings and will now venture into social media waters given the refinements offered by Google+.  Casual Facebook users may also find it relatively painless to make a move.

It is hard to tell from here, as most of the future is, but I wouldn’t be too surprised if Google+ significantly eroded Facebook’s base. Despite my Highlander-esque title, the most likely outcome may be that both platforms co-exist by appealing to different sets of sensibilities, priorities, and expectations.