Mark Zuckerberg, Moral Philosopher of Identity

In a recent blog post, Steve Cheney bemoans the ongoing progress that Facebook is making toward becoming the ambient background of the Internet.  Specifically, he is concerned that Facebook is killing your authenticity:

… now Facebook’s sheer scale is pushing it in a new direction, one that encroaches on your authenticity.

Facebook is no longer a social network. They stopped being one long before the movie. Facebook is really a huge broadcast platform. Everything that happens between its walls is one degree away from being public, one massive auditorium filled with everyone you’ve ever met, most of whom you haven’t seen or spoken to in years.

Cheney’s post was triggered by the recent adoption of Facebook commenting by a number of large websites, a move that builds on the earlier integration of the “Like” button into almost every commercial, news, and entertainment site of note as part of Facebook’s “Open Graph” platform.  The trajectory here seems fairly clear.  Facebook is forging a global internet identity for you, one that it owns, of course, and with which it stands to make a fair bit of money.

Helpfully, Cheney did not frame his complaint within a denial of the basically social nature of human beings along the lines suggested by Andrew Keen not too long ago.  On the contrary, Cheney acknowledges our social impulses and is concerned that one singular online identity will not do justice to the complexity of human personality and truly social interaction.  An indiscriminate, singular identity will be an inauthentic and shallow one, inhibiting rather than promoting meaningful sociability.

“A George divided against itself cannot stand!”

The George in question is, of course, the character of George Costanza on Seinfeld.  In one of the more memorable exchanges from the remarkably memorable series, George explains what would happen if Relationship George were to come into contact with Independent George – Independent George would be no more.  We can relate to George in this situation because most of us maintain a handful of different personas that we cycle through as we navigate our way through life.  There are elements of our personality we reveal in some settings that we do not disclose in others; we present some aspects of our selves to certain people and not to others.  When for some reason these roles come into contact with one another it is possible that a little tension and confusion may ensue.  No news here.

In the early days of the Internet, when a kind of felicitous anarchy seemed to reign, it was fashionable to view the anonymity of the web as a playhouse of identity.  Individuals were able to try on and experiment with all sorts of identities — for better or for ill —  with relative safety and little worry of being found out.  It would have been unthinkable that one single and fully transparent identity would mark us across our Internet experience.

But that is exactly the trajectory we have been on for the last several years, and it increases the odds of our many worlds colliding, occasionally leading us to experience the kind of existential crisis that George’s histrionics embodied.  When our worlds collide, we too begin to sense that we might be losing our independent self, or the ability to control what people see and hear of us, control of what we might call our public identities.  We have a more difficult time calibrating our public personas to fit specific audiences and tasks.

Take for example the awkwardness and angst that arose when parents began joining Facebook and attempted to “friend” their children.  A Washington Post story on the topic from September 2008 cited protest groups formed in response with less than subtle names such as “What Happens in College Stays in College: Keep Parents Off Facebook!”  The author noted that it might seem odd that a “generation accustomed to sharing everything online” and with little or no apparent awareness of the distinction between private and public becomes apoplectic when merely two more people gain access to their already remarkably public personas.  But this misses the point.  What was at stake, of course, was control over who knew what.  The students experienced exactly what George did – their worlds collided and their anxiety reflected the increasing difficulty of controlling their public identity.

The ubiquity of one dominant social media platform makes it harder to exercise effective control over the presentation of our identities.  Mark Zuckerberg, moral philosopher that he is, rather conveniently believes,

You have one identity… The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly… Having two identities for yourself is an example of a lack of integrity.

Facebook’s near monopoly on social networking has reined in the proliferation of profiles and, in fact, studies suggest that a Facebook profile tracks fairly closely to the truth about a person.  But there is still the question of who sees that more or less truthful public approximation of our personality and how much they see.  Furthermore, should Facebook, or any social media site, be in the business of compelling people to live with integrity, particularly while profiting from the enforcement of this integrity?  More importantly, is it really integrity that is being forced upon us?  Or, to put it another way, does the maintenance of various personas necessarily entail a morally problematic lack of integrity?  Is duplicity the only reason why we would withhold some aspect of our personality in certain circumstances?

Authentic and meaningful relationships typically depend upon the natural evolution of interpersonal trust and confidence.  Demanding immediate and equal transparency across the board works against the natural progression of social interaction.  Pace Mr. Zuckerberg, there are good reasons why we don’t reveal ourselves in equal measure to everyone and in all circumstances that have nothing to do with a lack of integrity.

Information Overload and the Possibilities of Digital Asceticism

Sharon Begley’s Newsweek piece, “I Can’t Think,” tackles the problem of information overload with the help of some recent neurological studies.  Perhaps not surprisingly, “With too much information,” according to one researcher, “people’s decisions make less and less sense.”  Most of us are all too familiar with the mounting sense of indecision and even anxiety that comes with collecting more and more information about an important decision.  Here is one interesting note, however, analogous to the observations noted a couple of days ago about memory and creativity:

“If you let things come at you all the time, you can’t use additional information to make a creative leap or a wise judgment,” says Cantor. “You need to pull back from the constant influx and take a break.” That allows the brain to subconsciously integrate new information with existing knowledge and thereby make novel connections and see hidden patterns. In contrast, a constant focus on the new makes it harder for information to percolate just below conscious awareness, where it can combine in ways that spark smart decisions.

On the same topic, Nicholas Carr has offered a helpful distinction in a recent blog post.  The problem with information used to be either not having enough of it or needing good filters to find the relevant piece.  This is no longer the issue.

Situational overload is the needle-in-the-haystack problem: You need a particular piece of information – in order to answer a question of one sort or another – and that piece of information is buried in a bunch of other pieces of information. The challenge is to pinpoint the required information, to extract the needle from the haystack, and to do it as quickly as possible. Filters have always been pretty effective at solving the problem of situational overload …

Situational overload is not the problem. When we complain about information overload, what we’re usually complaining about is ambient overload. This is an altogether different beast. Ambient overload doesn’t involve needles in haystacks. It involves haystack-sized piles of needles. We experience ambient overload when we’re surrounded by so much information that is of immediate interest to us that we feel overwhelmed by the never ending pressure of trying to keep up with it all.

As is often noted, given the choice between the problems attending information scarcity and those attending information over-abundance, better to opt for the latter.  It may ultimately be difficult to argue with this point.  But problems are problems and so we feel their force and long for solutions.  Given that this is Ash Wednesday, one is tempted to suggest that perhaps what is needed are new personal practices of digital asceticism informed by our (evolving) understanding of the conditions under which the human mind and body best function and flourish.  These are some of the practices that might inform such a discipline, at least as they come to mind:

  • Intentionally aim for the temperate use of digital media
  • Allow for periods of silence
  • Seek digitally unmediated interactions with others
  • Accept that we cannot keep up with all of it
  • Acknowledge the goodness of certain limitations associated with embodiment
  • Practice separation from devices that make you anxious by their absence

More suggestions welcome.

The (Un)Naturalness of Privacy

Andrew Keen is not an Internet enthusiast, at least not since the emergence of Web 2.0.  That much has been clear since his 2007 book The Cult of the Amateur: How Today’s Internet is Killing Our Culture and his 2006 essay equating Web 2.0 with Marxism in The Weekly Standard, a publication in which such an equation is less than flattering.  More recently, Keen has taken on all things social in a Wired UK article, “Your Life Torn Open: Sharing is a Trap.”

Now, I’m all for a good critique and lampoon of the excesses of social media, really, but Keen may have allowed his disdain to get the better of him and veered into unwarranted, misanthropic excess.  From the closing paragraph:

Today’s digital social network is a trap. Today’s cult of the social, peddled by an unholy alliance of Silicon Valley entrepreneurs and communitarian idealists, is rooted in a misunderstanding of the human condition. The truth is that we aren’t naturally social beings. Instead, as Vermeer reminds us in The Woman in Blue, human happiness is really about being left alone. …. What if the digital revolution, because of its disregard for the right of individual privacy, becomes a new dark ages? And what if all that is left of individual privacy by the end of the 21st century exists in museums alongside Vermeer’s Woman in Blue? Then what?

“Human happiness is really about being left alone” and “The truth is that we aren’t naturally social beings”? Striking statements, and almost certainly wrong. It seems rather that the vast majority of human beings long for, and sometimes despair of never finding, meaningful and enduring relationships with other human beings. That human flourishing is conditioned on the right balance of the private and the social, individuality and relationship, seems closer to the mark. And while I suppose one could be raised by wolves in legends and stories, I’d like to know how infants would survive biologically in isolation from other human beings.  On this count, better stick with Aristotle.  The family, the clan, the tribe, the city — these are closer to the ‘natural’ units of human existence.

The most ironic aspect of these claims is Keen’s use of Vermeer’s “Woman in Blue” or, more precisely, “Woman in Blue Reading a Letter,” to illustrate them.  That she is reading a letter is germane to the point at issue here, which is the naturalness of privacy.  Contrary to Keen’s assertion of the natural primacy of privacy, it is closer to the truth to correlate privacy with literacy, particularly silent reading (which has not always been the norm), and the advent of printing.  Changing socio-economic conditions also factor into the rise of modern notions of privacy and the individual, notions later formalized by Locke and Hobbes, who enshrined the atomized individual as the foundation of society, notably, on founding myths that are entirely ahistorical.

In The Vineyard of the Text, Ivan Illich, citing George Steiner, suggests this mutual complicity of reading and privacy:

According to Steiner, to belong to ‘the age of the book’ meant to own the means of reading.  The book was a domestic object; it was accessible at will for re-reading.  The age presupposed private space and the recognition of the right to periods of silence, as well as the existence of echo-chambers such as journals, academies, or coffee circles.

Likewise, Walter Ong, drawing on Eric Havelock, explains that

By separating the knower from the known, writing makes possible increasingly articulate introspectivity, opening the psyche as never before not only to the external objective world quite distinct from itself but also to the interior self against whom the objective world is set.

Privacy emerges from the dynamics of literacy.  The more widespread literacy becomes, as for example with the printing press, the more widespread and normalized the modern sense of privacy becomes.  What Keen is bemoaning is the collapse of the experience of privacy wrought by print culture.  I do think there is something to mourn there, but to speak of its “naturalness” misconstrues the situation and seems to beget a rather sociopathic view of human nature.

Finally, it is also telling that Vermeer’s woman is reading a letter.  Letters, after all, are a social genre; letter writing is a form of social life.  To be sure, it is a very different form of social life than what the social web offers, but it is social.  And were we not social beings we would not, as Auden puts it, “count some days and long for certain letters.”  The Woman in Blue Reading a Letter reminds us that privacy is bound to literate culture and human beings are bound to one another.

_____________________________________________________________

Updates:  To reinforce that it is a balance we are after, also consider this article in yesterday’s Boston Globe, “The Power of Lonely.” Excerpt:

But an emerging body of research is suggesting that spending time alone, if done right, can be good for us — that certain tasks and thought processes are best carried out without anyone else around, and that even the most socially motivated among us should regularly be taking time to ourselves if we want to have fully developed personalities, and be capable of focus and creative thinking.

My attention has also been drawn to an upcoming documentary for which Andrew Keen was interviewed.  PressPausePlay will be premiering at South by Southwest and addresses the issues of creativity and art in the digital world.  Here’s a preview featuring Andrew Keen:

‘The Connecting Is the Thinking’: Memory and Creativity

Last summer Nicholas Carr published The Shallows: What the Internet Is Doing to Our Brains, a book-length extension of his 2008 Atlantic essay, “Is Google Making Us Stupid?”  The book received a good bit of attention and was in the ensuing weeks reviewed seemingly everywhere.  We noted a few of those reviews here and here.  Coming in fashionably late to the show, Jim Holt has written a lengthy review in the London Review of Books titled, “Smarter, Happier, More Productive.”  Perhaps a little bit of distance is helpful.

Holt’s review ends up being one of the better summaries of Carr’s book that I have read, if only because Holt details more of the argument than most reviews.  In the end, he tends to think that Carr is stretching the evidence and overstating his case on two fronts, intelligence and happiness.  On one last point, however, he is less sanguine: creativity, and its relation to memory.

Holt cites two well known writers who are optimistic about off-loading their memories to the Internet:

This raises a prospect that has exhilarated many of the digerati. Perhaps the internet can serve not merely as a supplement to memory, but as a replacement for it. ‘I’ve almost given up making an effort to remember anything,’ says Clive Thompson, a writer for Wired, ‘because I can instantly retrieve the information online.’ David Brooks, a New York Times columnist, writes: ‘I had thought that the magic of the information age was that it allowed us to know more, but then I realised the magic of the information age is that it allows us to know less. It provides us with external cognitive servants – silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.’

But as Holt notes, “The idea that machine might supplant Mnemosyne is abhorrent to Carr, and he devotes the most interesting portions of his book to combatting it.”  Why not outsource our memory?

Carr responds with a bit of rhetorical bluster. ‘The web’s connections are not our connections,’ he writes. ‘When we outsource our memory to a machine, we also outsource a very important part of our intellect and even our identity.’ Then he quotes William James, who in 1892 in a lecture on memory declared: ‘The connecting is the thinking.’ And James was onto something: the role of memory in thinking, and in creativity.

Holt goes on to supplement Carr’s discussion with an anecdote about the polymathic French mathematician Henri Poincaré.  What makes Poincaré’s case instructive is that “his breakthroughs tended to come in moments of sudden illumination.”

Poincaré had been struggling for some weeks with a deep issue in pure mathematics when he was obliged, in his capacity as mine inspector, to make a geological excursion. ‘The changes of travel made me forget my mathematical work,’ he recounted.

“Having reached Coutances, we entered an omnibus to go some place or other. At the moment I put my foot on the step the idea came to me, without anything in my former thoughts seeming to have paved the way for it, that the transformations I had used to define the Fuchsian functions were identical with those of non-Euclidean geometry. I did not verify the idea; I should not have had time, as, upon taking my seat in the omnibus, I went on with a conversation already commenced, but I felt a perfect certainty. On my return to Caen, for conscience’s sake, I verified the result at my leisure.”

How to account for the full-blown epiphany that struck Poincaré in the instant that his foot touched the step of the bus? His own conjecture was that it had arisen from unconscious activity in his memory.

This leads Holt to suggest, following Poincaré, that bursts of creativity and insight arise from the unconscious work of memory, and that this is the difference between internalized and externalized memory.  We may be able to retrieve at will whatever random piece of information we are looking for with a quick Google search, but that seems not to approach the power of the human mind to creatively and imaginatively work with its stores of memory.  Holt concludes:

It is the connection between memory and creativity, perhaps, which should make us most wary of the web. ‘As our use of the web makes it harder for us to lock information into our biological memory, we’re forced to rely more and more on the net’s capacious and easily searchable artificial memory,’ Carr observes. But conscious manipulation of externally stored information is not enough to yield the deepest of creative breakthroughs: this is what the example of Poincaré suggests. Human memory, unlike machine memory, is dynamic. Through some process we only crudely understand – Poincaré himself saw it as the collision and locking together of ideas into stable combinations – novel patterns are unconsciously detected, novel analogies discovered. And this is the process that Google, by seducing us into using it as a memory prosthesis, threatens to subvert.

And this leads me to make one additional observation.  As I’ve mentioned before, it is customary in these discussions to refer back to Plato’s Phaedrus, in which Socrates warns that writing, as an externalization of memory, will actually lead to the diminishing of human memory.  Holt mentions the passage in his review, and Carr mentions it as well.  When the dialogue is trotted out, it is usually as a “straw man” to prove that concerns about new technologies are silly and misguided.  But it seems to me that a silent equivocation slips into these discussions: the notion of memory we tend to assume is our current understanding of memory, one increasingly defined by comparison to computer memory, which is essentially storage.

It seems to me that having first identified a computer’s storage capacity as “memory,” a metaphor dependent upon the human capacity we call “memory,” we have now reversed the direction of the metaphor by understanding human “memory” in light of a computer’s storage capacity.  In other words, we’ve reduced our understanding of memory to the mere storage of information.  And now we read all discussions of memory in light of this reductive understanding.

Given this reductive view of memory, it seems silly for Socrates (and by extension, Plato) to worry about the externalization of memory; whether it is stored inside or outside, what difference does it make so long as we can access it?  And, in fact, access becomes the problem that attends all externalized memories from the book to the Internet.  But what if memory is not mere storage?  Few seem to extend their analysis to account for the metaphysical role that memory of the world of forms played within Plato’s account of the human person and true knowledge.  We may not take Plato’s metaphysics at face value, but we can’t really understand his concerns about memory without understanding their larger intellectual context.

Holt helps us to see the impoverishment of our understanding of memory from another, less metaphysically freighted, perspective.  The Poincaré anecdote in its own way also challenges the reduction of memory to mere storage, linking it with the complex workings of creativity and insight.  Others have similarly linked memory to identity, wisdom, and even, in St. Augustine’s account, our understanding of the divine.  Whether one veers into the theological or not, the reduction of memory to mere storage of data should strike us as an inadequate account of memory and its significance and cause us to rethink our readiness to offload it.

Update:  Post from Carr on the issue of memory including a relevant excerpt from The Shallows, “Killing Mnemosyne.”