The (Un)Naturalness of Privacy

Andrew Keen is not an Internet enthusiast, at least not since the emergence of Web 2.0.  That much has been clear since his 2007 book The Cult of the Amateur: How Today’s Internet Is Killing Our Culture and his 2006 essay equating Web 2.0 with Marxism in The Weekly Standard, a publication in which such an equation is less than flattering.  More recently, Keen has taken on all things social in a Wired UK article, “Your Life Torn Open: Sharing is a Trap.”

Now, I’m all for a good critique and lampoon of the excesses of social media, really, but Keen may have allowed his disdain to get the better of him and veered into unwarranted, misanthropic excess.  From the closing paragraph:

Today’s digital social network is a trap. Today’s cult of the social, peddled by an unholy alliance of Silicon Valley entrepreneurs and communitarian idealists, is rooted in a misunderstanding of the human condition. The truth is that we aren’t naturally social beings. Instead, as Vermeer reminds us in The Woman in Blue, human happiness is really about being left alone. …. What if the digital revolution, because of its disregard for the right of individual privacy, becomes a new dark ages? And what if all that is left of individual privacy by the end of the 21st century exists in museums alongside Vermeer’s Woman in Blue? Then what?

“Human happiness is really about being left alone” and “The truth is that we aren’t naturally social beings”? Striking statements, and almost certainly wrong. It seems rather that the vast majority of human beings long for, and sometimes despair of never finding, meaningful and enduring relationships with other human beings. That human flourishing depends on the right balance of the private and the social, of individuality and relationship, seems closer to the mark. And while I suppose one could, as in the legends and stories, be raised by wolves, I’d like to know how an infant would survive biologically in isolation from other human beings.  On this count, better stick with Aristotle.  The family, the clan, the tribe, the city — these are closer to the ‘natural’ units of human existence.

The most ironic aspect of these claims is Keen’s use of Vermeer’s “Woman in Blue,” or, more precisely, “Woman in Blue Reading a Letter,” to illustrate them.  That she is reading a letter is germane to the point at issue here, which is the naturalness of privacy.  Contrary to Keen’s assertion of the natural primacy of privacy, it is closer to the truth to correlate privacy with literacy, particularly silent reading (which has not always been the norm), and the advent of printing.  Changing socio-economic conditions also factor into the rise of modern notions of privacy and the individual, notions later formalized by Hobbes and Locke, who enshrine the atomized individual as the foundation of society, notably by way of founding myths that are entirely ahistorical.

In The Vineyard of the Text, Ivan Illich, citing George Steiner, suggests this mutual complicity of reading and privacy:

According to Steiner, to belong to ‘the age of the book’ meant to own the means of reading.  The book was a domestic object; it was accessible at will for re-reading.  The age presupposed private space and the recognition of the right to periods of silence, as well as the existence of echo-chambers such as journals, academies, or coffee circles.

Likewise, Walter Ong, drawing on Eric Havelock, explains that

By separating the knower from the known, writing makes possible increasingly articulate introspectivity, opening the psyche as never before not only to the external objective world quite distinct from itself but also to the interior self against whom the objective world is set.

Privacy emerges from the dynamics of literacy.  The more widespread literacy becomes, as for example with the printing press, the more widespread and normalized the modern sense of privacy becomes.  What Keen is bemoaning is the collapse of the experience of privacy wrought by print culture.  I do think there is something to mourn there, but to speak of its “naturalness” misconstrues the situation and seems to beget a rather sociopathic view of human nature.

Finally, it is also telling that Vermeer’s woman is reading a letter.  Letters, after all, are a social genre; letter writing is a form of social life.  To be sure, it is a very different form of social life than what the social web offers, but it is social nonetheless.  And were we not social beings we would not, as Auden puts it, “count some days and long for certain letters.” The Woman in Blue Reading a Letter reminds us that privacy is bound to literate culture and that human beings are bound to one another.

_____________________________________________________________

Updates:  To reinforce that it is a balance we are after, also consider this article in yesterday’s Boston Globe, “The Power of Lonely.” Excerpt:

But an emerging body of research is suggesting that spending time alone, if done right, can be good for us — that certain tasks and thought processes are best carried out without anyone else around, and that even the most socially motivated among us should regularly be taking time to ourselves if we want to have fully developed personalities, and be capable of focus and creative thinking.

My attention has also been drawn to an upcoming documentary for which Andrew Keen was interviewed.  PressPausePlay, which will premiere at South by Southwest, addresses the issues of creativity and art in the digital world.  Here’s a preview featuring Andrew Keen:

‘The Connecting Is the Thinking’: Memory and Creativity

Last summer Nicholas Carr published The Shallows: What the Internet Is Doing to Our Brains, a book-length extension of his 2008 Atlantic essay, “Is Google Making Us Stupid?” The book received a good bit of attention and was reviewed, in the ensuing weeks, seemingly everywhere.  We noted a few of those reviews here and here.  Coming fashionably late to the show, Jim Holt has written a lengthy review in the London Review of Books titled “Smarter, Happier, More Productive.” Perhaps a little bit of distance is helpful.

Holt’s review ends up being one of the better summaries of Carr’s book that I have read, if only because Holt details more of the argument than most reviewers do.  In the end, he thinks that Carr is stretching the evidence and overstating his case on two fronts: intelligence and happiness.  On one last point, however, creativity, and specifically its relation to memory, Holt is less sanguine.

Holt cites two well-known writers who are optimistic about offloading their memories to the Internet:

This raises a prospect that has exhilarated many of the digerati. Perhaps the internet can serve not merely as a supplement to memory, but as a replacement for it. ‘I’ve almost given up making an effort to remember anything,’ says Clive Thompson, a writer for Wired, ‘because I can instantly retrieve the information online.’ David Brooks, a New York Times columnist, writes: ‘I had thought that the magic of the information age was that it allowed us to know more, but then I realised the magic of the information age is that it allows us to know less. It provides us with external cognitive servants – silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.’

But as Holt notes, “The idea that machine might supplant Mnemosyne is abhorrent to Carr, and he devotes the most interesting portions of his book to combatting it.”  Why not outsource our memory?

Carr responds with a bit of rhetorical bluster. ‘The web’s connections are not our connections,’ he writes. ‘When we outsource our memory to a machine, we also outsource a very important part of our intellect and even our identity.’ Then he quotes William James, who in 1892 in a lecture on memory declared: ‘The connecting is the thinking.’ And James was onto something: the role of memory in thinking, and in creativity.

Holt goes on to supplement Carr’s discussion with an anecdote about the polymathic French mathematician Henri Poincaré.  What makes Poincaré’s case instructive is that “his breakthroughs tended to come in moments of sudden illumination.”

Poincaré had been struggling for some weeks with a deep issue in pure mathematics when he was obliged, in his capacity as mine inspector, to make a geological excursion. ‘The changes of travel made me forget my mathematical work,’ he recounted.

“Having reached Coutances, we entered an omnibus to go some place or other. At the moment I put my foot on the step the idea came to me, without anything in my former thoughts seeming to have paved the way for it, that the transformations I had used to define the Fuchsian functions were identical with those of non-Euclidean geometry. I did not verify the idea; I should not have had time, as, upon taking my seat in the omnibus, I went on with a conversation already commenced, but I felt a perfect certainty. On my return to Caen, for conscience’s sake, I verified the result at my leisure.”

How to account for the full-blown epiphany that struck Poincaré in the instant that his foot touched the step of the bus? His own conjecture was that it had arisen from unconscious activity in his memory.

This leads Holt to suggest, following Poincaré, that bursts of creativity and insight arise from the unconscious work of memory, and that this is the difference between internalized and externalized memory.  We may be able to retrieve at will whatever random piece of information we are looking for with a quick Google search, but that seems not to approach the power of the human mind to work creatively and imaginatively with its own stores of memory.  Holt concludes:

It is the connection between memory and creativity, perhaps, which should make us most wary of the web. ‘As our use of the web makes it harder for us to lock information into our biological memory, we’re forced to rely more and more on the net’s capacious and easily searchable artificial memory,’ Carr observes. But conscious manipulation of externally stored information is not enough to yield the deepest of creative breakthroughs: this is what the example of Poincaré suggests. Human memory, unlike machine memory, is dynamic. Through some process we only crudely understand – Poincaré himself saw it as the collision and locking together of ideas into stable combinations – novel patterns are unconsciously detected, novel analogies discovered. And this is the process that Google, by seducing us into using it as a memory prosthesis, threatens to subvert.

And this leads me to make one additional observation.  As I’ve mentioned before, it is customary in these discussions to refer back to Plato’s Phaedrus, in which Socrates warns that writing, as an externalization of memory, will actually lead to the diminishing of human memory.  Holt mentions the passage in his review, and Carr mentions it as well.  When the dialogue is trotted out, it is usually as a “straw man” to prove that concerns about new technologies are silly and misguided.  But it seems to me that a silent equivocation slips into these discussions: the notion of memory we tend to assume is our current understanding of memory, one increasingly defined by comparison to computer memory, which is essentially storage.

It seems to me that, having first identified a computer’s storage capacity as “memory,” a metaphor dependent upon the human capacity we call “memory,” we have now reversed the direction of the metaphor by understanding human “memory” in light of a computer’s storage capacity.  In other words, we’ve reduced our understanding of memory to the mere storage of information.  And now we read all discussions of memory in light of this reductive understanding.

Given this reductive view of memory, it seems silly for Socrates (and, by extension, Plato) to worry about the externalization of memory: whether it is stored inside or outside, what difference does it make as long as we can access it?  And, in fact, access becomes the problem that attends all externalized memory, from the book to the Internet.  But what if memory is not mere storage?  Few seem to extend their analysis to account for the metaphysical role that memory of the world of forms played within Plato’s account of the human person and true knowledge.  We may not take Plato’s metaphysics at face value, but we can’t really understand his concerns about memory without understanding their larger intellectual context.

Holt helps us to see the impoverishment of our understanding of memory from another, less metaphysically freighted, perspective.  The Poincaré anecdote in its own way also challenges the reduction of memory to mere storage, linking it instead with the complex workings of creativity and insight.  Others have similarly linked memory to identity, wisdom, and even, in St. Augustine’s account, our understanding of the divine.  Whether one veers into the theological or not, the reduction of memory to the mere storage of data should strike us as an inadequate account of memory and its significance, and it should cause us to rethink our readiness to offload it.

Update:  A post from Carr on the issue of memory, including a relevant excerpt from The Shallows, “Killing Mnemosyne.”

Information, A History

This is, as they say, the Information Age.  A few days ago I posted a link to a graphic illustrating the exponential growth in data storage capacity, and before that I referenced the figures frequently cited to demonstrate the explosion of data and its trajectory.  But it may be fair to ask what we mean by “information” or “data” anyway.  James Gleick’s new book, The Information: A History, a Theory, a Flood, offers a historical frame of reference that helps us situate our present ideas about what exactly information is and what can be done with it.

Two recent reviews provide a helpful summary of the main themes.  Nicholas Carr’s review, “Drowning in Beeps,” is at The Daily Beast, and Freeman Dyson’s lengthier review, “How We Know,” is in The New York Review of Books.

At the NYRB blog, Gleick also wrote a short, engaging post, “The Information Palace,” examining the history of information via the entry for “information” in the Oxford English Dictionary.  Here’s an excerpt:

It’s in the nineteenth century that we start to glimpse the modern sense of the word as a big category, a general thing encompassing news and facts, messages and announcements. The real turning point comes in 1948 when the Bell Labs mathematician and engineer Claude Shannon, in his landmark paper (later a book) “A Mathematical Theory of Communication,” made information, as the OED explains, “a mathematically defined quantity divorced from any concept of news or meaning …” We measure it in bits. We recognize it in words, sounds, images; we store it on the printed page and on polycarbonate discs engraved by lasers and in our genes. We are aware, more or less, that it defines our world.

“Darkness Gathers Around the Book”

“I read and I daydream …. My reading is thus a sort of impertinent absence.  Is reading an exercise in ubiquity?”  An initial, indeed initiatory, experience:  to read is to be elsewhere, where they are not, in another world; it is to constitute a secret scene, a place one can enter and leave when one wishes; to create dark corners into which no one can see within an existence subjected to technocratic transparency and that implacable light that, in Genet’s work, materializes the hell of social alienation.  Marguerite Duras has noted:  “Perhaps one always reads in the dark …. Reading depends on the obscurity of the night.  Even if one reads in broad daylight, outside, darkness gathers around the book.”

— From Michel de Certeau’s The Practice of Everyday Life, Chapter 12, “Reading as Poaching” (173).  The initial quote is from Guy Rosolato’s Essais sur le symbolique.

“It’s like, you know … the end of print-disciplined speech?”

In “What Happens in Vagueness Stays in Vagueness,” Clark Whelton takes aim at what he calls “the linguistic virus that infected spoken language in the late twentieth century” — vagueness.  Here’s the opening example:

I recently watched a television program in which a woman described a baby squirrel that she had found in her yard. “And he was like, you know, ‘Helloooo, what are you looking at?’ and stuff, and I’m like, you know, ‘Can I, like, pick you up?,’ and he goes, like, ‘Brrrp brrrp brrrp,’ and I’m like, you know, ‘Whoa, that is so wow!’ ” She rambled on, speaking in self-quotations, sound effects, and other vocabulary substitutes, punctuating her sentences with facial tics and lateral eye shifts. All the while, however, she never said anything specific about her encounter with the squirrel.

In the mid-1980s, Mr. Whelton began noticing increasingly aberrant speech patterns among prospective interns for New York City mayor Edward Koch’s speechwriting staff.  “Like,” “you know,” and “like, you know,” along with non-committal interrogative tones, particularly distressed Whelton.  He goes on to add,

Undergraduates … seemed to be shifting the burden of communication from speaker to listener. Ambiguity, evasion, and body language, such as air quotes—using fingers as quotation marks to indicate clichés—were transforming college English into a coded sign language in which speakers worked hard to avoid saying anything definite.

Whelton comes closest to the true nature of the situation here, but I think there is an important consideration that is missing.  I’m inclined to think that the sorts of language patterns Whelton criticizes reflect a reversion to language environments that are more oral in nature than they are literate (a situation that Walter Ong called secondary orality).

The cadences and syntax of “high,” “correct,” “proper,” etc. English are a product of writing in general and intensified by print; they are not a necessary function of spoken language itself, which is ordinarily much more chaotic.  Writing is removed from the holistic context that helps give face-to-face communication its meaning.  To compensate, writing must work hard to achieve clarity and precision, since the words themselves bear the burden of conveying the whole of the meaning.  Oral communication can tolerate vagueness in words and syntax because it can rely on intonation, volume, inflection, and other non-verbal cues to supply meaning.  As an experiment, try transcribing any one of your countless verbal exchanges and note the sometimes startling difference between spoken language and written language.

Where print monopolizes communication, the patterns of written language begin to discipline spoken language.  “Vague” talk, then, may be characteristic of those whose speech patterns have not been so disciplined by print literacy because they were formed in a world in which print’s monopoly has been broken.

Interestingly, new media are often quite “print-ish,” that is, text isolated from sound: email, text messaging, Twitter, blogs, Facebook (though images appear there as well).  This has required the invention of a system of signs aimed at taming the inherent “vagueness” of written communication that is restricted in length and thus not given the freedom to compensate for the loss of non-verbal and auditory cues with precise syntax and copious language.  : )