“The storm is what we call progress”

Via Alan Jacobs at Text Patterns, I read the following excerpt from Arikia Millikan’s short piece “I Am a Cyborg and I Want My Google Implant Already” on The Atlantic’s web site:

By the time I finished elementary school, writing letters to communicate across great distances was an archaic practice. When I graduated middle school, pirating music on Napster was the norm; to purchase was a fool’s errand. At the beginning of high school, it still may have been standard practice to manually look up the answer to a burning question (or simply be content without knowing the answer). Internet connection speeds and search algorithms improved steadily over the next four years such that when I graduated in the class of 2004, having to wait longer than a minute to retrieve an answer was an unbearable annoyance and only happened on road trips or nature walks. The summer before my freshman year of college was the year the Facebook was released to a select 15 universities, and almost every single relationship formed in the subsequent four years was prefaced by a flood of intimate personal information.

Now, I am always connected to the Web. The rare exceptions to the rule cause excruciating anxiety. I work online. I play online. I have sex online. I sleep with my smartphone at the foot of my bed and wake up every few hours to check my email in my sleep (something I like to call dreamailing).

But it’s not enough connectivity. I crave an existence where batteries never die, wireless connections never fail, and the time between asking a question and having the answer is approximately zero. If I could be jacked in at every waking hour of the day, I would, and I think a lot of my peers would do the same. So Hal, please hurry up with that Google implant. We’re getting antsy.

Well, it’s hard to beat honesty, I suppose.  I did find it slightly ironic that the Google executive interviewed for the piece is named Hal.

Jacobs aptly titled his post “The saddest thing I have read in some time,” and he added simply, “There’s a name for this condition: Stockholm Syndrome.”  Well put, of course.

Perhaps it was reading that piece that prepared me to read the ninth of Walter Benjamin’s “Theses on the Philosophy of History” later that day with a certain melancholy resonance:

A Klee painting named “Angelus Novus” shows an angel looking as though he is about to move away from something he is fixedly contemplating.  His eyes are staring, his mouth is open, his wings are spread.  This is how one pictures the angel of history.  His face is turned toward the past.  Where we perceive a chain of events, he sees one single catastrophe  which keeps piling wreckage upon wreckage and hurls it in front of his feet.  The angel would like to stay, awaken the dead, and make whole what has been smashed.  But a storm is blowing from Paradise; it has got caught in his wings with such violence that the angel can no longer close them.  This storm irresistibly propels him into the future to which his back is turned, while the pile of debris before him grows skyward.  The storm is what we call progress.

In any case, I tend to agree with Jacobs — it was rather sad.

When Words and Action Part Company

I’ve not been one to jump on the Malcolm Gladwell bandwagon; I can’t quite get past the disconcerting hair.  That said, his recent piece in The New Yorker, “Small Change:  Why the revolution will not be tweeted,” makes a compelling case for the limits of social media when it comes to generating social action.

Gladwell frames his piece as a study in contrasts.  He begins by recounting the evolution of the 1960 sit-in movement that began when four freshmen from North Carolina A & T sat down and ordered coffee at the lunch counter of the local Woolworth’s and refused to move when the waitress insisted, “We don’t serve Negroes here.”  Within days the protest grew, spreading across state lines as tensions mounted.

Some seventy thousand students eventually took part. Thousands were arrested and untold thousands more radicalized. These events in the early sixties became a civil-rights war that engulfed the South for the rest of the decade—and it happened without e-mail, texting, Facebook, or Twitter.

Almost reflexively now, the devotees of social media power will trot out the Twitter-enabled 2009 Iranian protests as an example of what social media can do.  Gladwell, anticipating as much, quotes Mark Pfeifle, a former national-security adviser, who believes that “Without Twitter the people of Iran would not have felt empowered and confident to stand up for freedom and democracy.”  Pfeifle went so far as to call for Twitter’s nomination for the Nobel Peace Prize.  One is inclined to think that a bit of a stretch, and Gladwell explains why:

In the Iranian case … the people tweeting about the demonstrations were almost all in the West. “It is time to get Twitter’s role in the events in Iran right,” Golnaz Esfandiari wrote, this past summer, in Foreign Policy. “Simply put: There was no Twitter Revolution inside Iran.” The cadre of prominent bloggers, like Andrew Sullivan, who championed the role of social media in Iran, Esfandiari continued, misunderstood the situation. “Western journalists who couldn’t reach—or didn’t bother reaching?—people on the ground in Iran simply scrolled through the English-language tweets posted with tag #iranelection,” she wrote. “Through it all, no one seemed to wonder why people trying to coordinate protests in Iran would be writing in any language other than Farsi.”

You can read Esfandiari’s Foreign Policy article, “Misreading Tehran:  The Twitter Devolution,” online.  Gladwell argues that social media are unable to promote significant and lasting social change because they foster weak rather than strong-tie relationships.  Promoting and achieving social change very often means coming up against entrenched cultural norms and standards that will not easily give way.  And as we know from the civil rights movement, the resistance is often violent.  As Gladwell reminds us,

. . . Within days of arriving in Mississippi, three [Freedom Summer Project] volunteers—Michael Schwerner, James Chaney, and Andrew Goodman—were kidnapped and killed, and, during the rest of the summer, thirty-seven black churches were set on fire and dozens of safe houses were bombed; volunteers were beaten, shot at, arrested, and trailed by pickup trucks full of armed men. A quarter of those in the program dropped out. Activism that challenges the status quo—that attacks deeply rooted problems—is not for the faint of heart.

Doug McAdam subsequently conducted a study of the Freedom Summer applicants, both those who participated and those who withdrew:

“All  of the applicants—participants and withdrawals alike—emerge as highly committed, articulate supporters of the goals and values of the summer program,” he concluded. What mattered more was an applicant’s degree of personal connection to the civil-rights movement . . . . [P]articipants were far more likely than dropouts to have close friends who were also going to Mississippi. High-risk activism, McAdam concluded, is a “strong-tie” phenomenon.

Gladwell goes on to explain why hierarchy, another feature typically absent from social media activism, is indispensable to successful movements, taking some shots along the way at Clay Shirky’s much more optimistic view of social media outlined in Here Comes Everybody: The Power of Organizing Without Organizations.

Not surprisingly, Gladwell’s piece has been making the rounds online over the past few days. In response to Gladwell, Jonah Lehrer posted “Weak Ties, Twitter and the Revolution” on his blog The Frontal Cortex.  Lehrer begins by granting, “These are all worthwhile and important points, and a necessary correction to the (over)hyping of Twitter and Facebook.”  But he believes Gladwell has erred in the other direction.  Basing his comments on Mark Granovetter’s 1973 paper, “The Strength of Weak Ties,” Lehrer concludes:

. . . I would quibble with Gladwell’s wholesale rejection of weak ties as a means of building a social movement. (I have some issues with Shirky, too.) It turns out that such distant relationships aren’t just useful for getting jobs or spreading trends or sharing information. According to Granovetter, they might also help us fight back against the Man, or at least the redevelopment agency.

Read the whole post to get the full argument and definitely read Lehrer’s excellent review of Shirky’s book linked in the quotation above.  Essentially Lehrer is offering a kind of middle ground between Shirky and Gladwell.  Since I tend toward mediating positions myself, I think he makes a valid point; but I do lean toward Gladwell’s end of the spectrum nonetheless.

Here, however, is one more angle on the issue:  perhaps the factors working against the potential of social media are not only inherent in the form itself, but also a condition of society that predates the arrival of digital media by generations.  In The Human Condition, Hannah Arendt argued that power, the kind of power to transform society that Gladwell has in view,

. . . is actualized only where word and deed have not parted company, where words are  not empty and deeds not brutal, where words are not used to veil intentions but to disclose realities, and deeds are not used to violate and destroy but to establish relations and create new realities.

Arendt made that claim in the late 1950s, and she argued that even then words and deeds had been drifting apart for some time.  I suspect that since then the chasm has yawned ever wider and that social media participate in and reinforce that disjunction.  It would be unfair, however, to single out social media, since the problem extends to most forms of public discourse, of which social media are but one example.

In The Disenchantment of Secular Discourse, Steven D. Smith argues that

It is hardly an exaggeration to say that the very point of ‘public reason’ is to keep the public discourse shallow – to keep it from drowning in the perilous depths of questions about ‘the nature of the universe,’ or ‘the end and object of life,’ or other tenets of our comprehensive doctrines.

If Smith is right — you can read Stanley Fish’s review in the NY Times to get more of a feel for his argument — social media already operate within a context in which the habits of public discourse have undermined our ability to take words seriously.  To put it another way, the assumptions shaping our public discourse encourage the divorce of words and deeds by stripping our language of its appeal to the deeper moral and metaphysical resources necessary to compel social action.  We tend to get stuck in the analysis and pseudo-debate without ever getting to action. As Fish puts it:

While secular discourse, in the form of statistical analyses, controlled experiments and rational decision-trees, can yield banks of data that can then be subdivided and refined in more ways than we can count, it cannot tell us what that data means or what to do with it . . . . Once the world is no longer assumed to be informed by some presiding meaning or spirit (associated either with a theology or an undoubted philosophical first principle) . . . there is no way, says Smith, to look at it and answer normative questions, questions like “what are we supposed to do?” and “at the behest of who or what are we to do it?”

Combine this with Kierkegaard’s 19th-century observations about the press, which now appear all the more applicable to the digital world.  Consider the following summary of Kierkegaard’s fears offered by Hubert Dreyfus in his little book On the Internet:

. . . the new massive distribution of desituated information was making every sort of information immediately available to anyone, thereby producing a desituated, detached spectator.  Thus, the new power of the press to disseminate information to everyone in a nation led its readers to transcend their local, personal involvement . . . . Kierkegaard saw that the public sphere was destined to become a detached world in which everyone had an opinion about and commented on all public matters without needing any first-hand experience and without having or wanting any responsibility.

Kierkegaard suggested the following motto for the press:

Here men are demoralized in the shortest possible time on the largest possible scale, at the cheapest possible price.

I’ll let you decide whether or not that motto may be applied even more aptly to existing media conditions.  In any case, the situation Kierkegaard believed was created by the daily print press in his own day is at least a more likely possibility today.  A globally connected communications environment geared toward creating a constant, instantaneous, and indiscriminate flow of information, together with the assumptions of public discourse described by Smith, numbs us into docile indifference — an indifference social media may be powerless to overthrow, particularly when the stakes are high.  We are offered instead the illusion of action and involvement, the sense of participation in the debate.  But there is no meaningful debate, and by next week the issue, whatever the issue is, will still be there, and we’ll be busy discussing the next thing.  Meanwhile action walks further down a lonely path, long since parted from words.

Drowning in the Shallow End

As George Lakoff and Mark Johnson pointed out in Metaphors We Live By, we do a lot of our thinking and understanding through metaphors that structure our thoughts and concepts.  So pervasive are these metaphors that in most cases we don’t even realize we are using them at all.  Recently, metaphors related to shallowness and depth have caught my attention.

Many of the fears expressed by critics of the Internet and the digital world revolve around a loss of depth.  We are, in their view, gaining an immense amount of breadth or surface area, but it is coming at the expense of depth and by extension rendering us rather shallow.  For example, consider this passage from a brief statement playwright Richard Foreman contributed to Edge:

… today, I see within us all (myself included) the replacement of complex inner density with a new kind of self-evolving under the pressure of information overload and the technology of the “instantly available”. A new self that needs to contain less and less of an inner repertory of dense cultural inheritance—as we all become “pancake people”—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.

The notion of “pancake people” is a variation on the shallow/deep metaphor — a good deal of surface area, not much depth.  I first came across Foreman’s analogy in the conclusion of Nicholas Carr’s much discussed piece in The Atlantic, “Is Google Making Us Stupid?”  Carr’s piece generated not only a lot of discussion but also a book, published this year, exploring the effects of the Internet on the brain.  The book surveys a variety of recent studies suggesting that significant Internet use inhibits our capacity for sustained attention and our ability to think deeply.  The title of Carr’s book?  The Shallows.

What is interesting about metaphors such as deep/shallow is that we do appear to have a rather intuitive sense of what they are communicating.  I suspect we all have some notion of what it means to say that someone or some idea is not very deep, or what is meant when someone says they are just skimming the surface of a topic.  But the nature of metaphors is such that they both hide and reveal.  They help us understand a concept by comparing it to some other, perhaps more familiar idea, but the two things are never identical, and so while something is illuminated, something else may be hidden.  The taken-for-granted status of some metaphors, shallowness/depth for instance, may also lull us into thinking that we understand something when we really don’t, in the same way, for example, that St. Augustine remarked that he knew what “time” was until he was asked to define it.

What exactly is it to say that an idea is shallow or deep?  Can we describe what we mean without resorting to metaphor?  It is not that I am against metaphors; one can’t really be against metaphorical language without losing language as we know it altogether.  It may be that we cannot get at some ideas at all without metaphor.  My point rather is to try to think … well, more deeply about the consequences of our digital world.  Having noticed that key criticisms frequently involve this idea of a loss of depth, it seems we had better be sure we know what is meant.  Very often discussions and debates don’t seem to get anywhere because the participants are using terms equivocally or without a precise sense of how they are being used by the other side.  A little sorting out of our terms, perhaps especially our metaphors, may go a long way toward advancing the conversation.  (Incidentally, that last phrase is also a metaphor.)

Here is one last instance of the metaphor that doesn’t arise out of the recent debates about the Internet, and yet appears to be quite applicable.  The following is taken from Hannah Arendt’s 1958 work, The Human Condition:

A life spent entirely in public, in the presence of others, becomes, as we would say, shallow.  While it retains visibility, it loses the quality of rising into sight from some darker ground which must remain hidden if it is not to lose its depth in a very real, non-subjective sense.

Arendt’s comments arise from a technical and complex discussion of what she identifies as the private, public, and social realms of human life.  And while she was rather prescient in certain areas, she could not have imagined the rise of the Internet and social media.  However, these comments seem to be very much in line with Jaron Lanier’s observation that “you have to be somebody before you can share yourself.”  In our rush to publicize our selves and our thoughts, we are losing the hidden and private space in which we cultivate depth and substance.

Although employing other metaphors to do so, Richard Foreman also offered a sense of what he understood to be the contrast to the “pancake people”:

I come from a tradition of Western culture in which the ideal (my ideal) was the complex, dense and “cathedral-like” structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West.

This is not necessarily about the recovery of some Romantic notion of the essential self, but it is about a certain degree of complexity and solidity (metaphors again, I know).  In any case, it strikes me as an ideal worth preserving.  Foreman and Carr (and perhaps Arendt, if she were around) seem uncertain that it is an ideal that can survive in the digital age.  At the very least, they are pointing to some of the challenges.  Given that the digital age is not going away, it is left to us, if we value the ideal, to think of how complexity, depth, and density can be preserved.  And the first thing we may have to do is bring some conceptual clarity to our metaphors.

Techno-Literacy, Digital Classrooms, Curiosity Killers, and More

This year’s NY Times Magazine Education Issue is out, and it is devoted to a topic we’ve given a good deal of attention to here — technology in the classroom.  Below are a few of the highlights.  If you click through to read the articles in full, you may be prompted to register with the Times’ website, but registration is quick and free.

In “Achieving Techno-Literacy” by Kevin Kelly we get a brief glimpse at a family that decided to homeschool their child for one year before he entered high school.  Kelly notes that one of the surprises they encountered was “that the fancy technology supposedly crucial to an up-to-the-minute education was not a major factor in its success.”  There were technologies involved, of course, from a homemade bow for making fire to more recent varieties.  Yet Kelly explains that,

… the computer was only one tool of many. Technology helped us learn, but it was not the medium of learning. It was summoned when needed. Technology is strange that way. Education, at least in the K-12 range, is more about child rearing than knowledge acquisition. And since child rearing is primarily about forming character, instilling values and cultivating habits, it may be the last area to be directly augmented by technology.

A lot of good sense is packed into that paragraph.  And a lot of good sense also informs the principles for technology literacy that Kelly sought to instill in his son.

• Every new technology will bite back. The more powerful its gifts, the more powerfully it can be abused. Look for its costs.

• Technologies improve so fast you should postpone getting anything you need until the last second. Get comfortable with the fact that anything you buy is already obsolete.

• Before you can master a device, program or invention, it will be superseded; you will always be a beginner. Get good at it.

• Be suspicious of any technology that requires walls. If you can fix it, modify it or hack it yourself, that is a good sign.

• The proper response to a stupid technology is to make a better one, just as the proper response to a stupid idea is not to outlaw it but to replace it with a better idea.

• Every technology is biased by its embedded defaults: what does it assume?

• Nobody has any idea of what a new invention will really be good for. The crucial question is, what happens when everyone has one?

• The older the technology, the more likely it will continue to be useful.

• Find the minimum amount of technology that will maximize your options.

In his contribution, Jaron Lanier, who is becoming something of a regular on this blog, asks “Does the Digital Classroom Enfeeble the Mind?” The most significant observations in Lanier’s piece come in the last few paragraphs.  Forgive the rather large block quote, but it would be hard to abridge further.  The italics below are mine and emphasize some key observations. [Make that bold type, since the whole block quote is in italics!]

The deeper concern, for me, is the philosophy conveyed by a technological design. Some of the top digital designs of the moment, both in school and in the rest of life, embed the underlying message that we understand the brain and its workings. That is false. We don’t know how information is represented in the brain. We don’t know how reason is accomplished by neurons. There are some vaguely cool ideas floating around, and we might know a lot more about these things any moment now, but at this moment, we don’t.

You could spend all day reading literature about educational technology without being reminded that this frontier of ignorance lies before us. We are tempted by the demons of commercial and professional ambition to pretend we know more than we do. This hypnotic idea of omniscience could kill the magic of teaching, because of the intimacy with which we let computers guide our brains.

At school, standardized testing rules. Outside school, something similar happens. Students spend a lot of time acting as trivialized relays in giant schemes designed for the purposes of advertising and other revenue-minded manipulations. They are prompted to create databases about themselves and then trust algorithms to assemble streams of songs and movies and stories for their consumption.

We see the embedded philosophy bloom when students assemble papers as mash-ups from online snippets instead of thinking and composing on a blank piece of screen. What is wrong with this is not that students are any lazier now or learning less. (It is probably even true, I admit reluctantly, that in the presence of the ambient Internet, maybe it is not so important anymore to hold an archive of certain kinds of academic trivia in your head.)

The problem is that students could come to conceive of themselves as relays in a transpersonal digital structure. Their job is then to copy and transfer data around, to be a source of statistics, whether to be processed by tests at school or by advertising schemes elsewhere.

What is really lost when this happens is the self-invention of a human brain. If students don’t learn to think, then no amount of access to information will do them any good.

There is much to think about in those paragraphs, even beyond the realm of education, and much that echoes the premise of Lanier’s recent book, You Are Not a Gadget: A Manifesto.  Each of the emphasized lines could sustain a very long conversation.  There was one element of Lanier’s piece, however, that caused me to wonder if something was not missing.  According to Lanier,

To the degree that education is about the transfer of the known between generations, it can be digitized, analyzed, optimized and bottled or posted on Twitter. To the degree that education is about the self-invention of the human race, the gargantuan process of steering billions of brains into unforeseeable states and configurations in the future, it can continue only if each brain learns to invent itself. And that is beyond computation because it is beyond our comprehension. Learning at its truest is a leap into the unknown.

My question involves that first line about education as “the transfer of the known between generations.”  Only when what is known is understood as raw data can it really be considered fit for digitization and communication by computer.  There is a good deal that is passed on from one generation to another (at least ideally) that doesn’t amount to raw data.  What happens to wisdom, morality, embodied and un-articulated ways of being and doing in the world, modes of speech, rituals, judgment, and more that counts as the kind of education-as-character-formation that Kelly cited in his piece?

Nonetheless, there is much to commend in Lanier’s piece, and he concludes by referring back to his father’s method of teaching math in the classroom by having students build a spaceship:

Roughly speaking, there are two ways to use computers in the classroom. You can have them measure and represent the students and the teachers, or you can have the class build a virtual spaceship. Right now the first way is ubiquitous, but the virtual spaceships are being built only by tenacious oddballs in unusual circumstances. More spaceships, please.

[Image: The Hornbook, an early classroom technology]

Virginia Heffernan’s “Drill, Baby, Drill”, which revisits the benefits of drilling and rote memorization in the classroom, suggests that maybe Lanier won’t have to reluctantly admit “that in the presence of the ambient Internet, maybe it is not so important anymore to hold an archive of certain kinds of academic trivia in your head.”  According to University of Virginia psychology professor Daniel Willingham,

“You can’t be proficient at some academic tasks without having certain knowledge be automatic — ‘automatic’ meaning that you don’t have to think about it, you just know what to do with it.” For knowledge that must be automatic, like multiplication tables, “you need something like drilling,” Willingham wrote.

And lastly, “Online Curiosity Killer” by Ben Greenman tells the story of one father’s decision to put off turning to Google for immediate answers to his son’s questions in order to cultivate the kind of frustration that will generate interest:

By supplying answers to questions with such ruthless efficiency, the Internet cuts off the supply of an even more valuable commodity: productive frustration. Education, at least as I remember it, isn’t only, or even primarily, about creating children who are proficient with information. It’s about filling them with questions that ripen, via deferral, into genuine interests.

There is much else on offer in the Times special issue, including the cover piece on video games and learning, an interactive timeline on the history of technology in the classroom, and an interview with Secretary of Education Arne Duncan.

Across the political and cultural spectrum, we recognize the significance of education.  Thinking carefully about the role of technology in education is unavoidable.  We’ve got a lot of thinking to do.

_________

Related post:  “Questionable Classrooms”

It’s Not a Game, It’s an Experience — Mark Cuban Channels Don Draper

In the first season finale of the AMC series Mad Men, Don Draper famously pitches an ad campaign for Kodak’s new slide projector in which he suggests, with appropriately melodramatic music in the background,

This is not a spaceship, it’s a time machine … It goes backwards and forwards, and it takes us to a place where we ache to go again … It’s not called ‘The Wheel.’ It’s called ‘The Carousel.’ It lets us travel around and around and back home again.

The scene was effectively parodied by SNL shortly thereafter.  You can see the scene below and watch the parody at Hulu.

Over the top perhaps, but it did convey Madison Avenue’s awareness that selling a product involves connecting with something deeper than utility or effectiveness.  Think what you will of the Dallas Mavericks’ sometimes controversial owner, Mark Cuban; he has perceptively argued, along similar lines, that those in the professional sports business are not selling games, they are selling experiences.  In a recent blog post he writes,

We in the sports business don’t sell the game, we sell unique, emotional experiences. We are not in the business of selling basketball. We are in the business of selling fun and unique experiences. I say it to our people at the Mavs at all time, I want a Mavs game to be more like a great wedding than anything else.

Ultimately, his post ends up being about technologies that insert themselves into the experience and thus detract from that experience.

… I hate the trend of handheld video at games. I can’t think of a bigger mistake.  The last thing I want is someone looking down at their phone to see a replay.

This is not unlike Jaron Lanier’s wondering whether we are really there at all when we tweet, blog, or update our status throughout an event or gathering.  Cuban and Lanier, in their own ways, are both arguing for our full presence in our own experience.  Cuban concludes his post with the following observation:

The fan experience is about looking up, not looking down. If you let them look down, they might as well stay at home, the screen is always going to be better there.

This is good advice for life in general.  Look up from the screen.  Who knows, in looking one another in the eyes again, we might begin to recover the habits of respect and civility that are now so sorely missed.