Techno-Literacy, Digital Classrooms, Curiosity Killers, and More

This year’s NY Times Magazine Education Issue is out and it is devoted to a topic we’ve given a good deal of attention to here — technology in the classroom.  Below are a few of the highlights.  If you click through to read the articles in full, you may be prompted to register with the Times’ website, but registration is quick and free.

In “Achieving Techno-Literacy” by Kevin Kelly we get a brief glimpse at a family that decided to home school their child for one year before he entered high school.  Kelly notes that one of the surprises they encountered was “that the fancy technology supposedly crucial to an up-to-the-minute education was not a major factor in its success.”  There were technologies involved, of course, ranging from a homemade bow for making fire to more recent varieties.  Yet, Kelly explains that,

… the computer was only one tool of many. Technology helped us learn, but it was not the medium of learning. It was summoned when needed. Technology is strange that way. Education, at least in the K-12 range, is more about child rearing than knowledge acquisition. And since child rearing is primarily about forming character, instilling values and cultivating habits, it may be the last area to be directly augmented by technology.

A lot of good sense is packed into that paragraph.  And a lot of good sense also informs the principles for technology literacy that Kelly sought to instill in his son.

• Every new technology will bite back. The more powerful its gifts, the more powerfully it can be abused. Look for its costs.

• Technologies improve so fast you should postpone getting anything you need until the last second. Get comfortable with the fact that anything you buy is already obsolete.

• Before you can master a device, program or invention, it will be superseded; you will always be a beginner. Get good at it.

• Be suspicious of any technology that requires walls. If you can fix it, modify it or hack it yourself, that is a good sign.

• The proper response to a stupid technology is to make a better one, just as the proper response to a stupid idea is not to outlaw it but to replace it with a better idea.

• Every technology is biased by its embedded defaults: what does it assume?

• Nobody has any idea of what a new invention will really be good for. The crucial question is, what happens when everyone has one?

• The older the technology, the more likely it will continue to be useful.

• Find the minimum amount of technology that will maximize your options.

In his contribution Jaron Lanier, who is becoming something of a regular on this blog, asks “Does the Digital Classroom Enfeeble the Mind?” The most significant observations in Lanier’s piece come in the last few paragraphs.  Forgive the rather large block quote, but it would be hard to abridge further.  The italics below are mine and they emphasize some key observations. [Make that bold type, since the whole block quote is in italics!]

The deeper concern, for me, is the philosophy conveyed by a technological design. Some of the top digital designs of the moment, both in school and in the rest of life, embed the underlying message that we understand the brain and its workings. That is false. We don’t know how information is represented in the brain. We don’t know how reason is accomplished by neurons. There are some vaguely cool ideas floating around, and we might know a lot more about these things any moment now, but at this moment, we don’t.

You could spend all day reading literature about educational technology without being reminded that this frontier of ignorance lies before us. We are tempted by the demons of commercial and professional ambition to pretend we know more than we do. This hypnotic idea of omniscience could kill the magic of teaching, because of the intimacy with which we let computers guide our brains.

At school, standardized testing rules. Outside school, something similar happens. Students spend a lot of time acting as trivialized relays in giant schemes designed for the purposes of advertising and other revenue-minded manipulations. They are prompted to create databases about themselves and then trust algorithms to assemble streams of songs and movies and stories for their consumption.

We see the embedded philosophy bloom when students assemble papers as mash-ups from online snippets instead of thinking and composing on a blank piece of screen. What is wrong with this is not that students are any lazier now or learning less. (It is probably even true, I admit reluctantly, that in the presence of the ambient Internet, maybe it is not so important anymore to hold an archive of certain kinds of academic trivia in your head.)

The problem is that students could come to conceive of themselves as relays in a transpersonal digital structure. Their job is then to copy and transfer data around, to be a source of statistics, whether to be processed by tests at school or by advertising schemes elsewhere.

What is really lost when this happens is the self-invention of a human brain. If students don’t learn to think, then no amount of access to information will do them any good.

Those paragraphs offer much to think about, even beyond the realm of education, and they echo the premise of Lanier’s recent book, You Are Not a Gadget: A Manifesto.  Each of the emphasized lines could sustain a very long conversation.  There was one element of Lanier’s piece, however, that caused me to wonder if something was not missing.  According to Lanier,

To the degree that education is about the transfer of the known between generations, it can be digitized, analyzed, optimized and bottled or posted on Twitter. To the degree that education is about the self-invention of the human race, the gargantuan process of steering billions of brains into unforeseeable states and configurations in the future, it can continue only if each brain learns to invent itself. And that is beyond computation because it is beyond our comprehension. Learning at its truest is a leap into the unknown.

My question involves that first line about education as “the transfer of the known between generations.”  Only when what is known is understood as raw data can it really be considered fit for digitization and communication by computer.  There is a good deal that is passed on from one generation to another (at least ideally) that doesn’t amount to raw data.  What happens to wisdom, morality, embodied and unarticulated ways of being and doing in the world, modes of speech, rituals, judgment and more that counts as the kind of education-as-character-formation that Kelly cited in his piece?

Nonetheless, there is much to commend in Lanier’s piece, and he concludes by referring back to his father’s method of teaching math in the classroom, having students build a spaceship,

Roughly speaking, there are two ways to use computers in the classroom. You can have them measure and represent the students and the teachers, or you can have the class build a virtual spaceship. Right now the first way is ubiquitous, but the virtual spaceships are being built only by tenacious oddballs in unusual circumstances. More spaceships, please.

The Hornbook: Early classroom technology

Virginia Heffernan’s “Drill, Baby, Drill”, which revisits the benefits of drilling and rote memorization in the classroom, suggests that maybe Lanier won’t have to reluctantly admit “that in the presence of the ambient Internet, maybe it is not so important anymore to hold an archive of certain kinds of academic trivia in your head.”  According to University of Virginia psychology professor Daniel Willingham,

“You can’t be proficient at some academic tasks without having certain knowledge be automatic — ‘automatic’ meaning that you don’t have to think about it, you just know what to do with it.” For knowledge that must be automatic, like multiplication tables, “you need something like drilling,” Willingham wrote.

And lastly, “Online Curiosity Killer” by Ben Greenman tells the story of one father’s decision to put off turning to Google for immediate answers to his son’s questions in order to cultivate the kind of frustration that will generate interest:

By supplying answers to questions with such ruthless efficiency, the Internet cuts off the supply of an even more valuable commodity: productive frustration. Education, at least as I remember it, isn’t only, or even primarily, about creating children who are proficient with information. It’s about filling them with questions that ripen, via deferral, into genuine interests.

There is much else on offer in the Times special issue including the cover piece on video games and learning, an interactive timeline on the history of technology in the classroom, and an interview with Secretary of Education Arne Duncan.

Across the political and cultural spectrum, we recognize the significance of education.  Thinking carefully about the role of technology in education is unavoidable.  We’ve got a lot of thinking to do.

_________

Related post:  “Questionable Classrooms”

It’s Not a Game, It’s an Experience — Mark Cuban Channels Don Draper

In the first season finale of the AMC series Mad Men, Don Draper famously pitches an ad campaign for Kodak’s new slide projector in which he suggests, with appropriately melodramatic music in the background,

This is not a spaceship, it’s a time machine … It goes backwards and forwards, and it takes us to a place where we ache to go again … It’s not called ‘The Wheel.’ It’s called ‘The Carousel.’ It lets us travel around and around and back home again.

The scene was effectively parodied by SNL shortly thereafter.  You can see the scene below and watch the parody at Hulu.

Over the top perhaps, but it did convey Madison Avenue’s awareness that selling a product involves connecting with something deeper than utility or effectiveness.  Think what you will of the Dallas Mavericks’ sometimes controversial owner Mark Cuban, he has perceptively argued, along similar lines, that those in the professional sports business are not selling games, they are selling experiences.  In a recent blog post he writes,

We in the sports business don’t sell the game, we sell unique, emotional experiences. We are not in the business of selling basketball. We are in the business of selling fun and unique experiences. I say it to our people at the Mavs at all time, I want a Mavs game to be more like a great wedding than anything else.

Ultimately, his post ends up being about technologies that insert themselves into the experience and thus detract from that experience.

… I hate the trend of handheld video at games. I can’t think of a bigger mistake.  The last thing I want is someone looking down at their phone to see a replay.

This is not unlike Jaron Lanier’s wondering whether we are really there at all when we tweet, blog, or update our status throughout an event or gathering.  Cuban and Lanier, in their own ways, are both arguing for our full presence in our own experience.  Cuban concludes his post with the following observation:

The fan experience is about looking up, not looking down. If you let them look down, they might as well stay at home, the screen is always going to be better there.

This is good advice for life in general.  Look up from the screen.  Who knows, in looking one another in the eyes again, we might begin to recover the habits of respect and civility that are now so sorely missed.

Technological Momentum and Education

“There is absolutely no inevitability as long as there is a willingness to contemplate what is happening.”  — Marshall McLuhan, The Medium is the Massage

Conversations about technology and education, in my experience, eventually invoke certain vague notions of inevitability. There is often talk about getting on the train before it leaves the station and all of that.  Perhaps it is the case that notions of inevitability will surface in most discussions about technology whether or not education is involved — the specter of technological determinism casts a long shadow. I am not a technological determinist. Nevertheless, I do believe technology influences us in significant ways. How do we describe this condition of being influenced, but not determined?

The concept of technological momentum employed by historian Thomas Hughes provides a helpful way of thinking about this question.  In Technology Matters: Questions to Live With, David Nye explains Hughes’s concept and offers some examples.

Hughes argues that technical systems are not infinitely malleable.  If technologies such as the bicycle or the automobile are not independent forces shaping history, they can still exercise a “soft determinism” once they are in place …

“Technological momentum” is not inherent in any technological system when first deployed. It arises as a consequence of early development and successful entrepreneurship, and it emerges at the culmination of a period of growth. The bicycle had such momentum in Denmark and the Netherlands from 1920 until the 1960s, with the result that a system of paved trails and cycling lanes were embedded in the infrastructure before the automobile achieved momentum. In the United States, the automobile became the center of a socio-technical system more quickly and achieved momentum a generation earlier. Only some systems achieve “technological momentum” …. The concept seems particularly useful for understanding large systems. These have some flexibility when being defined in their initial phases. But as technical specifications are established and widely adopted, and as a system comes to employ a bureaucracy and thousands of workers, it becomes less responsive to outside pressures …

Hughes makes clear when discussing “inertia” that the concept is not only technical but also cultural and institutional. A society may choose to adopt either direct current or alternating current, or to use 110 volts, or 220 volts, or some other voltage, but a generation after these choices have been made it is costly and difficult to undo such a decision. Hundreds of appliance makers, thousands of electricians, and millions of homeowners have made a financial commitment to these technical standards. Furthermore, people become accustomed to particular standards and soon begin to regard them as natural. Once built, an electrical grid is “less shaped by and more the shaper of its environment.” This may sound deterministic, but it is not entirely so, for people decided to build the grid and selected its specifications and components. To later generations, however, such technical systems seem to be deterministic.

Coming back to the more specific topic of technology in education in light of Nye’s observations, I want to suggest that teachers and administrators think carefully about the implementation of technology, particularly in its early stages.  There is no inevitability.  We have choices to make.  Those choices may lead to the adoption of certain technologies and corresponding practices, and later the institutionalization of those technologies and practices may eventually make it very hard to discard them.  This kind of inertia is what retrospectively makes the adoption and implementation of certain technologies appear inevitable.  But at the outset, there were choices to be made.

It is probably the case that in some circumstances the choice is not really a choice at all.  For example, in certain industries one may either have to constantly adopt and adapt or else lose business and fail.  Exercise of choice may also lead to marginalization — witness the Amish.  Choices come with consequences and costs.  I grant that those costs may sometimes amount to coercive pressure.

Perhaps education is one of these industries (calling it such is already to prejudice the matter) in which this sort of coercive pressure exists.  One hopes, however, that better aims and ideals are steering the ship. Teachers and administrators need to be clear about their philosophy of education, and they need to allow their vision for education to drive their choices about the adoption and implementation of new technology. If they are not self-conscious and intentional in this respect, and if they view technology merely as a neutral set of tools at their disposal, they will be disappointed and frustrated.

As media theorists have noted, the ecological metaphor can be a helpful way of thinking about and understanding our technologies. Once a new element is introduced into an ecosystem, we don’t get the same ecosystem plus a new element; we get a new ecosystem. The consequences may be benign, or they could be destructive. Think of the classroom as an ecosystem; the introduction of new technologies reconstitutes the classroom’s media ecosystem. Consequently, the adoption and implementation of new classroom technologies should be guided by clear thinking about how new technologies alter the learning environment and a sober estimation of their compatibility with a school’s philosophy of education.


“Are you really there?” — How not to become spectators of our lives

If you try to keep up with the ongoing debate regarding the Internet and the way it is shaping our world and our minds, you will inevitably come across the work of Jaron Lanier.  When you do, stop and take note.  Lanier qualifies as an Internet pessimist in Adam Thierer’s breakdown of The Great Debate over Technology’s Impact on Society, but he is an insightful pessimist with a long history in the tech industry.  Unlike other, often insightful, critics such as the late Neil Postman and Nicholas Carr, Lanier speaks with an insider’s perspective.  We noted his most recent book, You Are Not a Gadget: A Manifesto, not long ago.

Earlier this week, I ran across a short piece Lanier contributed to The Chronicle of Higher Education in response to the question, “What will be the defining idea of the coming decade, and why?” Lanier’s response, cheerfully titled “The End of Human Specialness,” was one of a number of responses solicited by The Chronicle from leading scholars and illustrators.  In his piece, Lanier recalls addressing the “common practice of students blogging, networking, or tweeting while listening to a speaker” and telling his audience at the time,

The most important reason to stop multitasking so much isn’t to make me feel respected, but to make you exist. If you listen first, and write later, then whatever you write will have had time to filter through your brain, and you’ll be in what you say. This is what makes you exist. If you are only a reflector of information, are you really there?

We have all experienced it; we know exactly what Lanier is talking about.  We’ve seen it happen, we’ve had it happen to us, and — let’s be honest — we have probably also been the offending party.  Typically this topic elicits a rant against the incivility and lack of respect such actions communicate to those who are on the receiving end, and that is not unjustified.  What struck me about Lanier’s framing of the issue, however, was the emphasis on the person engaged in the habitual multitasking and not on the affront to the one whose presence is being ignored.

We are virtually dispersed people.  Our bodies are in one place, but our attention is in a dozen other places and, thus, nowhere at all.  This is not entirely new; there are antecedents.  Long before smart phones enabled a steady flow of distraction and allowed us to carry on multiple interactions simultaneously, we wandered away into the daydreams our imagination conjured up for us.  My sense, however, is that such retreats into our consciousness are a different sort of thing than our media-enabled evacuations of the place and moment we inhabit.  For one thing, they were not nearly so frequent and intrusive. We might also argue that when we daydream our attention is in fact quite focused in one place, the place of our dream.  We are somewhere rather than nowhere.

Whatever we think of the antecedents, however, it is clear that many of us are finding it increasingly difficult to be fully present in our own experience.  Perhaps part of what is going on is captured by the old adage about the man with a hammer to whom everything looks like a nail.  My most vivid experience with this dynamic came years ago with my first digital camera.  To the person with a digital camera (and enough memory), I discovered, everything looks like a picture and you can’t help but take it.  I have wonderful pictures of Italy, but very few memories.  And so we may extrapolate:  to the person with a Twitter account, everything is a tweet waiting to be condensed into 140 characters.  To the person with a video recorder on their phone, everything is a moment to be documented.  To the person with an iPhone … well, pick the App.

In an article written by Professor Barry Mauer, I recently learned about Andy Warhol’s obsessive documentation of his own experience through photographs, audiotape, videotape, and film.  In his biography of Warhol, Victor Bockris writes,

Indeed, Andy’s desire to record everything around him had become a mania.  As John Perrault, the art critic, wrote in a profile of Warhol in Vogue:  “His portable tape recorder, housed in a black briefcase, is his latest self-protection device.  The microphone is pointed at anyone who approaches, turning the situation into a theater work.  He records hours of tape every day but just files the reels away and never listens to them.”

Warhol’s behavior would, I suspect, seem less problematic today.  Here too he was perhaps simply ahead of his time.  Given much more efficient tools, we are also obsessively documenting our lives.  But what most people do tends to be viewed as normal.  It is interesting, though, that Perrault referred to Warhol’s tape recorder as a “self-protection device.”  It called to mind R. R. Reno’s analysis of the pose of ironic detachment so characteristic of our society:

We enjoy an irony that does not seek resolution because it supports our desire to be invulnerable observers rather than participants at risk. We are spectators of our lives, free from the strain of drama and the uncertainty of a story in which our souls are at stake.

“Spectators of our lives.”  The phrase is arresting, and the prospect is unsettling.  But it is hardly necessary or inevitable.  If the cost of re-engaging our own lives, of becoming participants at risk in the unfolding drama of our own story is a few less photos that we may end up deleting anyway, one less Facebook update from our phone, or one text left unread for a short while, then that is a price well worth paying.  We will be better for it, and those others, in whose presence we daily live, will be as well.

Understanding as a Mode of Resistance

Most everyone knows by now that it was the late Marshall McLuhan who told us that “the medium is the message” and who also first alerted us to the emergence of the “global village.”  He is widely recognized as a communication and media theorist of abiding significance and among the most astute observers of our technological age.  Not surprisingly, in its 1993 debut issue, Wired magazine adopted McLuhan as its patron saint.

Depending on how familiar one is with McLuhan, however, the following exchange from an interview he gave in 1966 may be a bit surprising:

Fulford:  What kind of a world would you rather live in?  Is there a period in the past or a possible period in the future you’d rather be in?

McLuhan:  No, I’d rather be in any period at all as long as people are going to leave it alone for a while.

Fulford:  But they’re not going to, are they?

McLuhan:  No, and so the only alternative is to understand everything that is going on, and then neutralize it as much as possible, turn off as many buttons as you can, and frustrate them as much as you can.  I am resolutely opposed to all innovation, all change, but I am determined to understand what’s happening because I don’t choose just to sit and let the juggernaut roll over me.  Many people seem to think that if you talk about something recent, you’re in favor of it.  The exact opposite is true in my case.  Anything I talk about is almost certainly to be something I’m resolutely against, and it seems to me the best way of opposing it is to understand it, and then you know where to turn off the button.

(Understanding Me:  Lectures and Interviews, 101-102)