Five Neuroscientists Get On a Raft …

Click for video of the trip from the NY Times

No punchline.  Five neuroscientists really did get on a raft.

They rafted the San Juan River in southern Utah during a week-long camping trip into the most threatening and inhospitable situation now imaginable:  beyond the reach of wireless signals, they were without Internet and without cell phones (although a satellite phone was available for emergencies).

The trip was conceived and planned by David Strayer, professor of psychology at the University of Utah, with the goal of understanding “how heavy use of digital devices and other technologies” impacts the way we think and act.

Matt Richtel’s account of the journey, “Outdoors and Out of Reach, Studying the Brain,” is part of the NY Times’ ongoing series, Your Brain On Computers.  That the members of the expedition disagreed from the outset about the impact of the digital world on the brain makes this an engaging read and suggests that the conversation on the trip was quite lively.

Along the way they debate the false sense of urgency engendered by always-on technology, the power of nature to refresh the brain’s ability to focus, the degree to which the brain can adapt to multitasking environments, the best methods and tools to measure digital technologies’ effects on the brain, and more.

As the days pass, the conversations become more fruitful.  Or, as Richtel put it, “as the river flows, so do the ideas.”  “There’s a real mental freedom in knowing no one or nothing can interrupt you,” according to one of the neuroscientists.  Another observes, “Time is slowing down.”

Strayer has coined the term “third-day syndrome” to describe the subtle and not so subtle shifts in attitude and behavior that begin to manifest themselves after someone has been “unplugged” for three days.  The experience leads one of the scientists to wonder, “If we can find out that people are walking around fatigued and not realizing their cognitive potential … What can we do to get us back to our full potential?”

Even without knowing exactly how the trip affected their brains, the scientists are prepared to recommend a little downtime as a path to uncluttered thinking.

If you can, take your downtime in a more natural environment.  According to a University of Michigan study discussed in the article, we seem to learn better after taking a walk in the woods as opposed to a walk down a city block.

For a much more prosaic discussion of the same issues see “The Internet: Is It Changing the Way We Think?” from The Guardian.  Of the five responses, Maryanne Wolf’s seemed to me most useful.  Here is an excerpt:

For me, the essential question has become: how well will we preserve the critical capacities of the present expert reading brain as we move to the digital reading brain of the next generation? Will the youngest members of our species develop their capacities for the deepest forms of thought while reading or will they become a culture of very different readers – with some children so inured to a surfeit of information that they have neither the time nor the motivation to go beyond superficial decoding? In our rapid transition into a digital culture, we need to figure out how to provide a full repertoire of cognitive skills that can be used across every medium by our children and, indeed, by ourselves.

Prosthetic Gods

A cadre of people decked out in half space suits, half combat armor walk through a desolate, arid wilderness toward a bunker.  A door opens revealing a passageway into an abandoned underground installation.  On a platform elevator they descend hundreds of feet.  As they continue through hexagonal corridors they notice a helmet, not unlike theirs, lying ominously on the ground.  Finally, they enter a room where a solitary metallic object suspended in mid-air spins on its axis.  One man removes his armor from his right arm and extends his now bare arm into an opening in the object.  The object stops spinning.  His comrades look on with apprehension; the man pulls out his arm.  As he does so his arm morphs into a mechanical, cyborg arm.  Then, and this is the climax, from the palm of his newly mechanized arm, the Droid X emerges.

Now there’s a commercial, and if you haven’t already seen it, you can watch for yourself at the end of this post.  I first saw this commercial sitting in the theater waiting for Inception to begin, only I didn’t immediately realize it was a commercial.  Had I walked in just then I would have assumed the previews had started.  A bit over-the-top perhaps, but maybe not.

There’s a lot that can be said about this elaborate piece of sci-fi marketing, but let’s take it at face value.  It is actually a rather straightforward dramatization of an important and intriguing metaphor:  technology as prosthesis.  Marshall McLuhan, patron saint of media studies, popularized the concept that our tools or technologies function as prosthetic extensions of our bodies.  For example, the hammer functions as an extension of the hand, the wheel as an extension of the foot, and electric technology as an extension of the nervous system.  McLuhan, however, was neither the first nor the last to employ the metaphor.  In Civilization and Its Discontents, Sigmund Freud suggested that “Man has, as it were, become a kind of prosthetic god.”  But man also wore his prosthetic divinity awkwardly.  Freud goes on to say, “When he puts on all his auxiliary organs he is truly magnificent; but those organs have not grown on to him and they still give him much trouble at times.”

Technology as a prosthetic enhancement has been a rich concept deployed by a variety of philosophers and critics including Martin Heidegger, Jacques Derrida, and Donna Haraway.  In her “Cyborg Manifesto,” Haraway in particular argued that our technologies have been making the line between natural and artificial, machine and organism, cyborg and human more than a little fuzzy.  Often the idea of technology as prosthetic is paired with the related metaphor of amputation — something gained, something lost — so that on the whole there is a certain ambiguity about our prosthetic tools.  You can read more about the concept in a well-written overview here, but I want to focus on the very simple idea that our technologies become a part of us.

Think about this in light of the question that I asked in yesterday’s post, “A God that Limps.” Why do we react so defensively when we hear someone criticize our technologies?  The concept of prosthesis suggests a compelling response:  because we take it not as a criticism of some object apart from us, but rather as a criticism of an object that has become in some sense a part of us.  We hear such criticism as a criticism of ourselves.

The more seamlessly a technology blends in with our bodies, the more attached we become.  Take the Bluetooth-enabled cell phone, for example, responsible for all those people seemingly talking to themselves.  Notice how this metaphor helps explain that odd development.  The device has become transparent; we forget it is even there.  This makes the communication seem almost unmediated, consequently causing us to act as naturally as if we were in the person’s presence (and only that person’s presence).  Or take the iTouch/iPhone/iPad that allows us to magically touch the Internet; now that is an extension of the central nervous system!  Gone is the clunky mouse or keyboard; we now appear to be touching the information itself, the layers of mediation seeming to peel away.

The better these tools work, the more invisible they become; or, as the Droid X commercial suggests, the more they become a part of us.  Tweaking Arthur C. Clarke’s Third Law just a little, we might say that any sufficiently advanced technology is indistinguishable from our bodies.  Naturally, we are pretty defensive of our bodies; not surprisingly we tend to be pretty defensive of our technologies as well.

Book v. Computer

The results are in (at least tentatively).

From the Freakonomics folks at the NY Times:

More evidence that technology doesn’t always equal higher test scores: a new working paper by Jacob L. Vigdor and Helen F. Ladd examines the effects of home computer and internet access on test scores.  Consistent with the research of Ofer Malamud and Christian Pop-Eleches, Vigdor and Ladd found that “the introduction of home computer technology is associated with modest but statistically significant and persistent negative impacts on student math and reading test scores.”

Meanwhile from a study reported on by The Chronicle of Higher Education:

What’s surprising … is just how strong the correlation is between a child’s academic achievement and the number of books his or her parents own. It’s even more important than whether the parents went to college or hold white-collar jobs. Books matter. A lot.  The study was conducted over 20 years, in 27 countries, and surveyed more than 70,000 people. Researchers found that children who grew up in a home with more than 500 books spent 3 years longer in school than children whose parents had only a few books. Also, a child whose parents have lots of books is nearly 20-percent more likely to finish college. For comparison purposes, the children of educated parents (defined as people with at least 15 years of schooling) were 16-percent more likely than the children of less-educated parents to get their college degrees. Formal education matters, but not as much as books.

This assumes, of course, that one doesn’t believe that quantity trumps quality.  For more on both studies visit Rough Type.

Distracted from distraction by distraction

The title is taken from “Burnt Norton,” the first of T. S. Eliot’s Four Quartets.  Here is a bit more context:

Neither plenitude nor vacancy.  Only a flicker
Over the strained time-ridden faces
Distracted from distraction by distraction
Filled with fancies and empty of meaning
Tumid apathy with no concentration
Men and bits of paper, whirled by the cold wind
That blows before and after time,
Wind in and out of unwholesome lungs
Time before time and after.

Although these lines were first published near the middle of the last century, they unmistakably resonate with the present.  The problem of distraction facilitated and encouraged by Internet culture is among the chief targets of Nicholas Carr’s The Shallows, the public reception of which has been noted in some previous posts.  The reviews keep coming in.  Todd Gitlin’s review, “The Uses of Half-True Alarms,” appeared in The New Republic on June 7th.  Gitlin is clearly hesitant to embrace Carr’s claims in their fullness, but one is left with the sense that he is largely sympathetic to Carr’s argument.

I was especially pleased to see Gitlin express the obvious rejoinder to a frequent criticism directed toward Carr and others who raise questions about our too often uncritical embrace of all things technological.  This criticism is on display, for example, in the opening paragraphs of Steven Pinker’s recent op-ed in the NY Times.  Pinker never mentions Carr or his recent book, but they are clearly in view.  In its most basic form, the criticism runs something like this:  “moral panic” (to use Pinker’s phrase) arises whenever a new technology comes on the scene; society has obviously survived those past technological developments, so the panic was misguided; hence we should not pay too much attention to those who are raising questions about new technologies today.

There are a number of problems with this line of reasoning, but after offering his own version of the argument, Gitlin to his credit aptly stated perhaps the most obvious one:

Carr would no doubt respond that a repeated alarm is not necessarily a false alarm, and he would be right.

In his review, Gitlin cites Eliot’s lines, which Carr employs in The Shallows.  Interestingly, last night I had been reading Paul Johnson’s chapter on Eliot in his 2006 book Creators:  From Chaucer and Dürer to Picasso and Disney.  With all the recent attention drawn to our lack of attention in mind, I was struck by the following passage:

Eliot always worked hard at whatever he was doing, being conscientious, and consumed with guilt if he was “lazy” (a rare state), and having moreover the priceless gift of concentrating.  He could set to work immediately, first thing in the morning, without any time-consuming preliminary fiddling or rituals.  If interrupted, he could refocus immediately and resume work.  The intensity with which he worked was almost frightening.

Eliot, as Johnson describes him, appears to be the stark antithesis of the Internet Man.  The concern of some is that men and women of Eliot’s prodigious talent and skills will be increasingly rare in a culture dominated by the habits and priorities of the Internet.  Time will tell of course.  Needless to say, men and women of that caliber are rare in any age.  Will they become extinct?  In my more hopeful moments, I think not.  Unfortunately, this minimal optimism — surely a few men and women of genius will still emerge — is another frequent line of response to Carr’s thesis.  It is not as if before the Internet most of us were reading Proust and honing our cello skills.  More likely, so the argument goes, we were wasting our time with that last invention to spark a “moral panic,” the television.  (This is, I suspect, true enough.  However, one wonders how different things would be if a prior generation had heeded warnings about that screen and its dangers.)

But here again I find the argument misguided.  Of course, none of us has ever been all that we could be.  The use of this argument against Carr and the like is rather depressing.  The assumption seems to be, “No worries, we’ve always been mediocre and always will be.”  This may be true, but it is a symptom of some kind of cultural anemia that we now embrace this line of thinking in defense of our gadgets and our toys.  The question is not whether we have in the past made any better use of our time; the question is whether our tools and our social climate in general are more or less conducive to the pursuit of some kind of excellence, however halting the pursuit.  Johnson noted a certain guilt that Eliot experienced when he perceived himself to have failed to use his time well.  It is perhaps the general absence of such guilt in the Wireless Age that is most telling of our present ills.


You can subscribe to my newsletter, The Convivial Society, here.