Against Technological Shortcuts, or Why How We Learn Matters As Much As What We Learn

I remember having a discussion with students a couple of years ago about the desirability of instantly acquired knowledge or expertise. It was a purely hypothetical discussion, and I don’t quite remember how we got around to it. Somehow, though, we found ourselves discussing a Matrix-like or Google-chip-type scenario in which it would be possible to instantly download the contents of a book or martial art skills into the brain. The latter, of course, raises all sorts of questions about the relationship between the mind and the body (and so does the former, for that matter), but let’s set those questions aside for the moment. My argument at the time, and the one I’d like to briefly articulate here, was that even if we were able to acquire knowledge through such a transaction, we should not really want to.

It’s not an easy argument to make. As you can imagine, many students were rather keen on the notion of forgoing hours of study and, just to be clear, the appeal is not altogether lost on me as I glance at the mounting tower of books that looms nearby, Babel-like. And the appeal is not just a function of the demands of an academic setting either. I am the sort of person who is more than a little pained by the thought of all that I will never read given the unyielding limitations of a human life. Moreover, who wouldn’t want to possess all of the knowledge that could be so easily attained? (Interestingly, it is tacitly assumed in hypothetical discussions of this sort that retention is no longer a problem.)

This discussion came to mind recently because it struck me that the proposition in question — the desirability of achieving the end while forgoing the means — takes on a certain plausibility within technological society. In fact, it may be the very heart of the promise held out by technology. Efficiency, ease, speed — this is what technology offers. Get what you’ve always wanted, only get it with less hassle and get it faster. The ends are relatively fixed, but technology reconfigures the means by which we achieve them.

This is the story of automation, for example; a machine steps in to do for us what we previously had to do for ourselves. Consider this recent post from Kevin Kelly in which he outlined “The 7 Stages of Robot Replacement” as follows:

1. A robot/computer cannot possibly do what I do.
2. OK, it can do a lot, but it can’t do everything I do.
3. OK, it can do everything I do, except it needs me when it breaks down, which is often.
4. OK, it operates without failure, but I need to train it for new tasks.
5. Whew, that was a job that no human was meant to do, but what about me?
6. My new job is more fun and pays more now that robots/computers are doing my old job.
7. I am so glad a robot cannot possibly do what I do.

Kelly, as always, is admirably optimistic. But this seems to me to raise certain questions: What exactly is the end game here? Where does this trajectory culminate? Are there no good reasons to oppose the outsourcing of human involvement in the means side of our projects and actions?

Let me go back to the matter of reading and knowledge, in part because this is the context in which I originally formulated my scattered thoughts on this question. There is a certain unspoken assumption that makes the possibility of instantly acquiring knowledge plausible and seemingly unproblematic: that knowledge is merely aggregated data and its mode of acquisition does nothing to alter its status. But what if this were a rather blinkered view of knowledge? And what if the acquisition of knowledge, however understood, was itself only a means to other more important ends?

If the work of learning is ultimately subordinate to becoming a certain kind of person, then it matters very much how we go about learning. In some sense, it may matter more than what we learn. This is because the manner in which we go about acquiring knowledge constitutes a kind of practice that over the long haul shapes our character and disposition in non-trivial ways. Acquiring knowledge through apprenticeship, for example, shapes people in a certain way, acquiring knowledge through extensive print reading in another, and through web-based learning in still another. The practice that constitutes our learning, if we are to learn by it, will instill certain habits, virtues, and, potentially, vices — it will shape the kind of person we are becoming.

As an aside, this consideration bears significantly upon the digital humanities project. The knowledge achieved by the computer-mediated work of digital humanists will be acquired through practices that diverge from the work of print-based scholars, just as the practices associated with their work diverged from those associated with medieval scholastics. New practices will yield new sensibilities, new habits, new dispositions. The digital humanities can produce impressive and well-executed works that genuinely advance our understanding of the humanistic disciplines, so this is not exactly a critique so much as an observation.

This is one reason, then, why the means through which knowledge is acquired matters: it can shape the sort of person you become in the long run. Another has to do with the pleasure that attends the process. Of course, if one has not learned to take pleasure from reading or, to take another example, the physical training associated with athletic excellence, then this point will ring rather hollow. Let me just note that if I could immediately acquire the knowledge of 1,000 books, I would know that I had missed out on a considerable amount of enjoyment along the way. The sort of enjoyment that leads us to pause as we approach the end of a book we will be rather sad to close.

All of this is also closely related to the undesirability of a frictionless life. When I seek to remove all work, all trouble, all resistance that stands between me and some object of desire, my attainment of that object will be simultaneously rendered meaningless. But finally, it may be mostly about virtue. What do I desire when I am lured by the promise of instant knowledge? It seems to me that if it is not the pleasure that attends the work and accomplishment of its acquisition, then it must be the power or prestige that it may bring. The elimination of the work associated with gaining knowledge or skill, then, may not be a function of sloth but rather of pride.

And as with knowledge, so with countless other facets of human experience. Technology promises to reconfigure the means so as to get us the end we desire. If, however, part of what we desire, perhaps without knowing it, is intimately wrapped up with the means of attainment, then it will always be a broken promise.

The Religion of Technology

In the Introduction to The Religion of Technology (1997) historian David Noble writes,

“With the approach of the new millennium, we are witness to two seemingly incompatible enthusiasms, on the one hand a widespread infatuation with technological advance and a confidence in the ultimate triumph of reason, on the other a resurgence of fundamentalist faith akin to a religious revival.”

Noble believes that this will strike many readers as an incongruous juxtaposition. On the one hand, it had been assumed by Enlightenment types that the advance of technology (and science) went hand in hand with the retreat of religious belief. The plot, however, turned out to be more complex. “Today,” Noble continues, “we are seeing the simultaneous flourishing of both, not only side by side but hand in hand.”

This should not surprise us according to Noble. It is, and this is his thesis in brief, “a continuation of a thousand-year-old Western tradition in which the advance of the useful arts was inspired by and grounded upon religious expectation.”

On the other hand, “Some contemporary observers have argued … that the resurgence of religious expression testifies to the spiritual sterility of technological rationality, that religious belief is now being renewed as a necessary complement to instrumental reason …” But this view also wrongly assumes a “basic opposition” between technology and religion.

Against this assumption of opposition, Noble argues the following:

“… modern technology and modern faith are neither complements nor opposites, nor do they represent succeeding stages of human development. They are merged, and always have been, the technological enterprise being, at the same time, an essentially religious endeavor.

“This is not meant in a merely metaphorical sense, to suggest that technology is similar to religion in that it evokes religious emotions of omnipotence, devotion, and awe, or that it has become a new (secular) religion in and of itself, with its own clerical caste, arcane rituals, and articles of faith. Rather it is meant literally and historically, to indicate that modern technology and religion have evolved together and that, as a result, the technological enterprise has been and remains suffused with religious belief.”

Noble goes on to add that nowhere is this more evident than in the United States where “an unrivaled popular enchantment with technological advance is matched by an equally earnest popular expectation of Jesus Christ’s return.” What is often missed, according to Noble, is that these are often the same people.

Lastly, the suffusion of technology with religious faith manifests itself when

“we routinely expect more from our artificial contrivances than mere convenience, comfort, or even survival. We demand deliverance. This is apparent in our virtual obsession with technological development, in our extravagant anticipations of every new technical advance — however much each fails to deliver on its promise — and, most important, in our utter inability to think and act rationally about this presumably most rational of human endeavors.”

Steve Jobs on Technology and Education circa 1996

Steve Jobs on technology and education from a 1996 Wired interview (via Nick Carr):

“I used to think that technology could help education. I’ve probably spearheaded giving away more computer equipment to schools than anybody else on the planet. But I’ve had to come to the inevitable conclusion that the problem is not one that technology can hope to solve. What’s wrong with education cannot be fixed with technology. No amount of technology will make a dent … Lincoln did not have a Web site at the log cabin where his parents home-schooled him, and he turned out pretty interesting. Historical precedent shows that we can turn out amazing human beings without technology. Precedent also shows that we can turn out very uninteresting human beings with technology. It’s not as simple as you think when you’re in your 20s – that technology’s going to change the world. In some ways it will, in some ways it won’t.”

I’m curious to know if this remained Jobs’ view following his return to Apple. Apple markets its products to schools too hard, it seems, for this to have remained a normative position for the company.

As it stands, though, it strikes me as eminently wise.

Update: Just came across this related and helpful post by Audrey Watters — “Steve Jobs, Apple, and the Failure of Education Technology.” From her concluding paragraph:

Education technology in the hands of Apple and Steve Jobs has been a mixed bag. We shouldn’t be so dazzled by his magic that we forget to ask the hard questions about what’s worked and what’s failed and why. Remember: at some point, Apple decided to eschew the education market and build consumer electronics devices. It was a brilliant move, for innovation and for the company’s bottom line. What do we want to make of that? And now, in ways that I think have yet to fully play out, we’re seeing what’s going to happen when these consumer electronic devices enter the classroom.

That seems to answer my question above. Apple, according to Watters, shifted from making products for education to making consumer devices it markets to the field of education. That is a non-trivial distinction.

Weekend Reading, 10/7/11

Due to travel and spotty Internet access there was no “Weekend Reading” post last week, but we’re back on track now. Peace and violence, stalling technological progress, Google, nostalgia, and cell phone sentimentality all come your way below.

“A History of Violence” by Steven Pinker (interviewed) at The Edge. Pinker discusses his recent book, The Better Angels of Our Nature, in which he argues that we live in the least violent era in history.

“Delusions of Peace” by John Gray in Prospect. In which Gray concludes that Pinker is considerably off the mark.

“I Just Called to Say ‘I Love You'” by Jonathan Franzen in Technology Review. Franzen takes on cell phones, sentimentality, and public discourse. A taste: “Just 10 years ago, New York City (where I live) still abounded with collectively maintained public spaces in which citizens demonstrated respect for their community by not inflicting their banal bedroom lives on it.”

“The End of the Future” by Peter Thiel in National Review: A bit of a downer if you buy it. The founder of PayPal worries that technological progress has stalled. A hard sell, but if true, wide-ranging and unpleasant social consequences follow. May be related to the subject of the next piece.

“Nostalgia on Repeat” by Chuck Klosterman at Grantland: “The net result is a bunch of people defending and bemoaning the impact of nostalgia in unpredictable ways; I suppose a few of these arguments intrigue me, but just barely. I’m much more interested in why people feel nostalgia, particularly when that feeling derives from things that don’t actually intersect with any personal experience they supposedly had.”

“It Knows” by Daniel Soar in London Review of Books: Soar reviews three recent books on the juggernaut that is Google. Coincidentally, Soar logs this passing snide remark directed at Steven Pinker: “Rankings based on citations aren’t necessarily a measure of excellence – if they were, we wouldn’t hear so much about Steven Pinker – but they do reflect where humans have decided that authority lies.”

Love, Beauty, and Design: What Steve Jobs Understood

It’s been nearly a week without a post, and that was largely due to some unexpected travel occasioned by less than happy circumstances. And now that I sit down to write again, it is under the shadow of more sad circumstances. It would be hard to have missed the news of Steve Jobs’ death last night. It poured in from every conceivable medium. I got it first from a friend’s Facebook status, and then from nearly every Facebook status and countless tweets and retweets. This morning my Google Reader was dominated by stories, articles, essays, and posts about Jobs and his legacy.

In one of those articles, Steven Levy’s reflections on Jobs’ life for Wired, I came across this intriguing passage that carried a great deal of wisdom:

Jobs usually had little interest in public self-analysis, but every so often he’d drop a clue to what made him tick. Once he recalled for me some of the long summers of his youth. “I’m a big believer in boredom,” he told me. Boredom allows one to indulge in curiosity, he explained, and “out of curiosity comes everything.” The man who popularized personal computers and smartphones — machines that would draw our attention like a flame attracts gnats — worried about the future of boredom. “All the [technology] stuff is wonderful, but having nothing to do can be wonderful, too.”

I’m certain that you will come across countless other lines from Jobs in the coming days; many, I’m sure, will be taken from his now legendary 2005 commencement address at Stanford.

I have come rather late into the Apple fold; I’m typing this on my first Apple computer, which was purchased just two months ago. But for longer than that I’ve been fascinated by the cult that has grown around Apple products over the last decade or so (perhaps longer, I’m not certain how to judge the years between Jobs’ two stints with the company in this regard). It is an uncanny phenomenon that has been noted and commented on many times. In recent months news outlets have run reports on studies that link the regard users have for Apple products with the same parts of the brain that have been related to religious experiences and to feelings of love.

It seems reasonably clear that Apple has tapped into something deeper than mere satisfaction with a consumer product. It also seems reasonably clear that the reactions to Steve Jobs’ untimely passing are at least in part wrapped up with the attachment users feel to the products he made possible. At least one Facebook status I read noted how odd it was to feel sadness for the passing of a man one had never met. This is not, of course, a previously unheard-of phenomenon; from time to time the death of some public figure generates this sense of sadness and loss.

But it is not exactly common either. Numerous public figures die each year and most occasion little more than a mention and a sigh. Then there are those individuals whose passing generates grief and sorrow that ripples out far beyond the circle of family and friends who had known the person firsthand. Examples are not hard to come by: Abraham Lincoln, John Kennedy, Martin Luther King Jr., Marilyn Monroe, Elvis Presley, Princess Diana, Michael Jackson. I’m sure the list can be populated with other examples easily enough. The lives of all of these were ended prematurely and tragically, and they all managed to form emotional ties, each in their own way, with those who mourned their passing.

Now we may safely add Steve Jobs to this list and this raises some interesting questions. How does he fit in with this group and the others who may be added to their number? What was the source of the emotional bond? Whatever we might think of his genius, his vision, his determination — none of these seem to me to account for the emotional bond. The bond, it would seem, was not with the person of Steve Jobs in the same way that it was with the other individuals whose deaths spurred widespread and heartfelt public mourning.

The emotional bond, rather, is with the objects Steve Jobs envisioned and produced. The bond has been transferred to the man as the embodiment of our love affair with the products. It would not take long to confirm this anecdotally on Twitter. At both the announcement of his resignation in August and now his death, my Twitter feed was populated by mentions of how the products Jobs produced changed lives along with notes about how the very message of appreciation was made possible by an Apple product. This itself is an important index of our age.

And if we were to inquire further, we might note that the genius of his products lay finally in design. Jobs stands apart from both great inventors of the past and great corporate figures of the past. He was some blend of the two, to be sure, but added to the alchemy was a dash of the artist as well.

Apple’s success lay not only in its innovation, but also in its aesthetic. The heart is not so pragmatic that it loves what merely works. It loves beauty, and Jobs seems to have known that the consumer would flock to beautifully designed products. The beauty, of course, is of a certain character — minimalist, functional, clean — but it is a recognizable and appealing aesthetic.

It did not hurt either that Jobs moved Apple products into a symbiotic relationship with other objects of love: music and personal relationships. Music is itself a transcendent source of beauty and love. We love our music, and Jobs tapped that love when he made the iPod. Our love also flows naturally to our family and friends, and with the iPhone Jobs created a product that effortlessly mediated those relationships along with our music. Add to this the manner in which the “Touch” revolution Apple products initiated appropriated the visceral and embodied nature of our loves and affections, and you begin to understand Jobs’ genius.

He seemed to have understood this above all else: the consumer was not the rational optimizer of classical economic theory. The consumer, who after all was a human being, was a lover, and the lover loves the beautiful.