MOOCs: Market Solution for Market Problems?

Of the writing about MOOCs there seems to be no end … but here is one more thought. Many proponents of MOOCs and of online education* in general couch their advocacy in the language of competition, market forces, disrupting** the educational cartels***, and so on. They point, as well, to the rising costs of a college education — costs which, in their view, are unjustifiable because they do not yield commensurate rewards.**** MOOCs introduce innovation, efficiency, technological entrepreneurship, and cost-effectiveness, and they satisfy consumer demand. The market to the rescue. But look closely at the culprits these proponents blame for rising costs: frivolous spending on items not directly related to education, such as luxury dorms, massive sporting complexes, state-of-the-art student recreational centers, and out-of-control administrative expansion. Why are colleges spending so much money on such things? Because market logic triumphed. Students became consumers, and colleges had to compete for them by offering a host of amenities that, in truth, had little to do with education. Market solutions for market problems, then.***** MOOCs are just the extension of a logic that triumphed long ago. On this score, I’d suggest universities need to recover something of their medieval heritage rather than heed the siren songs of the digital age.

_______________________________________________

*I resisted, in the name of the semblance of objectivity, the urge to place ironical quotation marks around education.

**I resisted, in the name of decency, the urge to curse out loud as I typed disruption.

***Heretofore known as universities.

****These rewards are, of course, always understood to be pecuniary.

*****This is the hidden genius of disruption, brought to you by many of the same folks who gave us technological fixes for technological problems.

Eternal Sunshine of the Spotless Digital Archive

The future is stubbornly resistant to prediction, but we try anyway. I’ve been thinking lately about memory, technologies that mediate our memories, and the future of the past.

The one glaring fact — and I think it is more or less incontrovertible — is this: Digital technology has made it possible to capture and store vast amounts of data.

Much of this data, for the average person, involves the documentation of daily life. This documentation is often photographic or audio-visual.

What difference will this make? Recently, I suggested that in an age of memory abundance, memory will be devalued. There will be too much of it, and it will be out there somewhere — on a hard drive, on my phone/computer, or in the cloud. As we confidently and routinely outsource our remembering to digital devices and archives, we will grow relatively indifferent to personal memories. (Although I don’t think indifferent is the best word — perhaps unattached.)

This too seems to me incontrovertible. It is the overlooked truth in Plato’s much-maligned critique of writing. Externalized memory is only figuratively related to internalized memory.

But I was assuming the permanence of these digital memories. What if our digital archives prove to be impermanent? What if in the coming years and decades we realize that our digital memories are gradually fading into oblivion?

Consider the following from Bruce Sterling: “Actually it’s mostly the past’s things that will outlive us. Things that have already successfully lived a long time, such as the Pyramids, are likely to stay around longer than 99.9% of our things. It might be a bit startling to realize that it’s mostly our paper that will survive us as data, while a lot of our electronics will succumb to erasure, loss, and bit rot.”

It might turn out that Snapchat is a premonition. What then?

Scenario A: Digital memory decay is a technical problem that is eventually solved; the trajectory of memory abundance and consequent indifference plays out.

Scenario B: Digital memory decay remains a persistent problem.

Scenario B1: We devote ourselves to rituals of digital memory preservation. Therapy first referred to the care of the gods. We think of it as care for the self, sometimes involving the recollection of repressed memories. Perhaps in the future these senses of the word will mutate into therapy understood as the care of our digital memories.

Scenario B2: By the time the problem of digital memory decay is recognized as a threat, we no longer care. Memory, we decide, is a burden. Mutually reinforcing decay and indifference then yield a creeping amnesia of long-term memory. Eternal sunshine indeed.

Scenario B3: We reconsider our digital dependence and reintegrate analog and internalized forms of memory into our ecology of remembrance.

Scenario C: All of this is wrong.

In truth, I can hardly imagine a serious indifference to personal memory. But then again, I’m sure those who lived in societies whose cultural forms were devoted to tribal remembrance could hardly imagine serious indifference to the memory of the tribe. They probably couldn’t imagine someone caring much about their individual history; it was likely an incoherent concept. Thinking about the future involves the thinking of that which we can’t quite imagine; or is it the imagining of that which we can’t quite think? In any case, it’s not really about the future anyway. It’s about trying to make some sense of forces now at work and trying to reckon with the long reach of the past, which, remembered or not, will continue to make itself felt in the present.

Illuminating Contrasts

Patrick Leigh Fermor was widely regarded as the best travel writer of the last century. He lived what was by all accounts a remarkably full life that included, to mention just two of the more striking episodes, a dalliance with a Romanian princess and the successful kidnapping of a German general in occupied Crete. Leigh Fermor struck up a lasting friendship with that same general when, as if his life were an Errol Flynn film, he completed a line from Horace in Latin begun by the German in the midst of his capture.

Among Leigh Fermor’s storied travels, there was a period of several months in 1957 spent as a guest in a series of monasteries while he worked on his writing. He tells of his time in the monasteries in a small book, considered by many to be his best, A Time to Keep Silence.

Leigh Fermor began with a visit to the Abbey of St. Wandrille de Fontenelle in the northwest of France. While there, he experienced a profound recalibration of the rhythm and pace of life. Of his first days he writes:

“My first feelings in the monastery changed: I lost the sensation of circumambient and impending death, of being by mistake locked up in a catacomb. I think the alteration must have taken about four days. The mood of dereliction persisted some time, a feeling of loneliness and flatness that always accompanies the transition from urban excess to a life of rustic solitude. Here in the Abbey, in absolutely unfamiliar surroundings, this miserable bridge-passage was immensely widened.”

We have vague ideas about the monastic life, when we think of it at all, and we know that it must be quite different from our own, but, Leigh Fermor insists, “only by living for a while in a monastery can one quite grasp its staggering difference from the ordinary life that we lead.”

In his view, “The two ways of life do not share a single attribute; and the thoughts, ambitions, sounds, light, time and mood that surround the inhabitants of a cloister are not only unlike anything to which one is accustomed, but in some curious way, seem its exact reverse. The period during which normal standards recede and the strange new world becomes reality is slow, and, at first, acutely painful.”

“To begin with,” he continues, “I slept badly at night and fell asleep during the day, felt restless alone in my cell and depressed by the lack of alcohol, the disappearance of which had caused a sudden halt in the customary monsoon.”

He then goes on to explain a remarkable alteration in his circadian rhythm. It’s worth quoting him at length at this point. Consider this carefully:

“The most remarkable preliminary symptoms were the variations of my need of sleep. After initial spells of insomnia, nightmare and falling asleep by day, I found that my capacity for sleep was becoming more and more remarkable: till the hours I spent in or on my bed vastly outnumbered the hours I spent awake; and my sleep was so profound that I might have been under the influence of some hypnotic drug. For two days, meals and the offices in the church — Mass, Vespers and Compline — were almost my only lucid moments. Then began an extraordinary transformation: this extreme lassitude dwindled to nothing; night shrank to five hours of light, dreamless and perfect sleep, followed by awakenings full of energy and limpid freshness. The explanation is simple enough: the desire for talk, movements and nervous expression that I had transported from Paris found, in this silent place, no response or foil, evoked no single echo; after miserably gesticulating for a while in a vacuum, it languished and finally died for lack of any stimulus or nourishment. Then the tremendous accumulation of tiredness, which must be the common property of all our contemporaries, broke loose and swamped everything. No demands, once I had emerged from that flood of sleep, were made upon my nervous energy: there were no automatic drains, such as conversation at meals, small talk, catching trains, or the hundred anxious trivialities that poison everyday life. Even the major causes of guilt and anxiety had slid away into some distant limbo and not only failed to emerge in the small hours as tormentors but appeared to have lost their dragonish validity.”

Reading Leigh Fermor’s account of his transition from the rhythms of ordinary life to the rhythms of monastic life reminded me of the testimonials one hears every so often from those who have undertaken a digital fast, a technology Sabbath, or a digital detox. There is, after all, something rather ascetic about most of that terminology. I was especially reminded of a 2010 story about five neuroscientists who made an experiment of a week-long rafting trip in Utah. David Strayer, one of the scientists in the party, coined the term “third-day syndrome” to describe the shifts in attitude and behavior that begin to manifest themselves after three days of being “unplugged” — quite near the four days Leigh Fermor suggests it took him to acclimate to life in the monastery.

One of the neuroscientists added, “Time is slowing down.” Another wondered out loud: “If we can find out that people are walking around fatigued and not realizing their cognitive potential … What can we do to get us back to our full potential?”

Leigh Fermor put this point more eloquently when he described “the tremendous accumulation of tiredness, which must be the common property of all our contemporaries.” Remember, he was writing in the late 1950s.

Whatever we think of the peculiar forms of fatigue that might be associated with the character of “digital life,” Leigh Fermor’s account of his journey into the silence of the monastic houses more than half a century ago reminds us that, if our bodies and minds are to be trusted, there has been something amiss with the way we order our lives since long before the advent of the Internet and smartphones.

I write that with some trepidation. Too often the identification of some historical precedent or antecedent to a modern malaise leads some to conclude something like, “Well, you see, people in the past have felt this too and they survived, so this must not really be a problem.” There ought to be a name for that sort of fallacy. Perhaps there is one and I’m unaware of it. I should, in fact, list the deployment of this line of reasoning as a symptom of a Borg Complex. In any case, it is shallow solace.

I rather think that such comparisons point us not to non-problems, but to perennial problems that take historically specific form and that each generation must address for itself. Leigh Fermor’s monastic retreat provided a ground against which the figure of mid-twentieth century urban life came into sharper relief.

Spans of time spent materially disconnected from the Internet may serve a similar function today. They may heighten our understanding of what is now the character of our ordinary life. It is its ordinariness, after all, that keeps us from seeing it clearly. Without a contrasting ground the figure does not appear at all, and the contrast may show us where we have drifted into patterns and habits that work against our well-being.

From Memory Scarcity to Memory Abundance

The most famous section in arguably the most famous book about photography, Roland Barthes’ Camera Lucida, dwells on a photograph of Barthes’ recently deceased mother taken in a winter garden when she was a little girl. On this picture, Barthes hung his meditative reflections on death and photography. The image evoked both the “that-has-been” reality of the subject, and the haunting “this-will-die” realization. That one photograph of his mother is also the only image discussed by Barthes that was not reproduced in Camera Lucida. It was too personal. It conveyed something true about his mother, but only to him.

But what if Barthes had not a few, but hundreds or even thousands of images of his mother?

I’ve long thought that what was most consequential about social media was their status as prosthetic memories. A site like Facebook, for example, is a massive archive of externalized memories preserved as texts and images. For this reason, it seemed to me, it would be unbearably hard to abandon such sites, particularly for those who had come of age with and through them. These archives bore too precious a record of the past to be simply deleted with a few clicks. I made this argument as late as last night.

But now I’ve realized that I had not fully appreciated the most important dynamic at play. I was operating with assumptions that were formed during an age of relative memory scarcity, but digital photography and sites like Facebook have brought us to an age of memory abundance. The paradoxical consequence of this development will be the progressive devaluing of such memories and the severing of the past’s hold on the present. Gigabytes and terabytes of digital memories will not make us care more about those memories; they will make us care less.

We’ve seen the pattern before. Oral societies, which had few and relatively inefficient technologies of remembrance at their disposal, lived to remember. Their cultural lives were devoted to ritual and liturgical acts of communal remembering. The introduction of writing, a comparatively wondrous technology of remembrance, gradually released the individual from the burdens of cultural remembrance. Memory that could be outsourced, as we say, or offloaded could also be effectively forgotten by the individual, who was then free to remember their own history. And it is to this task that subsequent developments in the technology of remembrance have been put. The emergence of cheap paper, coupled with rising rates of literacy, gave us the diary and the box of letters. Photography and film were likewise put to the task of documenting our lives. But until recently, these technologies were subject to important constraints. The recording devices were bulky and cumbersome, and they were limited in capacity by the number of exposures on a roll of film and the length of tape on a reel. There were also important practical constraints on storage and access. Digital technologies have burst through these constraints, and they have not yet reached their full potential.

Now we carry relatively unobtrusive devices of practically unlimited recording capacity, and these are easily linked to archives that are likewise virtually unlimited in their capacity to store and organize these memories. If we cast our vision into the not altogether distant nor fantastical future, we can anticipate individuals engaging with the world through devices (e.g., Google Glass) that will both augment the physical world by layering it with information and generate a near continuous audio-visual record of our experience.

Compared to these present and soon-to-be technologies, the 35mm camera that was at my disposal through the ’80s and ’90s seems primitive. On a spectrum measuring the capacity to document and archive memories, I was then closer to my pre-modern predecessors than to the generation that will succeed me.

Roland Barthes’ near-mystical veneration of his mother’s photograph, touching as it appears to those of us who lived in the age of memory scarcity, will seem quixotic and quaint to those who have known only memory abundance. Barthes will seem to them as those medievals who venerated the physical book do to us. They will be as indifferent to the photograph, and the past it encodes, as we are to the cheap paperback.

It may seem, as it did to me, that social media revived the significance of the past by reconnecting us with friends we would have mostly forgotten and by reconstituting habits of social remembering. I’d even expressed concerns that social media might allow the past to overwhelm the present, rendering recollection, rather than suppression, traumatic. But this has only been an effect of novelty upon a transitional generation that had lived without the technology and upon whom it appeared in medias res. For those who have known only the affordances of memory abundance, there will be no reconnection with long-forgotten classmates or nostalgic reminiscences around a rare photograph of their youth capturing some trivial, unremembered moment. It will all be documented and archived, but it will mean not a thing.

It will be Barthes’ contemporary, Andy Warhol, who will appear as one of us. In his biography of Warhol, Victor Bockris writes,

Indeed, Andy’s desire to record everything around him had become a mania. As John Perrault, the art critic, wrote in a profile of Warhol in Vogue: “His portable tape recorder, housed in a black briefcase, is his latest self-protection device. The microphone is pointed at anyone who approaches, turning the situation into a theater work. He records hours of tape every day but just files the reels away and never listens to them.”

Andy Warhol’s performance art will be our ordinary experience, and it is that last line that we should note — “… and never listens to them.”

Reconsider Plato’s infamous critique of writing. Critics charge Plato with shortsightedness because he failed to see just how much writing would in fact allow us to remember. But from a different perspective, Plato was right. The efficient and durable externalization of memory would make us personally indifferent to remembrance. As the external archive grows, our personal involvement with the memory it stores shrinks in proportion.

Give me a few precious photographs, a few minutes of grainy film and I will treasure them and hold them dear. Give me one terabyte of images and films and I will care not at all.

In the future, we will float in the present untethered from the past and propelled listlessly onward by the perpetual stream of documentary detritus we will emit.

Macro-trends in Technology and Culture

Who would choose cell phones and Twitter over toilets and running water? Well, according to Kevin Kelly, certain rural Chinese farmers. In a recent essay exploring the possibilities of a post-productive economy, Kelly told of the remote villages he visited in which locals owned cell phones but lived in houses without even the most rudimentary forms of plumbing. It is a choice, Kelly notes, deeply influenced by tradition and culture. Kelly’s point may not be quite unassailable, but it is a fair reminder that technology is a culturally mediated phenomenon.

There are, generally speaking, two schools of thought on the relationship between technology and culture. Those tending toward some variety of technological determinism would argue that technology drives culture. Those who tend toward a social constructivist view of technology would argue the opposite. Ultimately, any theory of technology must account for the strengths and weaknesses of both of these tendencies. In fact, the framing of the relationship is probably problematic anyway since there are important ways in which technology is always cultural and culture is always technological.

For the purposes of this post, I’d like to lean toward the social constructivist perspective. No technology appears in a vacuum. Its origins, evolution, adoption, deployment, and diffusion are all culturally conditioned. Moreover, the meaning of any technology is always culturally determined; it is never simply given in the form of the technology itself. Historians of technology have reminded us of this reality in numerous fascinating studies — studies of the telephone, for example, and the airplane, the electric grid, household technologies, and much else besides. When a new technology appears, it is interpreted and deployed within an already existing grid of desires, possibilities, necessities, values, symbols, expectations, and constraints. That a technology may re-order this grid in time does not negate the fact that it must first be received by it. The relationship is reciprocal.

If this is true, then it seems to me that we should situate our technologies not only within the immediate historical and social context of their genesis, but also within broader and more expansive historical trajectories. Is our use of computer technology, for example, still inflected by Baconian aspirations? What role do Cartesian dualisms play in shaping our relationship with the world through our technologies? To what degree does Christian eschatology inform technological utopianism? These seem to be important questions, the answers to which might usefully inform our understanding of the place of technology in contemporary society. Of course, these particular questions pertain especially to the West. I suspect another set of questions would apply to non-Western societies and still further questions would be raised within the context of globalization. But again, the basic premise is simply this: a given technology’s social context is not necessarily bounded by its immediate temporal horizons. We ought to be taking the long view as well.

But the rhythms of technological change (and the logic of the tech industry) would seem to discourage us from taking the long view, or at least the long view backwards in time. The pace of technological change over the last two hundred years or so has kept us busy trying to navigate the present, and its trajectory, real and ideal, casts our vision forward in the direction of imagined futures. But what if, as Faulkner quipped, the past with regard to technology is not dead or even past?

I’m wondering, for instance, about these large motive forces that have driven technological innovation in the West, such as the restoration of Edenic conditions or the quest for rational mastery over the natural world leading to the realization of Utopia. These early modern and Enlightenment motive forces directed and steered the evolution of technology in the West for centuries, and I do not doubt that they continue to exert their influence still. Yet, over the last century and a half, Western society has undergone a series of profound transformations. How have these shaped the evolution of technology? (The inverse question is certainly valid as well.) This is, I suppose, another way of asking about the consequences of post-modernity (which I distinguish from postmodernism) for the history of technology.

In a provocative and compelling post a few months ago, Nick Carr drew an analogy between the course of technological innovation and Maslow’s hierarchy of needs:

“The focus, or emphasis, of innovation moves up through five stages, propelled by shifts in the needs we seek to fulfill. In the beginning come Technologies of Survival (think fire), then Technologies of Social Organization (think cathedral), then Technologies of Prosperity (think steam engine), then Technologies of Leisure (think TV), and finally Technologies of the Self (think Facebook, or Prozac).”

I continue to find this insightful, and I think the angle I’m taking here dovetails with Carr’s analysis. The technologies of Prosperity and Leisure correspond roughly to the technologies of modernity. Technologies of the Self correspond roughly to the technologies of post-modernity. Gone is our faith in les grands récits that underwrote a variety of utopian visions and steered the evolution of technology. We live in an age of diminished expectations; we long for the fulfillment of human desires writ small. Self-fulfillment is our aim.

This is, incidentally, a trajectory that is nicely illustrated by Lydia DePillis’ suggestion that the massive Consumer Electronics Show “is what a World’s Fair might look like if brands were more important than countries.” The contrast between the world’s fairs and the CES is telling. The world’s fairs, especially those that preceded the 1939 New York fair, were quite obviously animated by thoroughly modern ideologies. They were, as President McKinley put it, “timekeepers of progress,” and one might as well capitalize Progress. On the other hand, whatever we think of the Consumer Electronics Show, it is animated by quite different and more modest spirits. The City of Tomorrow was displaced by the entertainment center of tomorrow before giving way to the augmented self of tomorrow.

Why did technological innovation take this path? Was it something in the nature of technology itself? Or was it rather a consequence of larger sea changes in the character of society? Maybe a little of both, but probably more of the latter. It’s possible, of course, that this macro-perspective on the co-evolution of culture and technology can obscure important details and result in misleading generalizations, but if those risks can be mitigated, it may also unveil important trends and qualities that would be invisible to more narrowly focused analysis.