Don’t Be a Relay in the Network

Back when the Machine was the dominant technological symbol, a metaphor arose to articulate the fear that individual significance was being sacrificed to large-scale, impersonal social forces: it was the fear of becoming “a cog in the machine.”

The metaphor is in need of an update.

This train of thought (speaking of archaic metaphors) began when I read the following paragraph from Leon Wieseltier’s recent commencement address at Brandeis University:

In the digital universe, knowledge is reduced to the status of information. Who will any longer remember that knowledge is to information as art is to kitsch — that information is the most inferior kind of knowledge, because it is the most external? A great Jewish thinker of the early Middle Ages wondered why God, if He wanted us to know the truth about everything, did not simply tell us the truth about everything. His wise answer was that if we were merely told what we need to know, we would not, strictly speaking, know it. Knowledge can be acquired only over time and only by method.

It was that last phrase that stayed with me: knowledge can only be acquired by time and method. I was already in fundamental agreement with Wieseltier’s distinction between information and knowledge, and his prescription of time and method as the path toward knowledge also seemed just about right.

It also seemed quite different from what ordinarily characterized my daily encounter with digital information. For the most part, I’m doing well if I keep on top of all that comes my way each day through a variety of digital channels and then pass along – via this blog, Twitter, FB, or now Tumblr – items that I think are, or ought to be, of interest to the respective audiences on each of those platforms. Blog, reblog. Like, share. Tweet, retweet. Etc., etc., etc.

Read, then discard or pass along. Repeat. That’s my default method. It’s not, I suspect, what Wieseltier had in mind. There is, given the sheer volume of information one takes in, a veneer of learnedness to these habits. But there is, in fact, very little thought involved, or judgment. Time under these circumstances is not experienced as the pre-condition of knowledge; it is rather the enemy of relevance. The meme-cycle, like the news-cycle, is unforgivingly brief. And method – understood as the deliberate, sustained, and, yes, methodical pursuit of deep understanding of a given topic – is likewise out of step with the rhythms of digital information.

Of course, there is nothing about digital technology that demands or necessitates this kind of relationship to information or knowledge. But while it is not demanded or necessitated, it is facilitated and encouraged. It is always easier to attune oneself to the dominant rhythms than it is to serve as the counterpoint. And what the dominant rhythm of digital culture encourages is not that we be cogs in the machine, but rather relays in the network.

We are relays in a massive network of digital information. Information comes to me and I send it out to you and you pass it along to someone else, and so on, day in and day out, moment by moment. In certain circles it might even be put this way: we are neurons within a global mind. But, of course, there is no global mind in any meaningful sense that we should care about. It is a clever, fictive metaphor bandied about by pseudo-mystical techno-utopians.

The minds that matter are yours and mine, and their health requires that we resist the imperatives of digital culture and re-inject time and method into our encounters with information. It begins, I think, with a simple “No” to the impulse to quickly skim-read and either share or discard. Maybe even prior to this, we must also renounce the tacit pressure to keep up with it all (as if that were possible anyway) and the fear of missing out. And this should be followed by a willingness to invest deep attentiveness, further research, and even contemplation over time in those matters that call for it. Needless to say, not all information justifies this sort of cognitive investment. But all of us should be able to transition from the nearly passive reception and transmission of information to genuine knowledge when it is warranted.

At their best, digital technologies offer tremendous resources to the life of the mind, but only if we cultivate the discipline to use these technologies against their own grain.


This paragraph is from yet another Thomas Friedman op-ed gushing over the revolutionary, disruptive, transformational possibilities MOOCs present:

“Therefore, we have to get beyond the current system of information and delivery — the professorial “sage on the stage” and students taking notes, followed by a superficial assessment, to one in which students are asked and empowered to master more basic material online at their own pace, and the classroom becomes a place where the application of that knowledge can be honed through lab experiments and discussions with the professor.”

Okay, now read the same paragraph with one tiny alteration:

“Therefore, we have to get beyond the current system of information and delivery — the professorial “sage on the stage” and students taking notes, followed by a superficial assessment, to one in which students are asked and empowered to master more basic material [from books] at their own pace, and the classroom becomes a place where the application of that knowledge can be honed through lab experiments and discussions with the professor.”

So what am I missing? Or, is it retrograde of me to ask?

It seems to me that the cheapest, most effective tool to fulfill the model he envisions may still be the book, not the MOOC.

MOOCs: Market Solution for Market Problems?

Of the writing about MOOCs there seems to be no end … but here is one more thought. Many of the proponents of MOOCs and online education* in general couch their advocacy in the language of competition, market forces, disrupting** the educational cartels***, etc. They point, as well, to the rising costs of a college education — costs which, in their view, are unjustifiable because they do not yield commensurate rewards.**** MOOCs introduce innovation, efficiency, technological entrepreneurship, and cost-effectiveness, and they satisfy consumer demand. The market to the rescue.

But look closely at the culprits these proponents of MOOCs blame for rising costs: frivolous spending on items that are not directly related to education, such as luxury dorms, massive sporting complexes, state-of-the-art student recreational centers, out-of-control administrative expansion, etc. Why are colleges spending so much money on such things? Because market logic triumphed. Students became consumers, and colleges had to compete for them by offering a host of amenities that, in truth, had little to do with education. Market solutions for market problems, then.***** MOOCs are just the extension of a logic that triumphed long ago. On this score, I’d suggest the universities need to recover something of their medieval heritage, rather than heed the siren songs of the digital age.


*I resisted, in the name of the semblance of objectivity, the urge to place ironical quotation marks around education.

**I resisted, in the name of decency, the urge to curse out loud as I typed disruption.

***Heretofore known as universities.

****These rewards are, of course, always understood to be pecuniary.

*****This is the hidden genius of disruption, brought to you by many of the same folks who gave us technological fixes for technological problems.

Borg Complex Case Files 2

UPDATE: See the Borg Complex primer here.


Alright, let’s review. A Borg Complex is exhibited by writers and pundits whenever you can sum up their message with the phrase: “Resistance is futile.”

The six previously identified symptoms of a Borg Complex are as follows:

1. Makes grandiose, but unsupported claims for technology

2. Uses the term Luddite a-historically and as a casual slur

3. Pays lip service to, but ultimately dismisses genuine concerns

4. Equates resistance or caution to reactionary nostalgia

5. Starkly and matter-of-factly frames the case for assimilation

6. Announces the bleak future for those who refuse to assimilate

In an effort to further refine our diagnostic instruments, I am now adding two more related symptoms to the list. The first of these arises from a couple of cases submitted by Nick Carr and is summarized as follows:

7. Expresses contemptuous disregard for the achievements of the past

Consider this claim by Tim O’Reilly highlighted by Carr:

“I don’t really give a shit if literary novels go away. … the novel as we know it today is only a 200-year-old construct. And now we’re getting new forms of entertainment, new forms of popular culture.”

Well, there you have it. This same statement serves to illustrate the second symptom I’m introducing:

8. Refers to historical parallels only to dismiss present concerns.

This symptom identifies historical precedents as a way of saying, “Look, people worried about stuff like this before and they survived, so there’s nothing to be concerned about now.” It sees all concerns through the lens of Chicken Little. The sky never falls. I would suggest that the Boy Who Cried Wolf is the better parable. The wolf sometimes shows up.

To this list of now eight symptoms, we might also add two Causes.

1. Self-interest, usually commercial (ranging from more tempered to crass)

2. Fatalistic commitment to technological determinism (in pessimistic and optimistic varieties)

Now let’s consider some more cases related to education, a field that is particularly susceptible to Borg Complex claims. In a recent column, Thomas Friedman gushed over MOOCs:

“Nothing has more potential to lift more people out of poverty — by providing them an affordable education to get a job or improve in the job they have. Nothing has more potential to unlock a billion more brains to solve the world’s biggest problems. And nothing has more potential to enable us to reimagine higher education than the massive open online course, or MOOC, platforms that are being developed by the likes of Stanford and the Massachusetts Institute of Technology and companies like Coursera and Udacity.”

These are high, indeed utopian, hopes, and in their support Friedman offered two heartwarming anecdotes of MOOC success. In doing so, he committed the error of tech-utopians (modeled on what Pascal took to be the error of Stoicism): believing that the wonderful use to which a technology can be put will be the use to which it is always put.

Friedman wrapped up his column with the words of M.I.T. president L. Rafael Reif: “There is a new world unfolding and everyone will have to adapt.”

And there it is — Borg Complex.

In a long piece in The Awl (via Nathan Jurgenson), Maria Bustillos discusses an online exchange on the subject of MOOCs between Aaron Bady and Clay Shirky. Take the time to read the whole thing if you’re curious/worried about MOOCs. But here is Shirky on the difference between himself and Bady: “Aaron and I agree about most of the diagnosis; we disagree about the prognosis. He thinks these trends are reversible, and I don’t; Udacity could go away next year and the damage is already done.”

Once again — Borg Complex.

At this juncture it’s worth asking, as Jurgenson did in sending me this last case, “Is he wrong?”

Is it not the case that folks who exhibit a Borg Complex often turn out to be right? Isn’t it inevitable that their vision will play out regardless of what we say or do?

My initial response is … maybe. As I’ve said before, a Borg Complex diagnosis is neutral as to the eventual veracity of the claims. It is more about an attitude toward technology in the present that renders the claims, if they are realized, instances of self-fulfilling prophecy.

The appearance of inevitability is a trick played by our tendency to make a neat story out of the past. Historians know this better than most. What looks like inevitability to us is only a function of zooming so far out that all contingencies fade from view. But in the fine-grained texture of lived experience, we know that there are countless contingencies that impinge upon the unfolding of history. It is also a function of forgetfulness. If only we had a catalog of all that we were told was inevitable that didn’t finally transpire. We only remember successes and then we tell their story as if they had to succeed all along, and this lends claims of inevitability an undeserved plausibility.

One other consideration: It is always worth asking, Who stands to profit from tales of technological inevitability?

Consider the conclusion of a keynote address delivered at the Digital-Life-Design Conference in Munich by Max Levchin:

“So to conclude: I believe that in the next decades we will see huge number of inherently analog processes captured digitally. Opportunities to build businesses that process this data and improve lives will abound.”

The “analog processes” Levchin is talking about are basically the ordinary aspects of our human lives that have yet to be successfully turned into digital data points. Here’s Carr on the vision Levchin lays out:

“This is the nightmare world of Big Data, where the moment-by-moment behavior of human beings — analog resources — is tracked by sensors and engineered by central authorities to create optimal statistical outcomes. We might dismiss it as a warped science fiction fantasy if it weren’t also the utopian dream of the Max Levchins of the world. They have lots of money and they smell even more: ‘I believe that in the next decades we will see huge number of inherently analog processes captured digitally. Opportunities to build businesses that process this data and improve lives will abound.’ It’s the ultimate win-win: you get filthy rich by purifying the tribe.”

Commenting on the same address and on Carr’s critique, Alan Jacobs captures the Borg-ish nature of Levchin’s rhetoric:

“And Levchin makes his case for these exciting new developments in a very common way: ‘This is going to add a huge amount of new kinds of risks. But as a species, we simply must take these risks, to continue advancing, to use all available resources to their maximum.’ Levchin doesn’t really spell out what the risks are — though Nick Carr does — because he doesn’t want us to think about them. He just wants us to think about the absolutely unquestionable ‘must’ of using ‘all available resources to their maximum.’ That ‘advance’ in that direction might amount to ‘retreat’ in actual human flourishing does not cross his mind, because it cannot. Efficiency in the exploitation of ‘resources’ is his only canon.”

We “must” do this. It is inevitable that this will happen. Resistance is futile.

Except that very often it isn’t. There are choices to be made. A figure no less identified with technological determinism than Marshall McLuhan reminded us: “There is absolutely no inevitability as long as there is a willingness to contemplate what is happening.” The handwaving rhetoric that I’ve called a Borg Complex is resolutely opposed to just such contemplation, and very often for the worst of motives. By naming it, my hope is that it can be more readily identified, weighed, and rejected.

Elsewhere McLuhan said, “the only alternative is to understand everything that is going on, and then neutralize it as much as possible, turn off as many buttons as you can, and frustrate them as much as you can … Anything I talk about is almost certainly to be something I’m resolutely against, and it seems to me the best way of opposing it is to understand it, and then you know where to turn off the button.”

Understanding is a form of resistance. So remember, carry on with the work of intelligent, loving resistance where discernment and wisdom deem it necessary.


This is the third in a series of Borg Complex posts; you can read previous installments here.

I’ve set up a Tumblr to catalog instances of Borg Complex rhetoric here. Submissions welcome.


Writing, Academic and Otherwise

Passing through the process of academic professionalization is, in part, not unlike the process of learning a new language. It is, for example, the sort of process that might lead me to write discourse in place of language to conclude that first sentence. Learning this new language can be both an infuriating and exhilarating experience. At first, the new language mystifies, baffles, and frustrates; later, if one sticks to it and if this new language is not utter nonsense (as it may sometimes be), there is a certain thrill in being able to see and name previously unseen (because unnamed) and poorly understood dimensions of experience.

I suspect the younger one happens to be when this initiating process takes place, the more zealously one may take to this new language, allowing it to become the grid through which all experience is later comprehended. This is, on the whole, an unfortunate tendency. Another unfortunate tendency is that by which, over time, academics forget that theirs is a learned and often obscure language which they acquired only after months and possibly years of training. This is easily forgotten, perhaps because it is only metaphorically a new language. It is, if you are American, still English, but a peculiarly augmented (or deformed, depending on your perspective) form of the language.

This means, usually, that when academics (or academics in training) write, they write in a way that might not be easily assimilated by non-academics. This is, of course, entirely unrelated to intellect or ability (a point that is sometimes missed). A brilliant Spaniard, for instance, is no less brilliant for having never taken the time to learn Swahili. This is also a function of the tribal quality of academic life. One gets used to operating in the language of the tribe and sometimes forgets to adjust accordingly when operating in other contexts.

Again, I think this is very often simply a matter of habit and forgetfulness, although it is sometimes a matter of arrogance, self-importance, and other such traits of character.

I mention all of this because, if I were asked to verbalize why I write this blog, I would say that it was in part to translate the work of academics, critics, and theorists into a more accessible form so that their insights regarding the meaning and consequences of media and technology, so far as those insights were useful, might be more widely known. After all, the technologies I usually write about affect so many of us, academics and non-academics alike. Anyone who cares to think about how to navigate these technologies as wisely as possible should be able to encounter the best thinking on such matters in a reasonably accessible form. I don’t know, maybe there is a certain naïveté in that aspiration, but it seems worth pursuing.

I’m fairly certain, though, that I don’t always achieve this goal that I half-consciously maintain for what I do here. I’m writing this post mostly to remind myself of this aspiration and renew my commitment to it.

I should be clear: I’m neither talking about dumbing down what there is to know nor suggesting that anything like condescension ought to be involved. The challenge is to maintain the depth of insight and to resist the over-simplification of complexity while at the same time avoiding the characteristics of academic language that tend to make it inaccessible. It’s a matter of not ignoring the non-academic reader while also taking them seriously.

I’m reminded of some comments that David Foster Wallace made regarding the purposes of literature. I’ve cited this passage before, quite some time ago, and it has stuck with me. It’s a bit long, but worth reading. Wallace is discussing literature with the interviewer, David Lipsky, and they are debating the relative merits of traditional literature and less traditional, more avant-garde writing:

Huh.  Well you and I just disagree.  Maybe the world just feels differently to us.  This is all going back to something that isn’t really clear:  that avant-garde stuff is hard to read.  I’m not defending it, I’m saying that stuff — this is gonna get very abstract — but there’s a certain set of magical stuff that fiction can do for us.  There’s maybe thirteen things, of which who even knows which ones we can talk about.  But one of them has to do with the sense of, the sense of capturing, capturing what the world feels like to us, in the sort of way that I think that a reader can tell “Another sensibility like mine exists.”  Something else feels this way to someone else.  So that the reader feels less lonely.

There’s really really shitty avant-garde, that’s coy and hard for its own sake.  That I don’t think it’s a big accident that a lot of what, if you look at the history of fiction — sort of, like, if you look at the history of painting after the development of the photograph — that the history of fiction represents this continuing struggle to allow fiction to continue to do that magical stuff.  As the texture, as the cognitive texture, of our lives changes.  And as, um, as the different media by which our lives are represented change.  And it’s the avant-garde or experimental stuff that has the chance to move the stuff along.  And that’s what’s precious about it.

And the reason why I’m angry at how shitty most of it is, and how much it ignores the reader, is that I think it’s very very very very precious.  Because it’s the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.

Maybe it is ill-advised to make this comparison, but I think what Wallace has to say here, or at least the spirit of it, can apply to academic work as well. It can also be a way of representing what it feels like to be alive. I tend to hold literature in rather high esteem, so I don’t think that non-fiction can really replicate the experience of more literary writing, but it can be useful in its own way. It can help make sense of experience. It can generate self-understanding. It can suggest new possibilities for how to make one’s way in the world.

It’s too late for new year’s resolutions, but I’m hoping to keep this goal more clearly in focus as I continue to write on here. You can tell me, if you’re so inclined, how well I manage. Cheers.