What’s Wrong with the News

Reading After Virtue, more than a few years ago now, was an important milestone in my intellectual journey.  Its author, moral philosopher Alasdair MacIntyre, has remained an influence on my thinking ever since.  In the pages of Prospect, John Cornwell recently reflected on a lecture MacIntyre delivered about our ongoing economic troubles.  In doing so, Cornwell offers a useful, and not uncritical, overview of MacIntyre’s career and the trajectory of his thought.  There are some interesting and, to my mind, compelling observations in Cornwell’s synopsis, including the following:

MacIntyre maintains, however, that the system must be understood in terms of its vices—in particular debt. The owners and managers of capital always want to keep wages and other costs as low as possible. “But, insofar as they succeed, they create a recurrent problem for themselves. For workers are also consumers and capitalism requires consumers with the purchasing power to buy its products. So there is tension between the need to keep wages low and the need to keep consumption high.” Capitalism has solved this dilemma, MacIntyre says, by bringing future consumption into the present by dramatic extensions of credit.

This expansion of credit, he goes on, has been accompanied by a distribution of risk that exposed to ruin millions of people who were unaware of their exposure. So when capitalism once again overextended itself, massive credit was transformed into even more massive debt, “into loss of jobs and loss of wages, into bankruptcies of firms and foreclosures of homes, into one sort of ruin for Ireland, another for Iceland, and a third for California and Illinois.” Not only does capitalism impose the costs of growth or lack of it on those least able to bear them, but much of that debt is unjust. And the “engineers of this debt,” who had already benefited disproportionately, “have been allowed to exempt themselves from the consequences of their delinquent actions.” The imposition of unjust debt is a symptom of the “moral condition of the economic system of advanced modernity, and is in its most basic forms an expression of the vices of intemperateness, and injustice, and imprudence.”

One, of course, expects a moral philosopher closely associated with virtue ethics to judge the merits of an economic system according to the virtues or vices it encourages.  This is not all that can be said, however, as many (including Cornwell) will point out, and I’m beginning to think capitalism is too vague and elastic a term to be useful in serious discussion.  Nonetheless, MacIntyre raises important concerns that we should take quite seriously.

In one of those moments of digitally enabled serendipity, when a link from one item to a seemingly unrelated one reveals an unexpected connection between the two, I followed the Prospect piece on MacIntyre with Ted Koppel’s much-discussed Washington Post op-ed. In his piece, Koppel lamented the eclipse of dependable and objective news coverage by the sensationalized, partisan cable news programs of the Right and the Left. Koppel’s piece got passed around quite a bit online and seems to have struck a chord with those disillusioned by the conflation of news with entertainment.

Koppel’s argument is straightforward:

To the degree that broadcast news was a more virtuous operation 40 years ago, it was a function of both fear and innocence. Network executives were afraid that a failure to work in the “public interest, convenience and necessity,” as set forth in the Radio Act of 1927, might cause the Federal Communications Commission to suspend or even revoke their licenses. The three major broadcast networks pointed to their news divisions (which operated at a loss or barely broke even) as evidence that they were fulfilling the FCC’s mandate. News was, in a manner of speaking, the loss leader that permitted NBC, CBS and ABC to justify the enormous profits made by their entertainment divisions.

On the innocence side of the ledger, meanwhile, it never occurred to the network brass that news programming could be profitable.

Until, that is, CBS News unveiled its “60 Minutes” news magazine in 1968. When, after three years or so, “60 Minutes” turned a profit (something no television news program had previously achieved), a light went on, and the news divisions of all three networks came to be seen as profit centers, with all the expectations that entailed.

Perhaps Koppel has read After Virtue; it is interesting, at least, that he employs the language of virtue and innocence.  While he may not have his facts on network news and profitability quite right, Koppel is making an argument that illustrates one of the key concepts laid out by MacIntyre in After Virtue.  MacIntyre begins his argument for traditioned moral communities with the notion of a practice and the goods that are either internal or external to it.  MacIntyre defines a practice as follows:

By a ‘practice’ I am going to mean any coherent and complex form of socially established cooperative human activity through which goods internal to that form of activity are realized in the course of trying to achieve those standards of excellence which are appropriate to, and partially definitive of, that form of activity, with the result that human powers to achieve excellence, and human conceptions of the ends and goods involved, are systematically extended. (After Virtue, 187)

There are certain goods or rewards that are inherent or natural to a practice, and there are other goods or rewards that may attach themselves to a practice but that are incidental and perhaps even inimical to the practice itself.  There is, for example, what we call playing for the “love of the game,” and there is playing for money.  It may be nostalgia, naivete, or romanticism, but we like to imagine that some, at least, play for the love of the game.

Koppel’s point then, in MacIntyre’s terms, was this:  Journalism lost its way when it began to be driven by the pursuit of external rather than internal goods.  Profit is not a good internal to the practice of journalism, although it clearly can be an external good.  But when the pursuit of that external good was injected into the practice of journalism, it displaced the goods properly internal to the practice, distorting and corrupting it.  One can imagine a situation in which external and internal goods are properly ordered and prioritized so that both are attained without compromising the integrity of the practice in question.  The problem tends to arise when the external goods take precedence over the internal goods, making the whole activity mercenary.

There is a lot more to be said, I’m sure, about this specific case.  I’m in no position to judge the reliability of Koppel’s vision of the halcyon days of network news; most golden ages tend to reveal a good bit of tarnish upon closer inspection.  Yet, Koppel’s argument resonates with us (unless you are Keith Olbermann or Bill O’Reilly) because it gives voice to an unease we feel with the commodification of life and society.  Rejoinders about the inevitability and necessity of it all strike us as cynical and callous.

We may not use MacIntyre’s terminology, but we have a sense of the natural connection between internal goods and a practice.  Likewise, we are disappointed with those who pursue a practice merely for the sake of an external good — the athlete who plays only for money, the spouse who marries only for status, the politician who runs only for power.  Increasingly, we are noticing the introduction of the pursuit of profit into realms of social and private life where it could only ever be an external good, and where it threatens to displace the internal goods, thus bringing distortion and corruption.

Warding off the consequences of the large-scale displacement of internal goods by economic rationality may appear a losing battle.  But acts of private resistance are not without consequence.  To borrow the title of one of Flannery O’Connor’s short stories, the life you save may be your own.

“The Past Is Never Dead”

In the wake of the Protestant Reformation, religious violence tore across Europe.  The Wars of Religion, culminating in the Thirty Years’ War, left the continent scarred and exhausted.  Out of the ashes of war the secular nation state arose to establish a new political order, one which privatized religion and enshrined reason and tolerance as the currency of the public sphere, ensuring an end to irrational violence.

That is one of the more familiar historical narratives that we tell ourselves.  It is sweeping and elegant in its scope and compelling in its explanatory power.  There’s only one problem, according to William Cavanaugh:  it’s not true.  Cavanaugh lays out his case in his most recent book, The Myth of Religious Violence: Secular Ideology and the Roots of Modern Conflict (Oxford UP, 2009).  Needless to say, he has his work cut out for him.  The narrative he seeks to deconstruct is deeply entrenched, and we’ve staked a lot on it.  His point, to be clear, is not that religion has never been implicated in violence.  As he puts it elsewhere, “Given certain conditions, Christianity, Islam, and other faiths can and do contribute to violence.”  Rather, he is contesting the particular historical narrative whereby the secular nation state arises in response to religious violence in order to secure peace for society by marginalizing religious practice and discourse.

To begin with, Cavanaugh demonstrates that the very concept of religion is problematic, which renders any neat parsing of violence into either religious or secular categories tenuous at best.  Moreover, the nation state precedes the so-called wars of religion and is best seen as a contributing cause, not an effect, of the wars.  The historical realities of the wars resist the simplistic “Wars of Religion” schema anyway.  For example, during the Thirty Years’ War, Catholics were at times fighting other Catholics and sometimes in league with Protestants.  The Thirty Years’ War, as it turns out, was a scramble by competing dynasties and rising national governments to fill the power vacuum created by the collapse of the medieval political order.  Furthermore, Cavanaugh suggests that the state co-opted and assumed for itself many of the qualities of the church, creating, as one reviewer put it, “its own sacred space, with its own rituals, hymns, and theology, and its own universal mission.”  In the end, the secular nation state, particularly in its 20th century totalitarian apotheosis, hardly appears as the champion of reason, peace, and tolerance.  The nation state secured its status by monopolizing the use of coercive force.  In doing so, however, it clearly did not put an end to violence.

Cavanaugh presents a counter-intuitive thesis, and he takes care to make his case.  It is a case he has been working out since 1995 when, as a graduate student, he published “‘A fire strong enough to consume the house’: The Wars of Religion and the Rise of the Nation State.”  In the intervening years he has honed and strengthened his argument, which finds mature expression in The Myth of Religious Violence.

Whether one ultimately agrees with Cavanaugh’s thesis or not, his work highlights two important considerations regarding historical narratives.  First, historical reality is usually more complex than the stories we tell, and the complexity matters.  We are living through a cultural moment when historical awareness is a rare commodity, so perhaps we shouldn’t complain too much about shallow historical knowledge when the alternative may be no historical knowledge.  That said, much of what does pass for historical knowledge is filtered through Hollywood, the entertainment industry, or the talk-show circuit, and in each of these subtlety must be sacrificed to the demands of the medium.  The big picture is sometimes painted at the expense of important details, so much so that the big picture itself becomes misleading.

Perhaps most days of the week, this is not a terribly important consideration.  But it can become very significant under certain circumstances.  Second, when a historical narrative is hotly contested and passionately defended, it is usually because the real battle is over the present.  Consider heated debates about the Christian or secular origins of the American constitutional order, or arguments over the causes of the American Civil War and Southern identity.  Leaving the terrain of American history, consider the Armenian genocide, the Japanese atrocities at Nanking, or the tangled history of the Balkans.  In each case the real issue is clearly not the accuracy of our historical memory so much as it is the perceived implications for the present.  In other words, we fight for our vision for the present on the battlefield of the past.  This raises a host of other questions related to the status of arguments from history, philosophies of history, and historiography.  These sorts of questions, however, are rarely raised at rallies or on television — it would be hard to fit them on a placard or in a 10-second sound bite.

A debate about the origins of the modern nation state is likewise about more than historical accuracy.  Critics of religion and of its place in public life, Christopher Hitchens and Sam Harris for example, have made the historical narrative we began with a key component of their case — religion kills, the secular state saves.  Cavanaugh has offered a compelling rejoinder.

Either way it appears Faulkner was right:  “The past is never dead.  It’s not even past.”

_____

You can listen to a lecture and link to a number of essays by Cavanaugh here.

Clever Blog Post Title

Reading about Jon Stewart’s recent D.C. rally, I noticed a picture of a participant wearing a “V for Vendetta” mask and carrying a sign.  It wasn’t the mask, but the words on the sign that caught my eye.  The sign read, “ANGRY SIGN.”

I wouldn’t have paid much attention to it had I not recently crossed paths with a guy wearing a T-shirt that read:  “Words on a T-shirt.”  Both of these then reminded me of the very entertaining “Academy Award Winning Trailer” — a self-referential spoof of movie trailers embedded at the end of this post.  The “trailer” opens with a character saying, “A toast — establishing me as the wealthy, successful protagonist.”  And on it goes in that vein.  The comments section on the clip’s YouTube page likewise yielded a self-referential parody of typical YouTube comments.  Example:  “Quoting what I think is the funniest part of the video and adding ‘LOL’ at the end of the comment.”

There is a long tradition of self-reference in the art world (Magritte’s pipe is a classic example), but now it seems the self-referentiality that was a mark of the avant-garde is an established feature of popular culture.  Call it vulgar self-referentiality if you like.  Our sophistication, in fact, seems to be measured by the degree of our reflexivity and self-awareness — “I know, that I know, that I know, etc.” — which shades into a mood of ironic detachment.  Earnestness in this environment becomes a vice.  We are in a sense “mediating” or re-presenting our own selves through this layer of reflexivity, and we’re pretty sure everyone else is too.

On the surface perhaps, there is a certain affinity with the Socratic injunction, “Know thyself.”  But this is meta-Socrates:  “Knowingly know thyself.”  At issue for some is whether there is a subject there to know that would localize and contain the knowing, or whether, in the absence of a subject, all there is to know is a process of knowing, knowing itself.

Leaning on Thomas de Zengotita’s Mediated: How the Media Shapes Your World and the Way You Live in It and Hannah Arendt’s The Human Condition, here’s a grossly oversimplified thumbnail genealogy of our heightened self-consciousness.  On the heels of the Copernican revolution, we lost confidence in the ability of our senses to apprehend the world truthfully (my eyes tell me the world is stationary and the sun moves; turns out my eyes lied), but following Descartes we sought certainty in our inner subjective processes — “I think, therefore I am” and all that.  Philosophically, then, our attention turned to our own thinking — from the object out there to the subject in here.

Sociologically, modernity has been marked by an erosion of cultural givens; very little attains a taken-for-granted status relative to the pre-modern and early modern world.  In contrast to the medieval peasant, consider how much of your life is not pre-determined by cultural necessity.  Modernity, then, is marked by the expansion of choice, and choice ultimately foregrounds the act of choosing, which yields an attendant lack of certainty in the choice – I am always aware that I could have chosen otherwise.  In other words, identity grounded in the objective (reified) structures of traditional society is superseded by an identity that is the aggregate of the choices we make — choice stands in for fate, nature, providence, and all the rest.  Eventually an awareness of this process throws even the notion of the self into question; I could have chosen otherwise, thus I could be otherwise.  The self, as the story goes, is decentered.  And whether or not that is really the case, it certainly seems to be what we feel to be the case.

So the self-referentialism that marked the avant-garde and the theories of social constructivism that were controversial a generation ago are by now old news.  They are widely accepted by most people under 35, who picked it all up not by visiting the art houses or reading Derrida and company, but by living with and through the material conditions of their society.

First the camera, then the sound recorder, the video recorder, and finally the digitization and consequent democratization of the means of production and distribution (cheap cameras/YouTube, etc.) created a society in which we all know that we know that we are enacting multiple roles and that no action yields a window to the self, if by self we mean some essential, unchanging core of identity.  Foucault’s surveillance society has produced a generation of method actors.  In de Zengotita’s words,

Whatever the particulars, to the extent that you are mediated, your personality becomes an extensive and adaptable tool kit of postures . . . You become an elaborate apparatus of evolving shtick that you deploy improvisationally as circumstances warrant.  Because it is all so habitual, and because you are so busy, you can almost forget the underlying reflexivity.

Almost.

There seems to be a tragically hip quality to this kind of hyper-reflexivity, but it is also like looking into a mirror image of a mirror — we get mired in an infinite regress of our own consciousness.  We risk self-referential paralysis, individually and culturally, and we experience a perpetual crisis of identity.  My sense is that a good deal of our cynicism and apathy is also tied into these dynamics.  Not sure where we go from here, but this seems to be where we are.

Or maybe this is all just me.

“The Things We Make, Make Us”

A while ago I posted about a rather elaborate Droid X commercial that featured a man’s arm morphing into a mechanical, cyborg arm from which the Droid phone then emerged.  This commercial struck me as a useful and vivid illustration of an important theoretical metaphor, deployed by Marshall McLuhan among others, to help us understand the way we relate to our technologies:  technology as prosthesis.

This weekend I came across another commercial that once again captured, intentionally or otherwise, an important element of our relationship with technology.  This time it was a commercial (which you can watch below) for the 2011 Jeep Grand Cherokee.  The commercial situates the new Grand Cherokee within the mythic narrative of American technology.  “The things that make us Americans,” we are told in the opening lines, “are the things we make.”  In 61 seconds flat we are presented with a series of iconic American technologies:  the railroad, the airplane, the steel mill, the cotton gin, the Colt .45, the skyscraper, the telegraph, the light bulb, and, naturally, the classic Jeep depicted in World War II era footage.  As if to throw in the mythical/American kitchen sink, at one point the image of a baseball hitting a catcher’s mitt is flashed on the screen.

(Never mind that a rather dark tale could be woven out of the history of the development and deployment of many of these technologies and that their sacred quality was lost on some rather consequential Americans including Nathaniel Hawthorne and Herman Melville.  For more on the place of technology in American history and culture take a look at David Nye’s American Technological Sublime and Leo Marx’s The Machine in the Garden: Technology and the Pastoral Ideal in America.)

In any case, the commercial closes with the line, “The things we make, make us.”  I suspect that this is an instance of someone speaking better than they know.  My guess is that the intended meaning is something like, “Making and inventing is part of what it means to be an American.”  We build amazing machines, that’s who we are.  But there is a deeper sense in which it is true that “the things we make, make us.”  We are conditioned (although, I would want to argue, not wholly determined) creatures.  That is to say that we are to some degree a product of our environment and that environment is shaped in important respects by the tools and technologies encompassed by it.

Nothing new here; this is a point made in one way or another by a number of observers and critics.  Consider, for example, the argument advanced by Walter Ong in Orality and Literacy.  The technology of writing, Ong contends, transforms oral societies and the way their members experience the world.  Ong and others have explored the similar significance of printing to the emergence of modern society and modern consciousness.  Lewis Mumford famously suggested that it is to the invention of the clock that we owe the rise of the modern world and the particular disposition toward time that characterizes it.  Historians and social critics have also explored the impact of the steam engine, the car, the telephone, the radio, the television, and, most recently, the Internet on humans and their world.  Needless to say, we are who we are in part because of the tools that we have made and that now are in turn making us.  And as I’ve noted before, Katherine Hayles (and she is not alone) goes so far as to suggest that as a species we have “codeveloped with technologies; indeed, it is no exaggeration,” she writes in Electronic Literature, “to say modern humans literally would not have come into existence without technology.”

Now this may be a bit more than what Jeep had in mind, but thanks to their commercial we are reminded of an interesting and important facet of the human condition.

Medium Matters

“The medium is the message.” Or so Marshall McLuhan would have it.  The idea behind the catchy line is simple:  the medium is at least as significant as, if not more significant than, the content of a message.  In Understanding Media, McLuhan puts it this way:

Our conventional response to all media, namely that it is how they are used that counts, is the numb stance of the technological idiot.  For the “content” of a medium is like the juicy piece of meat carried by the burglar to distract the watchdog of the mind.  (UM, 18)

Or, in case that wasn’t straightforward enough,

The content or message of any particular medium has about as much importance as the stenciling on the casing of an atomic bomb. (The Essential McLuhan, 238)

This has remained one of media studies’ guiding principles.  However, earlier this week, in a post titled “Content Matters,” Jonah Lehrer offers the following comments on an article in the journal Neuron:

One of the recurring themes in the article is that it’s very difficult to generalize about “technology” in the abstract. We squander a lot of oxygen and ink worrying about the effects of “television” and the “internet,” but the data quickly demonstrates that these broad categories are mostly meaningless. When it comes to changing the brain, content is king. Here are the scientists:

In the same way that there is no single effect of “eating food,” there is also no single effect of “watching television” or “playing video games.” Different foods contain different chemical components and thus lead to different physiological effects; different kinds of media have different content, task requirements, and attentional demands and thus lead to different behavioral effects.

You can read the study, “Children, Wired: For Better or for Worse,” online.  The article makes the case that different content presented by the same medium will impact children in different ways.  So, for example, children who watch Sesame Street test better for literacy than do children who watch Teletubbies.  The report also concluded that while media intended to be educational, such as Baby Einstein videos, can sometimes have detrimental consequences, media intended for entertainment, such as action video games, can sometimes yield positive educational outcomes.  On that note, Lehrer quoted the following excerpt:

A burgeoning literature indicates that playing action video games is associated with a number of enhancements in vision, attention, cognition, and motor control. For instance, action video game experience heightens the ability to view small details in cluttered scenes and to perceive dim signals, such as would be present when driving in fog (Green and Bavelier, 2007; Li et al., 2009). Avid players display enhanced top-down control of attention and choose among different options more rapidly (Hubert-Wallander et al., 2010; Dye et al., 2009a). They also exhibit better visual short-term memory (Boot et al., 2008; Green and Bavelier, 2006), and can more flexibly switch from one task to another (Boot et al., 2008; Colzato et al., 2010; Karle et al., 2010).

Now perhaps I’m being something of a curmudgeon, but it seems to me that, a heightened ability to drive in the fog notwithstanding, most of this amounts to saying that people who play video games get better at the skills needed to play video games.  All in all, I think we might prefer that people learn to make certain kinds of decisions more deliberately, rather than more rapidly.  In any case, the article goes on to conclude that more research is needed and that researchers are just now beginning to get their footing in the field.

The point Lehrer seizes on, that content matters, is true enough.  I don’t know too many people who would argue that all content on any given medium is necessarily equal.  However, this is not to say that content is all that matters.  The studies cited by the article focused on different content within the same medium, but what of those who don’t use the medium at all compared to those who do, regardless of the content they receive?  In other words, is there more of a difference between those who grow up watching television and those who don’t than there is between those who watch two different kinds of television programs?  Unless I missed something, the article (and the studies it cites) does not really address that issue.

By way of contrast, in “How to Raise Boys That Read,” Thomas Spence cites a study that seems to get at that question:

Dr. Robert Weis, a psychology professor at Denison University, confirmed this suspicion in a randomized controlled trial of the effect of video games on academic ability. Boys with video games at home, he found, spend more time playing them than reading, and their academic performance suffers substantially. Hard to believe, isn’t it, but Science has spoken.

The secret to raising boys who read, I submit, is pretty simple—keep electronic media, especially video games and recreational Internet, under control (that is to say, almost completely absent). Then fill your shelves with good books.

Ignore the unfortunate “Science has spoken” bit — I’m not sure what the capitalization is supposed to suggest anyway — and notice that this study is considering not differences in content within a medium (which is not insignificant), but differences between media.

To use a taxonomy coined by Joshua Meyrowitz, the first study focuses on media as conduits or vessels that merely transmit information.  On this model the vessel is less important than the content being transmitted.  There is certainly a place for this kind of analysis, but there is usually more going on.  Meyrowitz encourages us to look at media not only as conduits, but as environments that have significant consequences beyond the particular effects of the content.  As Meyrowitz puts it,

Of course media content is important, especially in the short term. Political, economic, and religious elites have always attempted to maintain control by shaping the content of media . . . But content questions alone, while important, do not foster sufficient understanding of the underlying changes in social structures encouraged or enabled by new forms of communication.

Content matters, but so does the medium (arguably more so).