“The Past Is Never Dead”

In the wake of the Protestant Reformation, religious violence tore across Europe.  The Wars of Religion, culminating with the Thirty Years’ War, left the continent scarred and exhausted.  Out of the ashes of war, the secular nation state arose to establish a new political order, one which privatized religion and enshrined reason and tolerance as the currency of the public sphere, ensuring an end to irrational violence.

That is one of the more familiar historical narratives that we tell ourselves.  It is sweeping and elegant in its scope and compelling in its explanatory power.  There’s only one problem according to William Cavanaugh:  it’s not true.  Cavanaugh lays out his case in his most recent book, The Myth of Religious Violence: Secular Ideology and the Roots of Modern Conflict (Oxford UP, 2009).  Needless to say, he has his work cut out for him.  The narrative he seeks to deconstruct is deeply entrenched and we’ve staked a lot on it.  His point, to be clear, is not that religion has never been implicated in violence.  As he puts it elsewhere, “Given certain conditions, Christianity, Islam, and other faiths can and do contribute to violence.”  Rather, he is contesting the particular historical narrative whereby the secular nation state arises in response to religious violence in order to secure peace for society by marginalizing religious practice and discourse.

To begin with, Cavanaugh demonstrates that the very concept of religion is problematic and thus renders any neat parsing of violence into either religious or secular categories tenuous at best.  Moreover, the nation state precedes the so-called wars of religion and is best seen as a contributing cause of the wars, not an effect.  The historical realities of the wars resist the simplistic “Wars of Religion” schema anyway.  For example, during the Thirty Years’ War, Catholics were at times fighting other Catholics and sometimes in league with Protestants.  The Thirty Years’ War, as it turns out, was a scramble by competing dynasties and rising national governments to fill the power vacuum created by the collapse of the medieval political order.  Furthermore, Cavanaugh suggests that the state co-opted and assumed for itself many of the qualities of the church, creating, as one reviewer put it, “its own sacred space, with its own rituals, hymns, and theology, and its own universal mission.”  In the end, the secular nation state, particularly in its 20th century totalitarian apotheosis, hardly appears as the champion of reason, peace, and tolerance.  The nation state secured its status by monopolizing the use of coercive force.  In doing so, however, it clearly did not put an end to violence.

Cavanaugh presents a counter-intuitive thesis and he takes care to make his case.  It is a case he has been working out since 1995 when, as a graduate student, he published “‘A fire strong enough to consume the house’: The Wars of Religion and the Rise of the Nation State.”  In the intervening years he has honed and strengthened his argument, which finds mature expression in The Myth of Religious Violence.

Whether one ultimately agrees with Cavanaugh’s thesis or not, his work highlights two important considerations regarding historical narratives.  First, historical reality is usually more complex than the stories we tell, and the complexity matters.  We are living through a cultural moment when historical awareness is a rare commodity, so perhaps we shouldn’t complain too much about shallow historical knowledge when the alternative may be no historical knowledge at all.  That said, much of what does pass for historical knowledge is too frequently filtered through Hollywood, the entertainment industry, or the talk-show circuit, and in each of these venues subtlety must be sacrificed to the demands of the medium.  The big picture is sometimes painted at the expense of important details, so much so that the big picture is rendered misleading.

Perhaps most days of the week, this is not a terribly important consideration.  But it can become very significant under certain circumstances.  When a historical narrative is hotly contested and passionately defended, it is usually because the real battle is over the present.  Consider heated debates about the Christian or secular origins of the American constitutional order, or arguments over the causes of the American Civil War and Southern identity.  Leaving the terrain of American history, consider the Armenian genocide, the Japanese atrocities at Nanking, or the tangled history of the Balkans.  In each case the real issue is clearly not the accuracy of our historical memory so much as it is the perceived implications for the present.  In other words, we fight for our vision for the present on the battlefield of the past.  This raises a host of other questions related to the status of arguments from history, philosophies of history, and historiography.  These sorts of questions, however, are rarely raised at rallies or on television — it would be hard to fit them on a placard or in a 10-second sound bite.

A debate about the origins of the modern nation state is likewise about more than historical accuracy.  Critics of religion and of its place in public life, Christopher Hitchens and Sam Harris for example, have made the historical narrative we began with a key component of their case — religion kills, the secular state saves.  Cavanaugh has offered a compelling rejoinder.

Either way it appears Faulkner was right:  “The past is never dead.  It’s not even past.”

_____

You can listen to a lecture and link to a number of essays by Cavanaugh here.

Clever Blog Post Title

Reading about Jon Stewart’s recent D. C. rally, I noticed a picture of a participant wearing a “V for Vendetta” mask and carrying a sign.  It wasn’t the mask, but the words on the sign that caught my eye.  The sign read, “ANGRY SIGN.”

I wouldn’t have paid much attention to it had I not recently crossed paths with a guy wearing a T-shirt that read:  “Words on a T-shirt.”  Both of these then reminded me of the very entertaining “Academy Award Winning Trailer” — a self-referential spoof of movie trailers embedded at the end of this post.  The “trailer” opens with a character saying, “A toast — establishing me as the wealthy, successful protagonist.”  And on it goes in that vein.  The comments section on the clip’s YouTube page likewise yielded a self-referential parody of typical YouTube comments.  Example:  “Quoting what I think is the funniest part of the video and adding ‘LOL’ at the end of the comment.”

There is a long tradition of self-reference in the art world; Magritte’s pipe is a classic example.  But now it seems the self-referentiality that was a mark of the avant-garde is an established feature of popular culture.  Call it vulgar self-referentiality if you like.  Our sophistication, in fact, seems to be measured by the degree of our reflexivity and self-awareness — “I know, that I know, that I know, etc.” — which blends into a mood of ironic detachment.  Earnestness in this environment becomes a vice.  We are in a sense “mediating” or re-presenting our own selves through this layer of reflexivity, and we’re pretty sure everyone else is too.

On the surface perhaps, there is a certain affinity with the Socratic injunction, “Know thyself.”  But this is meta-Socrates, “Knowingly know thyself.”  At issue for some is whether there is a subject there to know that would localize and contain the knowing, or whether in the absence of a subject all there is to know is a process of knowing, knowing itself.

Leaning on Thomas de Zengotita’s Mediated: How the Media Shapes Your World and the Way You Live in It and Hannah Arendt’s The Human Condition, here’s a grossly oversimplified thumbnail genealogy of our heightened self-consciousness.  On the heels of the Copernican revolution, we lost confidence in the ability of our senses to apprehend the world truthfully (my eyes tell me the world is stationary and the sun moves, turns out my eyes lied), but following Descartes we sought certainty in our inner subjective processes — “I think, therefore I am” and all that.  Philosophically then our attention turned to our own thinking — from the object out there, to the subject in here.

Sociologically, modernity has been marked by an erosion of cultural givens; very little attains a taken-for-granted status relative to the pre-modern and early modern world.  In contrast to the medieval peasant, consider how much of your life is not pre-determined by cultural necessity.  Modernity then is marked by the expansion of choice, and choice ultimately foregrounds the act of choosing which yields an attendant lack of certainty in the choice – I am always aware that I could have chosen otherwise.  In other words, identity grounded in the objective (reified) structures of traditional society is superseded by an identity which is the aggregate of the choices we make — choice stands in for fate, nature, providence, and all the rest.  Eventually an awareness of this process throws even the notion of the self into question; I could have chosen otherwise, thus I could be otherwise.  The self, as the story goes, is decentered.  And whether or not that is really the case, it certainly seems to be what we feel to be the case.

So the self-referentialism that marked the avant-garde and the theories of social constructivism that were controversial a generation ago are by now old news.  They are widely accepted by most people under 35, who picked it all up not by visiting the art houses or reading Derrida and company, but by living with and through the material conditions of their society.

First the camera, then the sound recorder, the video recorder, and finally the digitization and consequent democratization of the means of production and distribution (cheap cameras, YouTube, etc.) created a society in which we all know that we know that we are enacting multiple roles and that no action yields a window to the self, if by self we mean some essential, unchanging core of identity.  Foucault’s surveillance society has produced a generation of method actors.  In de Zengotita’s words,

Whatever the particulars, to the extent that you are mediated, your personality becomes an extensive and adaptable tool kit of postures . . . You become an elaborate apparatus of evolving shtick that you deploy improvisationally as circumstances warrant.  Because it is all so habitual, and because you are so busy, you can almost forget the underlying reflexivity.

Almost.

There seems to be a tragically hip quality to this kind of hyper-reflexivity, but it is also like looking into a mirror image of a mirror — we get mired in an infinite regress of our own consciousness.  We risk self-referential paralysis, individually and culturally, and we experience a perpetual crisis of identity.  My sense is that a good deal of our cynicism and apathy is also tied into these dynamics.  Not sure where we go from here, but this seems to be where we are.

Or maybe this is all just me.

When Words and Action Part Company

I’ve not been one to jump on the Malcolm Gladwell bandwagon; I can’t quite get past the disconcerting hair.  That said, his recent piece in The New Yorker, “Small Change:  Why the revolution will not be tweeted,” makes a compelling case for the limits of social media when it comes to generating social action.

Gladwell frames his piece as a study in contrasts.  He begins by recounting the evolution of the 1960 sit-in movement that began when four freshmen from North Carolina A & T sat down and ordered coffee at the lunch counter of the local Woolworth’s and refused to move when the waitress insisted, “We don’t serve Negroes here.”  Within days the protest grew and spread across state lines and tensions mounted.

Some seventy thousand students eventually took part. Thousands were arrested and untold thousands more radicalized. These events in the early sixties became a civil-rights war that engulfed the South for the rest of the decade—and it happened without e-mail, texting, Facebook, or Twitter.

Almost reflexively now, the devotees of social media power will trot out the Twitter-enabled 2009 Iranian protests as an example of what social media can do.  Gladwell, anticipating as much, quotes Mark Pfeifle, a former national-security adviser, who believes that, “Without Twitter the people of Iran would not have felt empowered and confident to stand up for freedom and democracy.”  Pfeifle went so far as to call for Twitter’s nomination for the Nobel Peace Prize.  That, one is inclined to believe, is a bit of a stretch, and Gladwell explains why:

In the Iranian case … the people tweeting about the demonstrations were almost all in the West. “It is time to get Twitter’s role in the events in Iran right,” Golnaz Esfandiari wrote, this past summer, in Foreign Policy. “Simply put: There was no Twitter Revolution inside Iran.” The cadre of prominent bloggers, like Andrew Sullivan, who championed the role of social media in Iran, Esfandiari continued, misunderstood the situation. “Western journalists who couldn’t reach—or didn’t bother reaching?—people on the ground in Iran simply scrolled through the English-language tweets post with tag #iranelection,” she wrote. “Through it all, no one seemed to wonder why people trying to coordinate protests in Iran would be writing in any language other than Farsi.”

You can read Esfandiari’s Foreign Policy article, “Misreading Tehran:  The Twitter Devolution,” online.  Gladwell argues that social media are unable to promote significant and lasting social change because they foster weak-tie rather than strong-tie relationships.  Promoting and achieving social change very often means coming up against entrenched cultural norms and standards that will not easily give way.  And as we know from the civil rights movement, the resistance is often violent.  As Gladwell reminds us,

. . . Within days of arriving in Mississippi, three [Freedom Summer Project] volunteers—Michael Schwerner, James Chaney, and Andrew Goodman—were kidnapped and killed, and, during the rest of the summer, thirty-seven black churches were set on fire and dozens of safe houses were bombed; volunteers were beaten, shot at, arrested, and trailed by pickup trucks full of armed men. A quarter of those in the program dropped out. Activism that challenges the status quo—that attacks deeply rooted problems—is not for the faint of heart.

A subsequent study of the Freedom Summer applicants was conducted by sociologist Doug McAdam:

“All  of the applicants—participants and withdrawals alike—emerge as highly committed, articulate supporters of the goals and values of the summer program,” he concluded. What mattered more was an applicant’s degree of personal connection to the civil-rights movement . . . . [P]articipants were far more likely than dropouts to have close friends who were also going to Mississippi. High-risk activism, McAdam concluded, is a “strong-tie” phenomenon.

Gladwell goes on to explain why hierarchy, another feature typically absent from social media activism, is indispensable to successful movements, taking some shots along the way at Clay Shirky’s much more optimistic view of social media outlined in Here Comes Everybody: The Power of Organizing Without Organizations.

Not surprisingly, Gladwell’s piece has been making the rounds online over the past few days.  In response to Gladwell, Jonah Lehrer posted “Weak Ties, Twitter and the Revolution” on his blog The Frontal Cortex.  Lehrer begins by granting, “These are all worthwhile and important points, and a necessary correction to the (over)hyping of Twitter and Facebook.”  But he believes Gladwell has erred in the other direction.  Basing his comments on Mark Granovetter’s 1973 paper, “The Strength of Weak Ties,” Lehrer concludes:

. . . I would quibble with Gladwell’s wholesale rejection of weak ties as a means of building a social movement. (I have some issues with Shirky, too.) It turns out that such distant relationships aren’t just useful for getting jobs or spreading trends or sharing information. According to Granovetter, they might also help us fight back against the Man, or at least the redevelopment agency.

Read the whole post to get the full argument and definitely read Lehrer’s excellent review of Shirky’s book linked in the quotation above.  Essentially Lehrer is offering a kind of middle ground between Shirky and Gladwell.  Since I tend toward mediating positions myself, I think he makes a valid point; but I do lean toward Gladwell’s end of the spectrum nonetheless.

Here, however, is one more angle on the issue:  perhaps the factors working against the potential of social media are not only inherent in the form itself, but also a condition of society that predates the arrival of digital media by generations.  In The Human Condition, Hannah Arendt argued that power, the kind of power to transform society that Gladwell has in view,

. . . is actualized only where word and deed have not parted company, where words are not empty and deeds not brutal, where words are not used to veil intentions but to disclose realities, and deeds are not used to violate and destroy but to establish relations and create new realities.

Arendt made that claim in the late 1950s, and she argued that even then words and deeds had been drifting apart for some time.  I suspect that since then the chasm has yawned ever wider and that social media participate in and reinforce that disjunction.  It would be unfair, however, to single out social media, since the problem extends to most forms of public discourse, of which social media are but one example.

In The Disenchantment of Secular Discourse, Steven D. Smith argues that

It is hardly an exaggeration to say that the very point of ‘public reason’ is to keep the public discourse shallow – to keep it from drowning in the perilous depths of questions about ‘the nature of the universe,’ or ‘the end and object of life,’ or other tenets of our comprehensive doctrines.

If Smith is right — you can read Stanley Fish’s review in the NY Times to get more of a feel for his argument — social media already operate within a context in which the habits of public discourse have undermined our ability to take words seriously.  To put it another way, the assumptions shaping our public discourse encourage the divorce of words and deeds by stripping our language of its appeal to the deeper moral and metaphysical resources necessary to compel social action.  We tend to get stuck in the analysis and pseudo-debate without ever getting to action. As Fish puts it:

While secular discourse, in the form of statistical analyses, controlled experiments and rational decision-trees, can yield banks of data that can then be subdivided and refined in more ways than we can count, it cannot tell us what that data means or what to do with it . . . . Once the world is no longer assumed to be informed by some presiding meaning or spirit (associated either with a theology or an undoubted philosophical first principle) . . . there is no way, says Smith, to look at it and answer normative questions, questions like “what are we supposed to do?” and “at the behest of who or what are we to do it?”

Combine this with Kierkegaard’s 19th century observations about the Press that now appear all the more applicable to the digital world.  Consider the following summary of Kierkegaard’s fears offered by Hubert Dreyfus in his little book On the Internet:

. . . the new massive distribution of desituated information was making every sort of information immediately available to anyone, thereby producing a desituated, detached spectator.  Thus, the new power of the press to disseminate information to everyone in a nation led its readers to transcend their local, personal involvement . . . . Kierkegaard saw that the public sphere was destined to become a detached world in which everyone had an opinion about and commented on all public matters without needing any first-hand experience and without having or wanting any responsibility.

Kierkegaard suggested the following motto for the press:

Here men are demoralized in the shortest possible time on the largest possible scale, at the cheapest possible price.

I’ll let you decide whether or not that motto may be applied even more aptly to existing media conditions.  In any case, the situation Kierkegaard believed was created by the daily print press in his own day is, if anything, even more likely today.  A globally connected communications environment geared toward creating a constant, instantaneous, and indiscriminate flow of information, together with the assumptions of public discourse described by Smith, numbs us into docile indifference — an indifference social media may be powerless to overthrow, particularly when the stakes are high.  We are offered instead the illusion of action and involvement, the sense of participation in the debate.  But there is no meaningful debate, and by next week the issue, whatever the issue is, will still be there, and we’ll be busy discussing the next thing.  Meanwhile action walks further down a lonely path, long since parted from words.

Warning: A Liberal Education Leads to Independent Thinking

File this one under “Unintended Consequences.”

In the 1950s, at the height of the Cold War, the Bell Telephone Company of Pennsylvania put its most promising young managers through a rigorous 10-month training program in the Humanities with the help of the University of Pennsylvania.  During that time they participated in lectures and seminars, read voraciously, visited museums, attended the symphony, and toured Philadelphia, New York, and Washington.  To top it off, many of the leading intellectuals of the time were brought in to lecture these privileged few and discuss their books.  Among the luminaries were the poet W. H. Auden and the sociologist David Riesman, whose 1950 book, The Lonely Crowd, was a classic study of the set to which these men belonged.

The idea behind the program was simple.  Managers with only a technical background were competent at their present jobs, but they were not sufficiently well-rounded for the responsibilities of upper management.  As sociologist E. Digby Baltzell put it, “A well-trained man knows how to answer questions, they reasoned; an educated man knows what questions are worth asking.”  Already in the early 20th century “information overload” was deemed a serious problem for managers, but by the early 1950s it was believed that computers were going to solve the problem.  (I know.  That in itself is worth elaboration, but it will have to wait for another post.)  The automation associated with computers, however, ushered in a new problem — the danger that the manager would become a thoughtless, unoriginal, technically competent conformist.  Writing in 1961, Walter Buckingham warned against the possibility that automation would lead not only to a “standardization of products,” but also to a “standardization of thinking.”

But there were other worries as well.  It was feared that the Soviet Union was pulling ahead in the sheer number of scientists and engineers, creating a talent gap between the USSR and America.  As a way of undercutting this advantage, many looked to the Humanities and a liberal education.  According to Thomas Woody, writing in 1950, “Liberal education was an education for free men, competent to fit them for freedom.”  Thus a humanistic education became not only a tool to better prepare business executives for the complexity of their jobs, but also a weapon against Communism.

In one sense, the program was a success.  The young men were reading more, their intellectual curiosity was heightened, and they were more open-minded and able to see an argument from both sides.  There was one problem, however.  The Bell students were now less willing to be cogs in the corporate machinery.  Their priorities were reordered around family and community.  According to one participant, “Now things are different.  I still want to get along in the company, but I now realize that I owe something to myself, my family, and my community.”  Another put it this way,

Before this course, I was like a straw floating with the current down the stream.  The stream was the Bell Telephone Company.  I don’t think I will ever be like that straw again.

Consequently, the program began to appear as a threat to the company.  One other strike against the program:  a survey revealed that, after passing through the program, participants were likely to become more tolerant of socialism and less certain that a free democracy depended upon free business enterprise.  By 1960, the program was disbanded.

This is a fascinating story about the power of an education in the humanities to enlarge the mind and fit one for freedom.  But it is also a reminder that in an age of conformity, thinking for oneself is not always welcomed even if it is paid lip service.  After all, remember how well things turned out for Socrates.

__________________

A note about sources:  I first read about the Institute for Humanistic Studies for Executives in an op-ed piece by Wes Davis in the NY Times.  The story fascinated me, and I subsequently found an article on the program published in the journal The Historian in 1998 by Mark D. Bowles, titled “The Organization Man Goes To College:  AT&T’s Experiment in Humanistic Education, 1953-1960.”  Quotes in the post are drawn from Bowles’ article.

“With Only Their Labor to Sell”

Glenn Beck drew a crowd and a good deal of commentary from across the political and religious spectrum.  I wasn’t at Beck’s “Restoring Honor” rally this past weekend and I haven’t spoken to anyone who was, but I did come across a number of articles, editorials, and blog posts that offered their take on what was going on.  Needless to say, it wasn’t all positive.  But, as if to demonstrate that people remain more complex than our tendency to reduce the world to binary oppositions suggests, the most scathing review I read came from the conservative Southern Baptist seminary professor Russell Moore, and one of the more self-consciously open-minded pieces came from the LGBT Editor of Religion Dispatches, Alex McNeill.  Ross Douthat, who was at the rally, offered his take on the political implications of the “apolitical” rally in his NY Times editorial and a follow-up blog post.

There is not much that I would care to add.  I’m basically in agreement with Moore, but I preferred the sensibility toward the actual people at the rally demonstrated by McNeill.  But thinking about the rally put me in mind to comment on a couple of other pieces I’d read within the last few days.  Beck (along with Limbaugh, Hannity, and company) tends to symbolize for many people the marriage of free market economics with cultural conservatism that came to dominate the political right from the late ’70s through the present.  Essentially it is the Reagan coalition.  But what if that marriage was an inherently unstable mixture?

Some time ago I was struck by a particular formulation, offered by historian Eric Miller, of Christopher Lasch’s critique of both ends of the political spectrum.  According to Lasch, both ends harbored a fatal tension.  The Left called for socially conscious and active individuals while promoting a vision of the self that was atomized and unencumbered.  The Right called for the preservation of moral tradition and community while promoting an economic order that undermined those very institutions.  This remains, to my mind, a very apt summation of our current political situation.

In two recent essays, Philip Blond and Jonny Thakkar call for what they have respectively termed Red Toryism and Left Conservatism.  Neither Blond nor Thakkar cites Lasch, but each channels Lasch’s analysis of the inner tension within modern conservatism’s attachment to free market ideology.

In “Shattered Society,” Blond, a London-based academic turned political activist, laments the loss of mediating institutions which sheltered individuals from the power of the state and the market.

The loss of our culture is best understood as the disappearance of civil society. Only two powers remain: the state and the market. We no longer have, in any effective independent way, local government, churches, trade unions, cooperative societies, or civic organizations that operate on the basis of more than single issues. In the past, these institutions were a means for ordinary people to exercise power. Now mutual communities have been replaced with passive, fragmented individuals.

And according to Blond, “Neither Left nor Right can offer an answer because both ideologies have collapsed as both have become the same.”  The left lives by an “agenda of cultural libertarianism” while the right espouses an agenda of “economic libertarianism,” and there is, in Blond’s view, little or no difference between them.  They have both contributed to a shattered society.  “A vast body of citizens,” Blond argues, “has been stripped of its culture by the Left and its capital by the Right, and in such nakedness they enter the trading floor of life with only their labor to sell.”

In the provocatively titled “Why Conservatives Should Read Marx,” Thakkar argues that there is no compelling reason for conservatives to wed themselves to free market ideology.  He cites Samuel Huntington, who described conservatism as a “‘situational’ ideology which necessarily varies from place to place and time to time …”  “The essence of conservatism,” Huntington believed, “is the passionate affirmation of the values of existing institutions.”

Following the anthropologist Arnold Gehlen, Thakkar assumes that habits and routines, and the cultural institutions that support them, are necessary for human flourishing.  These culturally inculcated habits and routines function as instincts do for other animals.  Apart from them we would be “prone to unbearable cognitive overload,” a predicament even more palpable now than when Gehlen wrote in the middle of the last century.

But following Marx, Thakkar believes that it is in the nature of capitalism to undermine existing social and cultural institutions.  The reason is simple.  Competition necessarily drives technological innovation (not necessarily a bad thing of course!), and technological innovation in the realm of economic production elicits social change as well.  “To the degree that technological change is built into capitalism,” Thakkar summarizes, “so must institutional change be.  In every single generation certain institutions will become obsolete, and with them their attendant practices and values.”

Whatever one may think about the merits of this process, it certainly isn’t inherently conservative.  As Thakkar writes further on, “In theory it is possible to be an economic libertarian and a social conservative; in practice the two are irreconcilable.”

You can read both pieces to get the whole of their respective arguments as well as their proposals for moving forward.  Neither Thakkar nor Blond claim to be against the free market, but they are both in favor of re-prioritizing the health of society, particularly its mediating institutions.  In Blond’s view, this can lead to a “popular capitalism” that entails “a market economy of widely disbursed property, of multiple centers of innovation, of the decentralization of capital, wealth, and power.”

For Thakkar, this means pursuing a “commitment to think each case through on its own merits:  if something is harmful or unjust, we should try to change it; but if something valuable is being destroyed, we should try to conserve it,” rather than blindly submitting to the demands of the growth economy.

Whether we agree with the details of their policy suggestions or not, it seems to me that both Thakkar and Blond, like Lasch before them, have perceptively diagnosed the inner tensions of the political right (and left) and the cultural consequences of those tensions.