“In the Vineyard of the Text”

 

That is the title of a slender volume by historian and social critic Ivan Illich, published in 1993 and subtitled A Commentary to Hugh’s Didascalicon.  The Hugh in question is Hugh of St. Victor, a medieval theologian, philosopher, and mystic who wrote the Didascalicon around 1128.  According to Illich, Hugh’s Didascalicon was the “first book written on the art of reading,” and Illich believes that revisiting this 12th-century work will help us better understand our ongoing transition from “bookish” reading to whatever multiple forms of reading have been emerging in the wake of the electronic or digital revolution.

In the spirit of the commonplace book, I’m going to transcribe some excerpts from In the Vineyard of the Text below and in subsequent posts over the next several days.  Realizing that there are countless books I would like to read and for a variety of reasons will simply not ever get to, I appreciate it when I can nonetheless get some gleanings from one of those books through a review or post that excerpts some of the key ideas worth considering.  Hardly the best form of reading, but nonetheless a useful one I think.  So in the hope that this will prove useful and because I do think Illich wrote a remarkably insightful little book, here are some selections from the Introduction:

  • “I concentrate my attention on a fleeting but very important moment in the history of the alphabet when, after centuries of Christian reading, the page was suddenly transformed from a score for pious mumblers into an optically organized text for logical thinkers.  After this date a new kind of classical reading became the dominant metaphor for the highest form of social activity.”
  • “Quite recently reading-as-a-metaphor has been broken again.  The picture and its caption, the comic book, the table, box, and graph, photographs, outlines, and integration with other media demand from the user textbook habits which are contrary to those cultivated in scholastic readerships.”
  • Quoting George Steiner:  “The development of the modern book and of book-culture as we know it seems to have depended on a comparable fragility of crucial and interlocking factors.”
  • “Classical print culture was an ephemeral phenomenon.  According to Steiner, to belong to ‘the age of the book’ meant to own the means of reading.  The book was a domestic object; it was accessible at will for re-reading.  The age presupposed private space and the recognition of the right to periods of silence, as well as the existence of echo-chambers such as journals, academies, or coffee circles.  Book culture required a more or less agreed-upon canon of textual values and modes . . . . [T]he formalities involved in this one kind of reading defined, and did not just reflect, the dimensions of social topology.”
  • “The book has now ceased to be the root-metaphor of the age; the screen has taken its place.  The alphabetic text has become but one of many modes of encoding something, now called ‘the message.’”
  • “Bookish reading can now clearly be recognized as an epochal phenomenon and not as a logically necessary step in the progress toward the rational use of the alphabet; as one mode of interaction with the written page among several; as a particular vocation among many, to be cultivated by some, leaving other modes to others.”
  • “. . . in the first six chapters I describe and interpret a technical breakthrough which took place around 1150, three hundred years before movable type came into use.  This breakthrough consisted in the combination of more than a dozen technical inventions and arrangements through which the page was transformed from score to text.  Not printing, as is frequently assumed, but this bundle of innovations, twelve generations earlier, is the necessary foundation for all stages through which bookish culture has gone since.  This collection of techniques and habits made it possible to imagine the ‘text’ as something detached from the physical reality of a page.  It both reflected and in turn conditioned a revolution in what learned people did when they read — and what they experienced reading to mean.”
  • “By centering our analysis on the object that is shaped by letters, and on the habits and fantasies connected with its use, we turn this object into a mirror reflecting significant transformations in the mental shape of western societies, something not easily brought out by other approaches.”
  • “What had started as a study in the history of technology, ended up as a new insight into the history of the heart.  We came to understand Hugh’s ars legendi [art of reading] as an ascetic discipline focused by a technical object.  Our meditation on the survival of this mode of reading under the aegis of the bookish text led us to enter upon a historical study of an asceticism that faces the threat of computer ‘literacy.’”

The Introduction as a whole challenges us to think again about the activity of reading.  We assume reading to be a singular activity, always done in the same way and for the same reasons; we do not think of the alphabet as a technology that can be deployed in multiple modes; we do not think of a whole cultural milieu depending on something so mundane as reading; we do not think that changes in reading habits and assumptions about reading could reorder society.  By taking us back some 900 years, Illich aims to show us that these are mistaken assumptions.

More to come.

Jacques Lacan, Jansenist?

If I imagine a Venn diagram consisting of one circle representing those interested in Jacques Lacan (a modest circle), and another representing those who read this blog (a rather tiny circle), then the overlapping area probably includes one person … if I count myself.  Nonetheless, I’ll post this anyway.

In conversation with a friend I was made aware of an article that contains this intriguing anecdote (if you’re in that overlapping area in the Venn diagram):

Jan Miel was, he says, the first to propose translating a text of Lacan’s into English and as a result had been invited to lunch at his country house in Guitrancourt, not far from Paris.  After the meal, during a stroll in the garden, Lacan turned to him and said:  ‘You are neither an analyst nor an analysand, so why are you interested in my teaching?’  Miel found it difficult to answer because, he admits, he really did not know what he found so fascinating in Lacan’s work, so he eventually stammered:  ‘Well, my main interest is in Pascal.’  To which Lacan replied, ‘Ah, I understand,’ and led him back to his library, where he showed him a quite substantial collection of Jansenist books.  So if reading Lacan leads to Pascal, it appears that reading Pascal may also lead to Lacan.

“Ah, I understand” — loved that, and wondered how many times those same words were uttered in a Lacan seminar!

The article goes on to explore the use Lacan makes of Pascal’s Wager and presents some helpful background material on the Wager.  Be warned though, some math is involved.
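For readers curious about what that math looks like, here is a minimal sketch of the Wager in its standard textbook, expected-value form — the conventional rendering, not necessarily the exact formulation the article or Lacan works with:

```latex
% Simplified expected-value form of Pascal's Wager.
% Let $p > 0$ be the probability that God exists, and let
% $f_1, f_2, f_3$ stand for finite this-worldly payoffs.
\begin{align*}
  \mathbb{E}[\text{wager for God}]     &= p \cdot \infty + (1 - p)\, f_1 = \infty \\
  \mathbb{E}[\text{wager against God}] &= p \cdot f_2 + (1 - p)\, f_3 < \infty
\end{align*}
% For any nonzero $p$, however small, the infinite expected gain of
% wagering for God swamps every finite alternative.
```

Both the force and the notorious fragility of the argument sit in that single infinity on the first line.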

“The Past Is Never Dead”

In the wake of the Protestant Reformation, religious violence tore across Europe.  The Wars of Religion, culminating in the Thirty Years’ War, left the continent scarred and exhausted.  Out of the ashes of war the secular nation state arose to establish a new political order, one which privatized religion and enshrined reason and tolerance as the currency of the public sphere, ensuring an end to irrational violence.

That is one of the more familiar historical narratives that we tell ourselves.  It is sweeping and elegant in its scope and compelling in its explanatory power.  There’s only one problem, according to William Cavanaugh:  it’s not true.  Cavanaugh lays out his case in his most recent book, The Myth of Religious Violence: Secular Ideology and the Roots of Modern Conflict (Oxford UP, 2009).  Needless to say, he has his work cut out for him.  The narrative he seeks to deconstruct is deeply entrenched, and we’ve staked a lot on it.  His point, to be clear, is not that religion has never been implicated in violence.  As he puts it elsewhere, “Given certain conditions, Christianity, Islam, and other faiths can and do contribute to violence.”  Rather, he is contesting the particular historical narrative whereby the secular nation state arises in response to religious violence in order to secure peace for society by marginalizing religious practice and discourse.

To begin with, Cavanaugh demonstrates that the very concept of religion is problematic and thus renders any neat parsing of violence into either religious or secular categories tenuous at best.  Moreover, the nation state precedes the so-called wars of religion and is best seen as a contributing cause of the wars, not an effect of them.  The historical realities of the wars resist the simplistic “Wars of Religion” schema anyway.  For example, during the Thirty Years’ War, Catholics were at times fighting other Catholics, and sometimes in league with Protestants.  The Thirty Years’ War, as it turns out, was a scramble by competing dynasties and rising national governments to fill the power vacuum created by the collapse of the medieval political order.  Furthermore, Cavanaugh suggests that the state co-opted and assumed for itself many of the qualities of the church, creating, as one reviewer put it, “its own sacred space, with its own rituals, hymns, and theology, and its own universal mission.”  In the end, the secular nation state, particularly in its 20th century totalitarian apotheosis, hardly appears as the champion of reason, peace, and tolerance.  The nation state secured its status by monopolizing the use of coercive force.  In doing so, however, it clearly did not put an end to violence.

Cavanaugh presents a counter-intuitive thesis, and he takes care to make his case.  It is a case he has been working out since 1995 when, as a graduate student, he published “‘A fire strong enough to consume the house:’ The Wars of Religion and the Rise of the Nation State.”  In the intervening years he has honed and strengthened his argument, which finds mature expression in The Myth of Religious Violence.

Whether one ultimately agrees with Cavanaugh’s thesis or not, his work highlights two important considerations regarding historical narratives.  First, historical reality is usually more complex than the stories we tell, and the complexity matters.  We are living through a cultural moment when historical awareness is a rare commodity, so perhaps we shouldn’t complain too much about shallow historical knowledge when the alternative may be no historical knowledge.  That said, much of what does pass for historical knowledge is too frequently filtered through Hollywood, the entertainment industry, or the talk-show circuit, and in all of these subtlety must be sacrificed to the demands of the medium.  The big picture is sometimes painted at the expense of important details, so much so that the big picture is rendered misleading.

Perhaps most days of the week, this is not a terribly important consideration.  But it can become very significant under certain circumstances.  When a historical narrative is hotly contested and passionately defended, it is usually because the real battle is over the present.  Consider heated debates about the Christian or secular origins of the American constitutional order, or arguments over the causes of the American Civil War and Southern identity.  Leaving the terrain of American history, consider the Armenian genocide, the Japanese atrocities at Nanking, or the tangled history of the Balkans.  In each case the real issue is clearly not the accuracy of our historical memory so much as it is the perceived implications for the present.  In other words, we fight for our vision for the present on the battlefield of the past.  This raises a host of other questions related to the status of arguments from history, philosophies of history, and historiography.  These sorts of questions, however, are rarely raised at rallies or on television — it would be hard to fit them on a placard or in a 10-second sound bite.

A debate about the origins of the modern nation state is likewise about more than historical accuracy.  Critics of religion and of its place in public life, Christopher Hitchens and Sam Harris for example, have made the historical narrative we began with a key component of their case — religion kills, the secular state saves.  Cavanaugh has offered a compelling rejoinder.

Either way it appears Faulkner was right:  “The past is never dead.  It’s not even past.”

_____

You can listen to a lecture and link to a number of essays by Cavanaugh here.

Clever Blog Post Title

Reading about Jon Stewart’s recent D.C. rally, I noticed a picture of a participant wearing a “V for Vendetta” mask and carrying a sign.  It wasn’t the mask, but the words on the sign that caught my eye.  The sign read, “ANGRY SIGN.”

I wouldn’t have paid much attention to it had I not recently crossed paths with a guy wearing a T-shirt that read:  “Words on a T-shirt.”  Both of these then reminded me of the very entertaining “Academy Award Winning Trailer” — a self-referential spoof of movie trailers embedded at the end of this post.  The “trailer” opens with a character saying, “A toast — establishing me as the wealthy, successful protagonist.”  And on it goes in that vein.  The comments section on the clip’s YouTube page likewise yielded a self-referential parody of typical YouTube comments.  Example:  “Quoting what I think is the funniest part of the video and adding ‘LOL’ at the end of the comment.”

There is a long tradition of self-reference in the art world (Magritte’s pipe is a classic example), but now it seems the self-referentiality that was once a mark of the avant-garde is an established feature of popular culture.  Call it vulgar self-referentiality if you like.  Our sophistication, in fact, seems to be measured by the degree of our reflexivity and self-awareness — “I know, that I know, that I know, etc.” — which shades into a mood of ironic detachment.  Earnestness in this environment becomes a vice.  We are in a sense “mediating” or re-presenting our own selves through this layer of reflexivity, and we’re pretty sure everyone else is too.

On the surface perhaps, there is a certain affinity with the Socratic injunction, “Know thyself.”  But this is meta-Socrates, “Knowingly know thyself.”  At issue for some is whether there is a subject there to know that would localize and contain the knowing, or whether in the absence of a subject all there is to know is a process of knowing, knowing itself.

Leaning on Thomas de Zengotita’s Mediated: How the Media Shapes Your World and the Way You Live in It and Hannah Arendt’s The Human Condition, here’s a grossly oversimplified thumbnail genealogy of our heightened self-consciousness.  On the heels of the Copernican revolution, we lost confidence in the ability of our senses to apprehend the world truthfully (my eyes tell me the world is stationary and the sun moves; it turns out my eyes lied), but following Descartes we sought certainty in our inner subjective processes — “I think, therefore I am” and all that.  Philosophically, then, our attention turned to our own thinking — from the object out there to the subject in here.

Sociologically, modernity has been marked by an erosion of cultural givens; very little attains a taken-for-granted status relative to the pre-modern and early modern world.  In contrast to the medieval peasant, consider how much of your life is not pre-determined by cultural necessity.  Modernity then is marked by the expansion of choice, and choice ultimately foregrounds the act of choosing, which yields an attendant lack of certainty in the choice — I am always aware that I could have chosen otherwise.  In other words, identity grounded in the objective (reified) structures of traditional society is superseded by an identity which is the aggregate of the choices we make — choice stands in for fate, nature, providence, and all the rest.  Eventually an awareness of this process throws even the notion of the self into question; I could have chosen otherwise, thus I could be otherwise.  The self, as the story goes, is decentered.  And whether or not that is really the case, it certainly seems to be what we feel to be the case.

So the self-referentialism that marked the avant-garde and the theories of social constructivism that were controversial a generation ago are by now old news.  They are widely accepted by most people under 35, who picked it all up not by visiting the art houses or reading Derrida and company, but by living with and through the material conditions of their society.

First the camera, then the sound recorder, the video recorder, and finally the digitization and consequent democratization of the means of production and distribution (cheap cameras/Youtube, etc.) created a society in which we all know that we know that we are enacting multiple roles and that no action yields a window to the self, if by self we mean some essential, unchanging core of identity.  Foucault’s surveillance society has produced a generation of method actors. In de Zengotita’s words,

Whatever the particulars, to the extent that you are mediated, your personality becomes an extensive and adaptable tool kit of postures . . . You become an elaborate apparatus of evolving shtick that you deploy improvisationally as circumstances warrant.  Because it is all so habitual, and because you are so busy, you can almost forget the underlying reflexivity.

Almost.

There seems to be a tragically hip quality to this kind of hyper-reflexivity, but it is also like looking into a mirror image of a mirror — we get mired in an infinite regress of our own consciousness.  We risk self-referential paralysis, individually and culturally, and we experience a perpetual crisis of identity.  My sense is that a good deal of our cynicism and apathy is also tied into these dynamics.  Not sure where we go from here, but this seems to be where we are.

Or maybe this is all just me.

“The storm is what we call progress”

Via Alan Jacobs at Text Patterns, I read the following excerpt from Arikia Millikan’s short piece “I Am a Cyborg and I Want My Google Implant Already” on The Atlantic’s web site:

By the time I finished elementary school, writing letters to communicate across great distances was an archaic practice. When I graduated middle school, pirating music on Napster was the norm; to purchase was a fool’s errand. At the beginning of high school, it still may have been standard practice to manually look up the answer to a burning question (or simply be content without knowing the answer). Internet connection speeds and search algorithms improved steadily over the next four years such that when I graduated in the class of 2004, having to wait longer than a minute to retrieve an answer was an unbearable annoyance and only happened on road trips or nature walks. The summer before my freshman year of college was the year the Facebook was released to a select 15 universities, and almost every single relationship formed in the subsequent four years was prefaced by a flood of intimate personal information.

Now, I am always connected to the Web. The rare exceptions to the rule cause excruciating anxiety. I work online. I play online. I have sex online. I sleep with my smartphone at the foot of my bed and wake up every few hours to check my email in my sleep (something I like to call dreamailing).

But it’s not enough connectivity. I crave an existence where batteries never die, wireless connections never fail, and the time between asking a question and having the answer is approximately zero. If I could be jacked in at every waking hour of the day, I would, and I think a lot of my peers would do the same. So Hal, please hurry up with that Google implant. We’re getting antsy.

Well, hard to beat honesty, I suppose.  I did find it slightly ironic that the Google executive interviewed for this piece was named Hal.

Jacobs aptly titled his post “The saddest thing I have read in some time,” and he added simply, “There’s a name for this condition: Stockholm Syndrome.”  Well put, of course.

Perhaps it was reading that piece that prepared me to read Walter Benjamin’s ninth thesis from “Theses on the Philosophy of History” later that day with a certain melancholy resonance:

A Klee painting named “Angelus Novus” shows an angel looking as though he is about to move away from something he is fixedly contemplating.  His eyes are staring, his mouth is open, his wings are spread.  This is how one pictures the angel of history.  His face is turned toward the past.  Where we perceive a chain of events, he sees one single catastrophe which keeps piling wreckage upon wreckage and hurls it in front of his feet.  The angel would like to stay, awaken the dead, and make whole what has been smashed.  But a storm is blowing from Paradise; it has got caught in his wings with such violence that the angel can no longer close them.  This storm irresistibly propels him into the future to which his back is turned, while the pile of debris before him grows skyward.  The storm is what we call progress.

In any case, I tend to agree with Jacobs — it was rather sad.