Oral Social Literacy, Past and Present

Ivan Illich’s In the Vineyard of the Text is an exploration of the evolution of reading and the book in Western Europe during the 12th century. It centers on Hugh of St. Victor and a well-known work of his titled the Didascalicon, essentially the first guide to the art of reading.

Illich notes, “Before Hugh’s generation, the book is a record of the author’s speech or dictation. After Hugh, increasingly it becomes a repertory of the author’s thought, a screen onto which one projects still unvoiced intentions.”

Illich early on acknowledges his debt to Walter Ong who synthesized a great deal of research on the cultural consequences of the shift from orality to literacy (and later to what Ong called the secondary orality of electronic media).

What is sometimes lost in this schema is the persistence of orality after the emergence of literacy. And not only in the sense that oral cultures existed alongside literate ones, but also in the persistence of orality within literate societies.

A full 1500 years after literacy was effectively internalized into Western society (which is not the same thing as saying that all of those living in Western society were literate), reading remained a fundamentally oral activity. The quotation from Illich above is drawn from a chapter titled, “Recorded Speech to Record of Thought.” That title nicely captures the degree to which writing was understood as a record of oral communication rather than its own distinct medium prior to the period Illich examined.

Here is Illich again on the orality (and corporeality) of literacy through the late medieval period:

“In a tradition of one and a half millennia, the sounding pages are echoed by the resonance of the moving lips and tongue. The reader’s ears pay attention, and strain to catch what the reader’s mouth gives forth. In this manner the sequence of letters translates directly into body movements and patterns nerve impulses. The lines are a sound track picked up by the mouth and voiced by the reader for his own ear. By reading, the page is literally embodied, incorporated.”

And here again on the oral and social nature of reading:

“The monastic reader — chanter or mumbler — picks the words from the lines and creates a public social auditory ambience. All those who, with the reader, are immersed in this hearing milieu are equals before the sound … Fifty years after Hugh, typically, this was no longer true. The technical activity of deciphering no longer creates an auditory and, therefore, a social space. The reader then flips through pages. His eyes mirror the two-dimensional page. Soon he will conceive of his own mind in analogy with a manuscript. Reading will become an individualistic activity, intercourse between a self and a page.”

As I read these passages again today, I was reminded of an essay that appeared not too long ago in the Wall Street Journal. In “Is This the Future of Punctuation?”, Henry Hitchings, the author of The Language Wars: A History of Proper English, makes the following observation about newly proposed punctuation marks such as the interrobang:

“Such marks are symptoms of an increasing tendency to punctuate for rhetorical rather than grammatical effect. Instead of presenting syntactical and logical relationships, punctuation reproduces the patterns of speech.”

The emergence of telephony, radio, and television marked the re-emergence of orality following the era of print literacy’s dominance. Ong called this secondary orality. Having appeared after literacy, it was not identical to primary orality, but it nonetheless represented a reemergence of orality and its habits which would now compete with literacy on the cultural stage.

Within the last twenty years, however, a funny thing has happened on the way to the world of secondary orality. Writing or text has reasserted itself. Text messaging, emails, online reading, e-reading, etc. — all of these together mean that most of us are deciphering a lot of text each day. Even our television screens, depending on what we are watching, may be chock full of text, scrolling or otherwise.

But this reemergence of text is marked by orality, as the observation by Hitchings suggests. Can we call this secondary literacy? Is it still useful to speak in terms of literacy and orality?

It has always seemed to me that the orality/literacy distinction got at important historical developments in communication and consciousness. The bare dichotomy glossed over a good deal and it was always in need of qualification, but it was serviceable nonetheless. Secondary orality also pointed to important developments. But now what we appear to have, text having reasserted itself, is a thoroughly blended media environment.

Its chief characteristic is neither its orality nor its literacy. Rather, it is the preponderance of both together — overlapping, interpenetrating, jostling, complementing, conflating.

Interestingly, there has also been, of course, a reemergence of social literacy, but it is not tied to the oral as it was in the circumstances Illich describes above. Rather than an orally constituted, literate sociality anchored to physical presence, we have a diffuse, literacy-based, image-inflected sociality often untethered from physical presence.

A social space was, then, constituted by an oral performance of the written text and gathered presences. We have, today, spaces constituted as social by silent reading and the presence of absences.

File all of this under “thinking aloud.” (Except, of course, that it wasn’t!)

“I Was Born To Stand Outside Myself”

Last summer I posted a few thoughts on Mark Helprin’s A Soldier of the Great War which I had then begun reading. As it turned out, I set the book aside for a few months for no particular reason, but I’ve recently returned to it and finished it. I stand by my initial characterization: the novel is a meditation on beauty, and a lovely one at that.

Near the end, the main character, Alessandro, meets up with a talented, intelligent, articulate man named Arturo who is nonetheless thoroughly unsuccessful and unaccomplished. In conversation with Alessandro, Arturo sums up his life thus: “I was born to stand outside myself.”

This struck me as a remarkably evocative line. Now I confess that I’m going to take this in a direction that Helprin neither intended nor imagined. That said, the line immediately suggested to me the manner in which we are made to virtually stand beside ourselves in our (social) media environment.

There are perhaps two senses in which this is true. I’m thinking of the way that our social media profiles stand beside us in a rather apprehensible manner. We can look at them, manipulate them, experience them; they are us but not us. In this sense, I’m still inclined to think of our social media profiles as virtual memory theaters, at least in part.

The other sense is the manner in which the presence of our second self impinges upon our lived experience. We not only stand beside our past self as it is represented online, we also stand beside our potential future self as it will be represented online. This other potential self haunts our present from the future. Several months ago I posted a synopsis of Michel de Certeau’s observations about the way the past layers over certain places and haunts them. Places capture memories, and those places cannot be divorced from those memories in lived experience. Perhaps if he were alive today, de Certeau would suggest that the future as well as the past now haunts our present. We live with our virtual memory theaters and we live with future memories as well. We are born to stand beside ourselves, actual and potential. And, perhaps most significantly, the potential self we live with is imagined within the constraints of the platform through which it will be represented.

On this latter point, I would also point you to Nathan Jurgenson’s fine essay reprinted in The Atlantic: “The Facebook Eye.”

It does occur to me that this standing beside ourselves did not itself emerge with the advent of social media. One could argue that it is a function of all representation. Certainly it is a feature of writing itself. Walter Ong made much of the way in which literacy alienates. Among the many separations effected by writing Ong includes the separation of the known from the knower, the past from the present, and being from time.

So we might conclude that social media only augments a long-standing trajectory. But my sense, as it often is with efforts to situate present phenomena within a historical trajectory, is that this threatens to miss the significance of present developments. Changes of degree may amount to changes in kind. As I’ve heard it put somewhere or other, a hurricane is not merely an unusually strong breeze.

Social media, and the technological ecosystem on which it depends, radically augments the alienation of writing if only by its mere ubiquity. But perhaps more importantly, it does so by making our second self that stands beside us also a public self that presents itself to a myriad of others. It’s the difference between an old-school diary and your Facebook profile. It is no small difference and we are all in the process of sorting out the personal and social consequences.

De Certeau concluded that “haunted places are the only ones people can live in.” Walter Ong was also sanguine about the alienations wrought by writing:

“To say writing is artificial is not to condemn it but to praise it . . . By distancing thought, alienating it from its original habitat in sounded words, writing raises consciousness. Alienation from a natural milieu can be good for us and indeed is in many ways essential for fuller human life. To live and to understand fully, we need not only proximity but also distance. This writing provides for, thereby accelerating the evolution of consciousness as nothing else before it does.”

Now the question is whether the standing beside ourselves effected by social media will carry similarly salubrious consequences. Discuss amongst yourselves. Here’s the prompt to get you going: Has social media continued to accelerate the evolution of consciousness or self-consciousness? Is there a difference?

All the World’s a Fair

A few weeks ago I wrote a post or two which mentioned the 1939 New York World’s Fair. The ’39 Fair in particular caught my attention as a remarkable fusion of technology and modernism in the service of a utopian vision of the future. But the ’39 Fair is only one of many Fairs and Expositions held in the US and around the world since 1851, the year of London’s Crystal Palace Exhibition.

It’s quite likely that you’ll be hearing a bit more about these fairs, particularly the ’39 Fair, in the coming weeks. They are fascinating historical snapshots capturing at once the past, present, and hoped-for future. Many of the fairs included a retrospective look back at the culture’s achievements. For example, the 1876 Philadelphia Fair explicitly remembered the first 100 years of American history following the Declaration of Independence. The fairs also looked forward to the future. This was most obviously the case in the ’39 Fair, which took “The World of Tomorrow” as its theme, but it is an element of all the fairs. And, of course, in the way they remembered the past and the way they envisioned the future, the fairs were perhaps above all else leading indicators of their own time.

Historian Robert W. Rydell has made a career out of telling the story of the World’s Fairs and Expositions, especially those held in America. In the Introduction to All the World’s a Fair, Rydell provides a useful frame by which we might approach the significance of the Fairs.

Following sociologists Peter Berger and Thomas Luckmann, Rydell suggests that we understand the Fairs as “symbolic universes.” In their view, a symbolic universe placed “all collective events in a cohesive unity that includes past, present, and future. With regard to the future, it establishes a common frame of reference for the projection of individual actions.”

It is also interesting to consider the fairs as quasi-religious experiences. Rydell notes Henry Adams’s suggestive claim that he “professed the religion of the World’s Fairs.” Interestingly, there is an etymological basis for the comparison: “fair” derives from the Latin feria, which means “holy day,” and the German Messe suggests both “mass” and “fair.” More importantly, as Rydell puts it,

“America’s world’s fairs resembled religious celebrations in their emphasis on symbols and ritualistic behavior. They provided visitors with a galaxy of symbols that cohered as ‘symbolic universes.’ These constellations, in turn, ritualistically affirmed fairgoers’ faith in American institutions and social organization, evoked a community of shared experience, and formulated responses to questions about the ultimate destiny of mankind in general and of Americans in particular.”

The fairs were also consciously arranged around the theme of progress. “Expositions are timekeepers of progress,” President William McKinley famously proclaimed. “They record the world’s advancement. They stimulate the energy, enterprise, and intellect of the people and quicken human genius.” (McKinley, incidentally, was fatally shot at a World’s Fair, the 1901 Pan-American Exposition in Buffalo.) In Rydell’s summary, the “mythopoeic grandeur” of the fairs lay in their translation of “an ideology of economic development, labeled ‘progress,'” into “a utopian statement about the future.”

Perhaps not surprisingly, many of the fairs were held during times of severe economic and social strain. The ’39 Fair, coming a decade into the Great Depression, is only the most obvious example. In fact, the most striking feature of the present period of economic difficulty, considered in light of the past 150 years, may be the absence of a World’s Fair. I suspect that we are now too knowing and ironically self-aware to take something like a World’s Fair seriously. The mythic aspect of the fairs has been significantly pared down into a theme park experience. Consider Disney’s Carousel of Progress the hinge. It debuted at the 1964 New York World’s Fair before making Disneyland its home. Subsequently, EPCOT has functioned as a kind of permanent World’s Fair.

Finally, one more item for all of those interested in a bit of cultural archeology. Below is a 50-minute film prepared by the Westinghouse Corporation for the 1939 Fair. The film is fascinating at a number of levels. I’ll leave it to you to watch the whole thing, but here’s a very brief synopsis. The Middletons are an average American family who have come to New York to visit the Fair. The daughter, who had been living in New York, is now involved with a socialist, anti-American art professor. At the fair, she runs into her old flame, a clean-cut, All-American engineer working for Westinghouse at the Fair and determined to defend the American way of life. Enjoy.

_____________________________

See also: The World of Tomorrow, Inc.

On the Passing of Old Trees

Within driving distance of where I live stood one of the oldest living trees in the world, until today.

“The Senator,” as the tree was known, held its ground for an estimated 3,500 years. Monday morning it caught fire and collapsed. The cause of the fire is unknown.

This was terribly sad news. How odd to consider that something that lasted for so long and endured so much came to its end in one’s own lifetime.

But how do we begin to assess this sort of loss? It points out the woeful inadequacy of our preference for quantifiable measures, none of which could possibly account for the passing of old trees.

One measure of the tree’s significance suggested itself to me when a local station interviewed a woman who came to the fallen tree this morning with an old black and white photograph. Coming to the tree had been a part of her family’s history. The tree marked time for her. In one sense, it marks time for all of us.

America is not a land of ruins that might engrave in our imagination a feeling for the depth of history. There is very little by which we might take the measure of our lives, and less still that might suggest to us the ephemeral nature of the days with which we have been gifted or discourage us from adopting the pretensions of presumed timelessness.

This tree, when it was looked upon and thought of, did just that.

Ursula Le Guin once wrote of one of her characters, “he believed that the wise man is one who never sets himself apart from other living things, whether they have speech or not, and in later years he strove long to learn what can be learned, in silence, from the eyes of animals, the flight of birds, the great slow gestures of trees.”

That seems just about right.

Photo by Anthony Scotti via Wikimedia Commons

Hole In Our Hearts

Writing for Gizmodo, Matt Honan describes his experience at the massive Consumer Electronics Show in Las Vegas, and it reads like a passage from Augustine’s Confessions had Augustine been writing in the 21st rather than the 5th century.

The quasi-religious overtones begin early on when Honan tells us, “There was ennui upon ennui upon ennui set in this amazing temple to technology.”

Then, a little further on, Honan writes:

“There is a hole in my heart dug deep by advertising and envy and a desire to see a thing that is new and different and beautiful. A place within me that is empty, and that I want to fill it up. The hole makes me think electronics can help. And of course, they can.

They make the world easier and more enjoyable. They boost productivity and provide entertainment and information and sometimes even status. At least for a while. At least until they are obsolete. At least until they are garbage.

Electronics are our talismans that ward off the spiritual vacuum of modernity; gilt in Gorilla Glass and cadmium. And in them we find entertainment in lieu of happiness, and exchanges in lieu of actual connections.

And, oh, I am guilty. I am guilty. I am guilty.

I feel that way too. More than most, probably. I’m forever wanting something new. Something I’ve never seen before, that no one else has. Something that will be both an extension and expression of my person. Something that will take me away from the world I actually live in and let me immerse myself in another. Something that will let me see more details, take better pictures, do more at once, work smarter, run faster, live longer.”

Here is the confession, the thrice-repeated mea culpa, alongside a truly Augustinian account of our disordered attachments and loves complete with a Pascalian nod to the diversionary nature of our engagement with technology.

I call this an Augustinian account not only because of the religiously inflected language and the confessional tone. There is also the explicit frame of an unfulfilling quest to fill a primordial emptiness. Augustine’s Confessions amounts to a retrospective narrative of the spiritual quest which takes him from dissatisfaction to dissatisfaction until it culminates in his conversion. He famously frames his narrative at the outset when he prays, “You have made us for yourself, and our hearts are restless till they find their rest in you.” The restless heart knows its own emptiness and seeks, often heroically and tragically, to fill it. It loves and seeks to be loved, but it loves all the wrong things.

Pascal, writing in the shadow of Augustine’s influence, put it thus:

“What else does this craving, and this helplessness, proclaim but that there was once in man a true happiness, of which all that now remains is the empty print and trace? This he tries in vain to fill with everything around him, seeking in things that are not there the help he cannot find in those that are, though none can help, since this infinite abyss can be filled only with an infinite and immutable object; in other words by God himself.”

In a post titled “Making Holes In Our Hearts,” Kevin Kelly agrees to a point with Honan’s diagnosis, but his interpretation is quite different and also worth quoting at length. Here is Kelly:

If we are honest, we must admit that one aspect of the technium is to make holes in our heart. One day recently we decided that we cannot live another day unless we have a smart phone, when a dozen years earlier this need would have dumbfounded us. Now we get angry if the network is slow, but before, when we were innocent, we had no thoughts of the network at all. Now we crave the instant connection of friends, whereas before we were content with weekly, or daily, connections. But we keep inventing new things that make new desires, new longings, new wants, new holes that must be filled.

Yes, this is what technology does to us. Some people are furious that our hearts are pierced this way by the things we make. They see this ever-neediness as a debasement, a lowering of human nobility, the source of our continuous discontentment. I agree that it is the source. New technology forces us to be always chasing the new, which is always disappearing under the next new, a salvation always receding from our grasp.

But I celebrate the never-ending discontentment that the technium brings. Most of what we like about being human is invented. We are different from our animal ancestors in that we are not content to merely survive, but have been incredibly busy making up new itches which we have to scratch, digging extra holes that we have to fill, creating new desires we’ve never had before.

Kelly is on to something here. Discontentment is generative. Dissatisfaction can be productive. When Cain, having murdered his brother, is cursed to be forever a wanderer alienated from God and family, he builds a city in response. Here is an allegory to match Kelly’s observation. The perpetually wandering, alienated heart builds and makes and creates.

But does it follow that we should then celebrate discontentment, dissatisfaction, and unhappiness? I don’t see how. It is hard to cheer on misery, and it is a certain misery that we are talking about here. Perhaps the more appropriate response is the kind of plaintive admiration we reserve for the tragic hero. They may possess a real nobility, but it is finally consumed in despair and destruction.

The narrator of Cain’s story tells us that he built his city in the land called Nod, a name that echoes the Hebrew word for “wandering.” This touch of literary artistry poignantly suggests that even surrounded by the accouterments of civilization the human soul wanders lost and alienated – never satisfied, never home, never secure.

There is at least one other reason why we need not celebrate generative misery. Misery is not the only fount of human creativity. Love, wonderment, compassion, kindness, curiosity, beauty — all of these might also set us to work and marvelously so.

Augustine understood that finding rest for his restless heart in the love of God did not necessarily extinguish all other loves. It merely reordered them. Having found the kind of satisfaction and happiness that our stuff (for lack of a more inclusive word) can never bring does not mean that we can never again create or enjoy the fruits of human creativity. In fact, it likely means that we may create and enjoy more fully, because such creation and enjoyment will not be burdened with the unbearable weight of filling the primordial vacuum of the human heart.

The simplest and only way to enjoy penultimate and impermanent things is to resist the temptation to invest them with the significance and adoration that only ultimate and permanent things can sustain.

Saint Augustine by Philippe de Champaigne, c. 1645