Postman on Media, Politics, and Childhood

In The Disappearance of Childhood, first published in 1982, Neil Postman writes the following:

Without a clear concept of what it means to be an adult, there can be no clear concept of what it means to be a child. Thus, the idea … that our electric information environment is ‘disappearing’ childhood … can also be expressed by saying that our electric information environment is disappearing adulthood.

How so, you ask?

[A]dulthood is largely a product of the printing press. Almost all of the characteristics we associate with adulthood are those that are (and were) either generated or amplified by the requirements of a fully literate culture: the capacity for self-restraint, a tolerance for delayed gratification, a sophisticated ability to think conceptually and sequentially, a preoccupation with both historical continuity and the future, a high valuation of reason and hierarchical order. As electric media move literacy to the periphery of culture and take its place at the center, different attitudes and character traits come to be valued and a new diminished definition of adulthood begins to emerge.

To be clear, Postman is obviously not talking about the number of years one has been alive. Rather, he is talking about a social reality — the idea of adulthood, a particular model of what constitutes adulthood — not a biologically given one.

Postman chooses to begin elaborating this claim with a discussion of “political consciousness and judgment in a society in which television carries the major burden of communicating political information,” about which he has the following to say:

In the television age, political judgment is transformed from an intellectual assessment of propositions to an intuitive and emotional response to the totality of an image. In the television age, people do not so much agree or disagree with politicians as like or dislike them. Television redefines what is meant by ‘sound political judgment’ by making it into an aesthetic rather than a logical matter.

How might we update this discussion of television to account for digital media, especially social media? There’s a hint in the way we refer to the people who engage with each medium. We tend to speak of television’s audience or its viewers but of social media’s users. Social media users, in other words, are not merely passive consumers of media. By drawing us in as active participants, social media weaponizes the superficiality engendered by television. We might also say that “sound political judgment” becomes a matter not only of the candidate’s aesthetic but of our own aesthetic as well.

Finally, here’s a passage from Rudolf Arnheim quoted by Postman:

We must not forget that in the past the inability to transport immediate experience and to convey it to others made the use of language necessary and thus compelled the human mind to develop concepts. For in order to describe things one must draw the general from the specific; one must select, compare, think. When communication can be achieved by pointing with the finger, however, the mouth grows silent, the writing hand stops, and the mind shrinks.

That’s a strong claim right there at the end, and it was made in 1935. Maybe we should hesitate to put the matter quite so starkly. Perhaps we can simply say not that the mind shrinks, but that certain habits of thought atrophy or are left underdeveloped. In any case, I was struck by this paragraph because it seemed a useful way of characterizing online arguments conducted with memes. Replace “pointing with the finger” with “posting a meme.” Of course, the point is that calling these exchanges arguments at all is wrong. Posting a meme to make a point is like shouting QED without ever having presented your proofs. The argument, such as it is, is implicit; it is taken in at a glance and grasped intuitively. There’s very little room for persuasion in this sort of exchange.

The Ethics of Information Literacy

Yesterday, I caught Derek Thompson of The Atlantic discussing the problem of “fake news” on NPR’s Here and Now. It was all very sensible, of course. Thompson impressed upon the audience the importance of media literacy. He urged listeners to examine the provenance of the information they encounter. He also cited an article that appeared in US News & World Report about teaching high schoolers how to critically evaluate online information. The article, drawing on the advice of teachers, presented three keys: (1) teach teens to question the source, (2) help students identify credible sources, and (3) give students regular opportunities to practice vetting information.

This is all fine. I suspect the problem is not limited to teens — an older cohort appears just as susceptible to “fake news,” if not more so — but whatever the case, I spend a good deal of time in my classes doing something like what Thompson recommended. In fact, on more than one occasion, I’ve claimed that among the most important skills teachers can impart to students is the ability to discern the credible from the incredible and the serious from the frivolous. (I suspect the latter distinction is the more important and the more challenging to make.)

But we mustn’t fall into the trap of believing that this is simply a problem of the intellect to be solved with a few pointers and a handful of strategies. There is an ethical dimension to the problem as well because desire and virtue bear upon knowing and understanding. Thompson himself alludes to this ethical dimension, but he speaks of it mostly in the language of cognitive psychology — it is the problem of confirmation bias. This is a useful, but perhaps too narrow, way of understanding the problem. However we frame it, though, the key is this: we must learn to question more than our sources; we must also question ourselves.

I suggest a list of three questions for students (and, by implication, for all of us) to consider. The first two are of the standard sort: (1) Who wrote this? and (2) Why should I trust them?

It would be foolish, in my view, to pretend that any of us can be independent arbiters of the truthfulness of claims made in every discipline or field of knowledge. It is unreasonable to expect that we would all become experts in every field about which we might be expected to have an informed opinion. Consequently, it is better to frame critical examination of sources as a matter of trustworthiness. Can I determine whether or not I have cause to trust the author or the organization that has produced the information I am evaluating? Of course, trustworthiness does not entail truthfulness or accuracy. When trustworthy sources conflict, for instance, we may need to make a judgment call or we might find ourselves unable to arbitrate the competing claims. It inevitably gets complicated.

The third question, however, gets at the ethical dimension: (3) Do I want this to be true?

This question is intended as a diagnostic tool. The goal is to reveal, so far as we might become self-aware about such things, our biases and sympathies. There are three possible answers: yes, no, and I don’t care. Each answer entails its own challenge to discernment. If I want something to be true, and there may be various reasons for this, then I need to do my best to reposition myself as a skeptical critic. If I do not want something to be true, then I need to do my best to reposition myself as a sympathetic advocate. A measure of humility and courage is required in each case.

If I do not care, then there is another sort of problem to overcome. In this case, I may be led astray by a lack of care. I may believe what I first encounter because I am not sufficiently motivated to press further. Whereas it is something like passion or pride that we must guard against when we want to believe or disbelieve a claim, apathy is the problem here.

When I have taught classes on ethics, it has seemed to me that the critical question is not, as it is often assumed to be, “What is the right thing to do?” Rather, the critical question is this: “Why should someone desire to learn what is right and then do it?”

Likewise with the problem of information literacy. It is one thing to be presented with a set of skills and strategies that make us more discerning and critical. It is another, and more important, thing to care about the truth at all, to care more about the truth than about being right.

In short, the business of teaching media literacy or critical thinking skills amounts to a kind of moral education. In a characteristically elaborate footnote in “Authority and American Usage,” David Foster Wallace got at this point, although from the perspective of the writer. In the body of his essay, Wallace writes of “the error that Freshman Composition classes spend all semester trying to keep kids from making—the error of presuming the very audience-agreement that it is really their rhetorical job to earn.” The footnote to this sentence adds the following, emphasis mine:

Helping them eliminate the error involves drumming into student writers two big injunctions: (1) Do not presume that the reader can read your mind — anything you want the reader to visualize or consider or conclude, you must provide; (2) Do not presume that the reader feels the same way that you do about a given experience or issue — your argument cannot just assume as true the very things you’re trying to argue for. Because (1) and (2) are so simple and obvious, it may surprise you to know that they are actually incredibly hard to get students to understand in such a way that the principles inform their writing. The reason for the difficulty is that, in the abstract, (1) and (2) are intellectual, whereas in practice they are more things of the spirit. The injunctions require of the student both the imagination to conceive of the reader as a separate human being and the empathy to realize that this separate person has preferences and confusions and beliefs of her own, p/c/b’s that are just as deserving of respectful consideration as the writer’s. More, (1) and (2) require of students the humility to distinguish between a universal truth (‘This is the way things are, and only an idiot would disagree’) and something that the writer merely opines (‘My reasons for recommending this are as follows:’) . . . . I therefore submit that the hoary cliché ‘Teaching the student to write is teaching the student to think’ sells the enterprise way short. Thinking isn’t even half of it.

I take Wallace’s counsel here to be, more or less, the mirror image of the counsel I’m offering to us as readers.

Finally, I should say that all of the preceding does not begin to touch on much of what we would also need to consider when we’re thinking about media literacy. Most of the above deals with the matter of evaluating content, which is obviously not unimportant, and textual content at that. However, media literacy in the fullest sense would also entail an understanding of more subtle effects arising from the nature of the various tools we use to communicate content, not to mention the economic and political factors conditioning the production and dissemination of information.



Where Have All the Public Intellectuals Gone?

“Professors, We Need You!” announces the title of a Nicholas Kristof op-ed in the NY Times. Kristof goes on to lament the dearth of public intellectuals actively informing American culture regarding “today’s great debates.” He blames this regrettable state of affairs on a series of predictable culprits: tedious academic writing, pressure to publish arcane scholarship in obscure journals, and hyper-specialized areas of research with little bearing on public life.

The responses I’ve read to Kristof’s column tend to grant that Kristof, almost despite himself, has put his finger on something, but go on to explain why his analysis is mostly flawed. Take as one example Corey Robin’s lengthy response: “Look Who Nick Kristof’s Saving Now.”

Part, but only part, of Robin’s response is to point to a host of intellectuals who would very much like to be public intellectuals and who certainly have what it takes to fill that role admirably. He even speaks highly of grad students who are already making names for themselves with timely, intelligent, well-written pieces on matters of public consequence.

And, indeed, many of us, I’m sure, could supplement Robin’s list of publicly engaged intellectuals. The problem as I see it is not a lack of intellectuals; it is the absence of a public.

To be clear, I don’t mean to suggest that there is no intelligent reading public that craves informed, well-crafted pieces that bear on matters of public consequence. That may also be true, but I only believe it on my more cynical days. Of course, I won’t tell you what percentage of my days those cynical ones take up.

What I do mean by saying that there is no public might be better put by saying that there is no public sphere, and I mean this from a media-ecological perspective. Our media ecosystem makes it easier than ever for intellectuals to make their work public, but harder than ever to achieve the status of public intellectual, i.e., someone who is both widely known and widely respected.

Kristof may or may not have had this in mind, but we might think of a public intellectual as someone who specializes in intellectual labor and makes that labor publicly accessible. If this is what we mean by public intellectuals, then, as Robin and others have pointed out, we’re surrounded by public intellectuals. The Internet is crawling with them. We might even say of public intellectuals what Jack says of clever people in “The Importance of Being Earnest”:

“I am sick to death of cleverness. Everybody is clever nowadays. You can’t go anywhere without meeting clever people. The thing has become an absolute public nuisance. I wish to goodness we had a few fools left.”

But 1,000 intellectuals writing publicly may not amount to a single public intellectual in the more traditional sense that Kristof has a hankering for. The more traditional sort of public intellectual, the one who is widely known, widely respected, and influential, may simply have been a function of late print culture, a techno-cultural moment when the last ripples of Enlightenment ideals of rational and civil democratic discourse reached the public through newly minted mass media. The proliferation of media outlets — the 400+ television stations, the countless forums and blogs and Internet magazines, Twitter, etc. — spawns niche markets, and this makes it quite hard for anyone to command the kind of audience and across-the-board respect that the title of public intellectual seems to suggest. When there were fewer gatekeepers of public opinion, and the remnants of a cultural consensus to work with, it would’ve been easier to achieve the status of public intellectual.

I realize that last paragraph trades in big generalizations and leaves out a whole host of factors. Feel free to pick it apart. I’ll leave aside all of the Postman-esque arguments you might imagine for yourself about our entertaining ourselves to death, etc. I’ll also leave aside the fact that a good deal of academic work in a variety of disciplines has been devoted to the critique of a universal public reason that the possibility of a public intellectual assumes.

One last thought: It may be that the craving for public intellectuals is a kind of nostalgic longing for a time when we could reasonably imagine that even though we ourselves couldn’t get an intellectual grip on the complexities of modern society, out there, somewhere, there were smart people at the controls. These mythical public intellectuals we long for were those whose cultural function was to reassure us with their calm, accessible, and smart talk that people who knew what they were doing were steering the ship.[1] I suspect the unnerving truth is that the trade-off for the benefits of an unfathomably complex technological society is that understanding is now beyond the reach of any intellectual, public or otherwise.

______________________________________

[1] The same function is served by the cabals of conspiracy theory dreams.

Walter Ong on Romanticism and Technology

The following excerpts are taken from an article by Walter Ong titled “Romantic Difference and Technology.” The essay can be found in Ong’s Rhetoric, Romance, and Technology: Studies in the Interaction of Expression and Culture.

Ong opens by highlighting the lasting significance of the Romantic movement:

“Romanticism has not been a transient phenomenon. Most and perhaps even all literary and artistic, not to mention scientific, movements since the romantic movement appear to have been only further varieties of romanticism, each in its own way.”

He follows by characterizing subsequent literary and cultural movements as variations of romanticism:

“Early Victorian is attenuated romantic, late Victorian is recuperated romantic, fed on Darwin, Marx, and Comte, the American frontier and the American Adam are primitivist romantic, imagism and much other modern poetry is symbolist romantic, existentialism is super-charged or all-out romantic, programmatic black literature is alien-selfhood romantic, and beatnik and hippie performance is disenfranchized [sic] romantic. Insofar as it is an art form or a substitute for literature, and in other guises, too, activism is most certainly idealist romantic.”

Here is how Ong construes the link between Romanticism and technology, specifically technologies of the word:

“Romanticism and technology, as we shall be suggesting, are mirror images of each other, both being products of man’s dominance over nature and of the noetic abundance which had been created by chirographic and typographic techniques of storing and retrieving knowledge and which had made this dominance over nature possible.”

In other words, print acted as a kind of safety net that encouraged intellectual daring, both technological and literary. This hypothesis depends upon Ong’s understanding of thought and knowledge in oral cultures. The first sentence of the following excerpt is about as close to Ong in a nutshell as you’re going to get:

“Any culture knows only what it can recall. An oral culture, by and large, could recall only what was held in mnemonically serviceable formulas. In formulas thought lived and moved and had its being. This is true not only of the thought in the spectacularly formulary Homeric poems but also of the thought in the oratorically skilled leader or ordinarily articulate warrior or householder of Homeric Greece. In an oral culture, if you cannot think in formulas you cannot think effectively. Thought totally oral in implementation has specifically limited, however beautiful, configurations. A totally oral folk can think some thoughts and not others. It is impossible in an oral culture to produce, for example, the kind of thought pattern in Aristotle’s Art of Rhetoric or in any comparable methodical treatises….”

Here are some highlights of what follows for Ong:

“A typical manifestation of romanticism on which we have focused is interest in the remote, the mysterious, the inaccessible, the ineffable, the unknown. The romantic likes to remind us of how little we know. If we view romanticism in terms of the development of knowledge as we are beginning to understand this development, it is little wonder that as a major movement romanticism appeared so late. From man’s beginnings perhaps well over 500,000 years ago until recent times […] knowledge had been in short supply. To keep up his courage, man had continually to remind himself of how much he knew, to flaunt the rational, the certain, the definite and clear and distinct.”

Further:

“Until print had its effect, man still necessarily carried a heavy load of detail in his mind [….] With knowledge fastened down in visually processed space, man acquired an intellectual security never known before [….] It was precisely at this point that romanticism could and did take hold. For man could face into the unknown with courage or at least equanimity as never before.”

Finally:

“[…] romanticism and technology can be seen to grow out of the same ground, even though at first blush the two appear diametrically opposed, the one, technology, programmatically rational, the other, romanticism, concerned with transrational or arational if not irrational reality [….] romanticism and technology appear at the same time because each grows in its own way out of a noetic abundance such as man had never known before. Technology uses the abundance for practical purposes. Romanticism uses it for assurance and as a springboard to another world.”

This strikes me as a bold and elegant thesis. Does it hold up? Ong paints with some broad strokes, and, particularly where he discusses what an oral culture could and could not have thought, we may want to ask how sure we could ever be of such a claim. That said, Ong’s thesis naturally invites us to explore what transformations of thought and culture are encouraged by digital archives, databases, and artificial memories.

More on Ong: “Memory, Writing, Alienation.”



Studied Responses: Reactions to bin Laden’s Death

Image: CNN Belief Blog

In the moments, hours, and days following the announcement of Osama bin Laden’s death, I was repeatedly struck by the amount of attention paid to the manner in which Americans were responding to it. Almost immediately I began to pick up notes of concerned introspection about the response (e.g., the jubilant crowds gathered at the White House and Ground Zero) and about what the appropriate response ought to be.

This introspection appears to have been most pronounced within religious circles. At Christianity Today, Sarah Pulliam Bailey gathered tweets from a number of evangelical Christian leaders and bloggers addressing the question, “How should Christians respond to Osama bin Laden’s death?” A sizable comment thread formed below the post. At the religion and media website Get Religion, in a post titled “Churches respond to Osama’s death,” we get another round of links to church leaders writing about the appropriate response to the killing of bin Laden.

The topic, however, was also prominent in the more mainstream media. NPR, for example, ran a short piece titled “Is It Wrong to Celebrate Bin Laden’s Death” and another titled “Is Celebrating Death Appropriate?” In the former story we get the following odd piece of reflection:

Laura Cunningham, a 22-year-old Manhattan reveler — gripping a Budweiser in her hand and sitting atop the shoulders of a friend — was part of the crowd at ground zero in the wee hours Monday. As people around her chanted “U-S-A,” Cunningham was struck by the emotional response. She told New York Observer: “It’s weird to celebrate someone’s death. It’s not exactly what we’re here to celebrate, but it’s wonderful that people are happy.”

I say “odd,” because it is not clear that this young lady knew what or why she was celebrating.  “But it’s wonderful that people are happy”?  What?

The NY Times also ran a story titled “Celebrating a Death: Ugly, Maybe, but Only Human.” And, finally, in case you are interested, Noam Chomsky would also like you to know about his reaction to Osama’s death, although I imagine you can guess. Additionally, at CNN’s Belief Blog, you can read “Survey: Most Americans say it’s wrong to celebrate bin Laden’s death,” along with Stephen Prothero’s reflections on the aforementioned survey. You get the idea.

So all of this strikes me as rather interesting. For one thing, I can’t really imagine this sort of self-awareness permeating the responses of previous generations to historical events of this sort. Of course, this may be because the event is sui generis, although I doubt that is quite right. It seems rather another instance of the self-reflexiveness and self-reference that have become characteristic of our society. I might push this further by noting that this post just adds another layer, another mirror, as I reflect on the reflections. My usual explanation for this hypertrophied self-awareness is the collapse of taken-for-granted social structures and customs and the correlated rise of the liberated, spontaneous self. The spontaneous self, as it turns out, is not that spontaneous; rather, it is performed. Performance is studied and aware of itself, conscious of its every response. Naturally, then, we are asking at the cultural level whether our “spontaneous” celebrations were appropriate. Did we play this part right?

This posture seems to me to lack a certain degree of integrity, in the sense that our way of being in the world is not integrated; very little comes naturally, and our actions all feel rather artificial. Perhaps especially at those times when we most wish we could be fully in the moment, we instead feel a certain anxiety about feeling the right way — are we feeling the way we are supposed to be feeling? However, the integrated self is also somewhat opaque to itself; it is capable of acting literally without thought, and thus perhaps thoughtlessly.

I’ll resist the temptation to provide a concluding paragraph that wraps things up neatly with a fresh insight.  More of an aspiration than a temptation, I suppose, if the insight just isn’t there.