Digital Media and the Revenge of Politics

[Caveat lector: More so than usual, the following is an exercise in thinking out loud. I send it out into the ether to be battered into shape.]

Near the close of my last post, I wrote, “The arc of digital media is bending toward epistemic nihilism.”

It’s a line to which I frequently resort as a way of addressing a variety of developments in the sphere of digital media that have, as I see it, eroded our confidence in the possibility of public knowledge. I’m using the phrase public knowledge to get at what we believe together that is also of public consequence. This is an imperfect distinction worth teasing out, but I’m just going to let it go at that right now.

When I use that line about the arc of digital media, I have in mind phenomena like the facility with which digital media can be manipulated and, more recently, the facility with which realistic digital media can be fabricated. I’m thinking as well of the hyper-pluralism that is a function of the way digital media connect us, bringing conflicting claims, beliefs, and narratives into close proximity. “The global village,” McLuhan told us, “is a place of very arduous interfaces and very abrasive situations.”

It occurs to me, though, that it might be worth making a clarification:  digital media does not create the conditions out of which the problem arises.

I’ve thought now and again about how we are recapitulating certain aspects of the early modern history of Europe. At some point last year I shot off an off-the-cuff tweet to this effect: “Thesis: If the digital revolution is analogous to the print revolution, then we’re entering our Wars of Religion phase.”

Although the story is more complicated than this, there is something to be said for framing the emergence of the modern world as a response to an epistemic crisis occasioned by the dissolution of what we might think of as the medieval world picture (see Stephen Toulmin’s Cosmopolis: The Hidden Agenda of Modernity, for example).

The path that emerged as a way toward a solution to that crisis amounted to a quest for certainty, one that took objectivity, abstraction, and neutrality as methodological preconditions for the progress of both science and politics, that is, for the re-emergence of public knowledge. The right method, the proper degree of alienation from the particulars of our situation, the translation of observable phenomena into the realm of mathematical abstraction—these would lead us away from the uncertainty and often violent contentiousness that characterized the dissolution of the premodern world picture. The idea was to reconstitute the conditions for the emergence of public truth and, hence, public order.

Technology (or, better, technologies) plays an essential role in this story, but the role that it plays varies and shifts over time. Early on, for example, in the form of the printing press, it accelerates the crisis of public knowledge, generating the pluralism of truth claims that undermine the old consensus. The same technology also comes to play a critical role in creating the conditions under which modern forms of public knowledge can emerge by sustaining the plausibility of a realm of cool, detached reason.

Consider as well how we impute to certain technologies the very characteristics we believe essential to public knowledge in the modern world (objectivity, neutrality, etc.). Think of photography, for example, and the degree to which we tend to believe that a photographic image is an objective and thus trustworthy representation of the truth of things. More recently, algorithms have been burdened with similar assumptions. Because they are cutting-edge technologies feeding off of “raw data,” some believe that they will necessarily yield unbiased and objectively true results. The problems with this view are, of course, well documented (here and here, for example).

The general progression has been to increasingly turn to technologies in order to better achieve the conditions under which we came to believe public knowledge could exist. Our crisis stems from the growing realization that our technologies themselves are not neutral or objective arbiters of public knowledge and, what’s more, that they may now actually be used to undermine the possibility of public knowledge.

The point, then, is this: it’s not that digital media necessarily leads to epistemic nihilism; it’s that digital media leads to epistemic nihilism given the conditions for public knowledge that have held sway in the modern world. Seen in this light, digital media, like print before it, is helping dissolve an older intellectual and political order. It is doing so because the trajectory we set out on some 400 years ago has more or less played itself out.

One last thought for now. According to Arendt, “The trouble is that factual truth, like all other truth, peremptorily claims to be acknowledged and precludes debate, and debate constitutes the very essence of political life. The modes of thought and communication that deal with truth, if seen from the political perspective, are necessarily domineering; they don’t take into account other people’s opinions, and taking these into account is the hallmark of all strictly political thinking.”

In other words, what if the technocratic strain within modern political culture, the drive to ground politics in truth (or facts) is actually the drive to transcend the political altogether? What if the age of electronic/mass media, the brief interregnum between the high water mark of the age of literacy and the digital age, was in some ways merely a momentary deviation from the norm during which politics could appear to be about consensus rather than struggle? In this light the political consequences of digital media might simply be characterized as the revenge of politics, although in a different and often disconcerting mode.

Digital Bunburying

Here are a few lines from Oscar Wilde’s The Importance of Being Earnest, which reveal the play’s key plot device. Stay with me, it’s going somewhere.

ALGERNON  Well, that is exactly what dentists always do. Now, go on! Tell me the whole thing. I may mention that I have always suspected you of being a confirmed and secret Bunburyist; and I am quite sure of it now.

JACK  Bunburyist? What on earth do you mean by a Bunburyist?

ALGERNON  I’ll reveal to you the meaning of that incomparable expression as soon as you are kind enough to inform me why you are Ernest in town and Jack in the country.

[…]

JACK  My dear fellow, there is nothing improbable about my explanation at all. In fact it’s perfectly ordinary. Old Mr. Thomas Cardew, who adopted me when I was a little boy, made me in his will guardian to his grand-daughter, Miss Cecily Cardew. Cecily, who addresses me as her uncle from motives of respect that you could not possibly appreciate, lives at my place in the country under the charge of her admirable governess, Miss Prism.

ALGERNON  Where is that place in the country, by the way?

JACK  That is nothing to you, dear boy. You are not going to be invited . . . I may tell you candidly that the place is not in Shropshire.

ALGERNON  I suspected that, my dear fellow! I have Bunburyed all over Shropshire on two separate occasions. Now, go on. Why are you Ernest in town and Jack in the country?

JACK  My dear Algy, I don’t know whether you will be able to understand my real motives. You are hardly serious enough. When one is placed in the position of guardian, one has to adopt a very high moral tone on all subjects. It’s one’s duty to do so. And as a high moral tone can hardly be said to conduce very much to either one’s health or one’s happiness, in order to get up to town I have always pretended to have a younger brother of the name of Ernest, who lives in the Albany, and gets into the most dreadful scrapes. That, my dear Algy, is the whole truth pure and simple.

[…]

ALGERNON  What you really are is a Bunburyist. I was quite right in saying you were a Bunburyist. You are one of the most advanced Bunburyists I know.

JACK  What on earth do you mean?

ALGERNON  You have invented a very useful younger brother called Ernest, in order that you may be able to come up to town as often as you like. I have invented an invaluable permanent invalid called Bunbury, in order that I may be able to go down into the country whenever I choose. Bunbury is perfectly invaluable. If it wasn’t for Bunbury’s extraordinary bad health, for instance, I wouldn’t be able to dine with you at Willis’s to-night, for I have been really engaged to Aunt Augusta for more than a week.

And now consider a proposal that was brought to my attention today in the form of a tweet.

The tweet, of course, is in jest—in the spirit of gallows humor, I’d suggest—but it usefully brings together two trends that should concern us: ubiquitous surveillance and deepfake technology. What it suggests is that you need a fake version of yourself to escape the ubiquity of surveillance, although, of course, the ubiquity of deepfake technology also appears to require ever more invasive forms of verification, which is to say, ever more surveillance.

Relatedly, take a moment to visit This Person Does Not Exist. The image you’ll see is of a person who … does not exist. The image is generated by generative adversarial networks. Hit refresh and you’ll get another.
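For the technically curious, the trick behind such images is worth a brief sketch. Two networks are trained against each other: a generator that fabricates images from random noise and a discriminator that tries to tell fabrications from photographs; each improves by exploiting the other’s failures. Below is a deliberately minimal sketch in PyTorch, with toy dimensions of my own choosing. The site itself relies, as I understand it, on NVIDIA’s far more elaborate StyleGAN, so take this as an illustration of the adversarial logic, not the actual system.

```python
# A toy generative adversarial network. Dimensions are illustrative:
# 64-d noise vectors, 28x28 grayscale images flattened to 784 values.
import torch
import torch.nn as nn

# Generator: turns random noise into a (flattened) image.
G = nn.Sequential(
    nn.Linear(64, 256), nn.ReLU(),
    nn.Linear(256, 784), nn.Tanh(),
)

# Discriminator: scores how "real" an image looks (as a raw logit).
D = nn.Sequential(
    nn.Linear(784, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(real_images: torch.Tensor) -> None:
    batch = real_images.size(0)
    fakes = G(torch.randn(batch, 64))

    # Train D to call real images real (1) and fabrications fake (0).
    d_loss = (loss_fn(D(real_images), torch.ones(batch, 1)) +
              loss_fn(D(fakes.detach()), torch.zeros(batch, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train G to fool D into calling its fabrications real.
    g_loss = loss_fn(D(fakes), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The thing to notice is that the forger and the detector are halves of one system: every improvement in detection is immediately recycled into better forgery.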

After you’ve perused a few images, read Kyle McDonald’s “How to recognize fake AI-generated images.” Things to look for include weird teeth, asymmetry, surreal backgrounds, mismatched or missing earrings. That helps. For now.

More news on the deepfake front: “The creators of a revolutionary AI system that can write news stories and works of fiction – dubbed ‘deepfakes for text’ – have taken the unusual step of not releasing their research publicly, for fear of potential misuse.”

Here’s a link to OpenAI’s post about their work: Better Language Models and Their Implications. In it they explain their decision not to release their research:

Due to concerns about large language models being used to generate deceptive, biased, or abusive language at scale, we are only releasing a much smaller version of GPT-2 along with sampling code. We are not releasing the dataset, training code, or GPT-2 model weights. Nearly a year ago we wrote in the OpenAI Charter: “we expect that safety and security concerns will reduce our traditional publishing in the future, while increasing the importance of sharing safety, policy, and standards research,” and we see this current work as potentially representing the early beginnings of such concerns, which we expect may grow over time. This decision, as well as our discussion of it, is an experiment: while we are not sure that it is the right decision today, we believe that the AI community will eventually need to tackle the issue of publication norms in a thoughtful way in certain research areas.
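For those curious what sampling from the released small model looks like in practice, here is a minimal sketch. I’m using the Hugging Face transformers interface rather than OpenAI’s own sampling code, and the prompt and sampling parameters are illustrative choices of mine, not anything OpenAI prescribes.

```python
# Sampling from the small released GPT-2 model via the Hugging Face
# transformers library (pip install transformers torch). The prompt
# and sampling parameters below are illustrative.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # the small model
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "The arc of digital media is bending toward"
inputs = tokenizer(prompt, return_tensors="pt")

# Top-k sampling: at each step, sample from the 40 most likely tokens.
output = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Even the small model suffices to make the worry concrete: plausible prose, on any prompt, at effectively no cost.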

All of this recalls Max Read’s recent piece, “How Much of the Internet is Fake?”

How much of the internet is fake? Studies generally suggest that, year after year, less than 60 percent of web traffic is human; some years, according to some researchers, a healthy majority of it is bot. For a period of time in 2013, the Times reported this year, a full half of YouTube traffic was “bots masquerading as people,” a portion so high that employees feared an inflection point after which YouTube’s systems for detecting fraudulent traffic would begin to regard bot traffic as real and human traffic as fake. They called this hypothetical event “the Inversion.”

The arc of digital media is bending toward epistemic nihilism, as I’ve been inclined to put it on Twitter; it amounts to a casual indifference to much of what has counted as evidence in recent memory. There is, of course, something decidedly Baudrillardian about this. But even to suggest that Baudrillard has something to tell us about our moment is to acknowledge that digital media has not necessarily initiated the trajectory; it has merely accelerated our descent, exponentially so perhaps.

Remembering Iris Murdoch

The philosopher and novelist Iris Murdoch died on this day in 1999. I’ve appreciated what I have read of her work, although I’m sorry to admit that I’ve yet to read any of her fiction.

She was part of a set of formidable twentieth-century philosophers that included Elizabeth Anscombe, Philippa Foot, and Mary Midgley, who passed away last year. You can read about the cohort in this essay, which focuses on Foot. Here’s an excerpt:

“As Murdoch put it to a New Yorker journalist, what they were united in denying was the claim that ‘the human being was the monarch of the Universe, that he constructed his values from scratch’. The four of them, by contrast, were interested in ‘the reality that surrounds man – transcendent or whatever’.

Murdoch’s ‘or whatever’ was a reference to the things that divided them: she herself was drawn to a vision of a Universe where ‘the Good’, if not God, was real; Anscombe was a devout Catholic; Foot – in her own words – was a ‘card-carrying atheist’. But all three of them took seriously the claim that moral judgments are an attempt, however flawed in particular cases, to get at something true independently of human choices.”

My first encounter with Murdoch came years ago through this selection from The Sovereignty of Good, in which Murdoch describes the relationship between love and learning as well as the value of the intellectual virtues:

“If I am learning, for instance, Russian, I am confronted by an authoritative structure which commands my respect. The task is difficult and the goal is distant and perhaps never entirely attainable. My work is a progressive revelation of something which exists independently of me. Attention is rewarded by a knowledge of reality. Love of Russian leads me away from myself towards something alien to me, something which my consciousness cannot take over, swallow up, deny or make unreal. The honesty and humility required of the student — not to pretend to know what one does not know — is the preparation for the honesty and humility of the scholar who does not even feel tempted to suppress the fact which damns his theory.”

Elsewhere in The Sovereignty of Good, she reflected on the relationship between love, seeing, attention, and freedom:

“It is in the capacity to love, that is to see, that the liberation of the soul from fantasy consists. The freedom which is a proper human goal is the freedom from fantasy, that is the realism of compassion. What I have called fantasy, the proliferation of blinding self-centered aims and images, is itself a powerful system of energy, and most of what is often called ‘will’ or ‘willing’ belongs to this system. What counteracts the system is attention to reality inspired by, consisting of, love.”

She added,

“Freedom is not strictly the exercise of the will, but rather the experience of accurate vision which, when this becomes appropriate, occasions actions. It is what lies behind and in between actions and prompts them that is important, and it is this area which should be purified. By the time the moment of choice has arrived the quality of attention has probably determined the nature of the act.”

Her discussion of attention, incidentally, was influenced by Simone Weil, from whom she says she borrowed the term “to express the idea of a just and loving gaze directed upon an individual reality.”

One last excerpt:

“Words are the most subtle symbols which we possess and our human fabric depends on them. The living and radical nature of language is something which we forget at our peril.”

Needless to say, I think her work remains relevant and even urgent.

For more about Murdoch, see this recent essay in LARB, “Innumerable Intentions and Charms”: On Gary Browning’s “Why Iris Murdoch Matters.”

Don’t Romanticize the Present

Steven Pinker and Jason Hickel have recently engaged in a back-and-forth about whether or not global poverty is decreasing. The first salvo was an essay by Hickel in the Guardian targeting claims made by Bill Gates. Pinker responded here, and Hickel posted his rejoinder at his site.

I’ll let you dive into the debate if you’re so inclined. The exchange is of interest to me, in part, because evaluations of modern technology are often intertwined with this larger debate about the relative merits of what, for brevity’s sake, we may simply call modernity (although, of course, it’s complicated).

I’m especially interested in a rhetorical move that is often employed in these kinds of debates:  it amounts to the charge of romanticizing the past.

So, for example, Pinker claims, “Hickel’s picture of the past is a romantic fairy tale, devoid of citations or evidence.” I’ll note in passing Hickel’s response, summed up in this line: “All of this violence, and much more, gets elided in your narrative and repackaged as a happy story of progress. And you say I’m the one possessed of romantic fairy tales.” Hickel, in my view, gets the better of Pinker on this point.

In any case, the trope is recurring and, as I see it, tiresome. I wrote about it quite early in the life of this blog when I explained that I did not, in fact, wish to be a medieval peasant.

More recently, Matt Stoller tweeted, “When I criticize big tech monopolies the bad faith response is often a variant of ‘so you want to go back to horses and buggies?!?'” Stoller encountered some variant of this line so often that he was searching for a simple term by which to refer to it. It’s a Borg Complex symptom, as far as I’m concerned.

At a forum about technology and human flourishing I recently attended, the moderator, a fine scholar whose work I admire, explicitly cautioned us in his opening statements against romanticizing the past.

It would take no time at all to find similar examples, especially if you expand “romanticizing the past” to include the equally common charge of reactionary nostalgia. Both betray a palpable anxiousness about upholding the superiority of the present.

I understand the impulse, I really do. I think it was from Alan Jacobs that I first learned about the poet W. H. Auden’s distinction between those whose tendency is to look longingly back at some better age in the past and those who look hopefully toward some ideal future:  Arcadians and Utopians respectively, he called them. Auden took these to be matters of temperament. If so, then I would readily admit to being temperamentally Arcadian. For that reason, I think I well understand the temptation and try to be on guard against it.

That said, stern warnings against romanticizing the past sometimes reveal a susceptibility to another temptation:  romanticizing the present.

This is not altogether surprising. To be modern is to define oneself by one’s location in time, specifically by being on the leading edge of time. Novelty becomes a raison d’être.

As the historian Michael Gillespie has put it,

… to think of oneself as modern is to define one’s being in terms of time. This is remarkable. In previous ages and other places, people have defined themselves in terms of their land or place, their race or ethnic group, their traditions or their gods, but not explicitly in terms of time …  To be modern means to be “new,” to be an unprecedented event in the flow of time, a first beginning, something different than anything that has come before, a novel way of being in the world, ultimately not even a form of being but a form of becoming.

Within this cultural logic, the possibility that something, anything, was better in the past is not only a matter of error, it may be experienced as a threat to one’s moral compass and identity. Over time, perhaps principally through the nineteenth century, progress displaced providence and, consequently, optimism displaced hope. The older theological categories were simply secularized. Capital-P Progress, then, despite its many critics, still does a lot of work within our intellectual and moral frameworks.

Whatever its sources, the knee-jerk charge of romanticizing the past or of succumbing to reactionary nostalgia often amounts to a refusal to think about technology or take responsibility for it.

As the late Paul Virilio once put it, “I believe that you must appreciate technology just like art. You wouldn’t tell an art connoisseur that he can’t prefer abstractionism to expressionism. To love is to choose. And today, we’re losing this. Love has become an obligation.”

We are not obligated to love technology. This is so not only because love, in this instance, ought not to be an obligation but also because there is no such thing as technology. By this I mean simply that technology is a category of dubious utility. If we allow it to stand as an umbrella term for everything from modern dentistry to the apparatus of ubiquitous surveillance, then we are forced to either accept modern technology in toto or reject it in toto. We are thus discouraged from thoughtful discrimination and responsible judgment. It is within this frame that the charge of romanticizing the past as a rejoinder to any criticism of technology operates. And it is this frame that we must reject. Modern technology is not good by virtue of its being modern. Past configurations of the techno-social milieu are not bad by virtue of their being past.

We should romanticize neither the past nor the present, nor the future for that matter. We should think critically about how we develop, adopt, and implement technology, so far as it is in our power to do so. Such thinking stands only to benefit from an engagement with the past as, if nothing else, a point of reference. The point, however, is not a retrieval of the past but a better ordering of the present and future.


Listening To Those At The Hinges

A passing thought this evening: we should be attentive to the experience and testimony of those whose lives turn out to be the hinges on which one era closes and another opens.

There are several ways to parse that, of course, as many as there are ways of understanding what amounts to a new era. We might, for example, speak of it from any number of perspectives:  a new political era, a new economic era, a new artistic era, etc.

I’m thinking chiefly of new technological eras, of course.

And obviously, I’m thinking of significant transitions, those whose ramifications spill out into various domains of our personal and social lives. One might think of the transitions marked by the advent of printing, electric power grids, or the automobile.

In cases drawn from the more distant past—printing, for instance, or even the development of writing—it may be harder to pinpoint a hinge generation because the changes played out at a relatively slow pace. The closer we get to our present time, though, the more rapidly transitions seem to unfold: within a lifetime rather than across lifetimes.

The most obvious case in point, one that many of us are able to speak of from personal experience, is the transition from the world before the commercial internet to the world after. I’m among those old enough to have a living memory of the world before the internet; AOL came to my home as I approached twenty years of age. Perhaps you are as well, or perhaps you have no memory of a world in which the internet was not a pervasive fact of life.

I suspect the development of the smartphone is similarly consequential. There are more of us, of course, who remember the world before smartphones became more or less ubiquitous in the developed world, but already there are those entering adulthood for whom that is not the case.

I was reminded of a couple of paragraphs from Michael Heim’s 1987 book on the meaning of word processing technology. Responding to those who might wonder whether it was too soon to take stock of a then-nascent technology, Heim writes,

“Yet it is precisely this point in time that causes us to become philosophical. For it is at the moment of such transitions that the past becomes clear as a past, as obsolescent, and the future becomes clear as destiny, a challenge of the unknown. A philosophical study of digital writing made five or ten years from now would be better than one written now in the sense of being more comprehensive, more fully certain in its grasp of the new writing. At the same time, however, the felt contrast with the older writing technology would have become faded by the gradually increasing distance from typewritten and mechanical writing. Like our involvement with the automobile, that with processing texts will grow in transparency—until it becomes a condition of our daily life, taken for granted.

“But what is granted to us in each epoch was at one time a beginning, a start, a change that was startling. Though the conditions of daily living do become transparent, they still draw upon our energies and upon the time of our lives; they soon become necessary conditions and come to structure our lives. It is incumbent on us then to grow philosophical while we can still be startled, for philosophy, if Aristotle can be trusted, begins in wonder, and, as Heraclitus suggests, ‘One should not act or speak as if asleep.’”

I’m thinking about this just now after taking a look at Christopher Mims’s piece this morning in the Wall Street Journal, “Generation Z’s 7 Lessons for Surviving in Our Tech-Obsessed World.” Lesson six, for example, reads, “Gen Z thinks concerns about screens are overblown.”

My point is not so much that this is wrong, although I tend to think that it is; it’s that this isn’t really a lesson so much as it is the testimony of some people’s experience. As such it is fine, but it also happens to be the testimony of people who may lack at least one relevant, if not critical, point of comparison. To put the matter more pointedly, the rejoinder that flits into my mind is simply this: What do they know?

That’s not entirely fair, of course. They know some things I don’t, I’m sure. But how do we form judgements when we can’t quite imagine the world otherwise? It is more than that, though. I suppose with enough information and a measure of empathy, one can begin to imagine a world that is no longer the case. But you can’t quite feel it in the way that those with a living memory of the experience of being alive before the world turned over can.

If we care to understand the meaning of change, we should heed the testimony of those on whose lives the times have hinged. Their perspective and the kind of knowledge they carry, difficult to articulate as it may be, are unique and valuable.

As I have typed this post out, I had the sense that I’ve trodden this ground before, and, indeed, I wrote along very similar lines a few years back. If it is all the same with you, dear reader, I’m going to close with what I wrote then. I’m not sure that I could improve very much on it now. What follows draws on an essay by Jonathan Franzen. I know how we are all supposed to feel about Franzen, but just let that go for a moment. He was more right than wrong, I think, when he wrote, reflecting on the work of Karl Kraus, that “As long as modernity lasts, all days will feel to someone like the last days of humanity,” what he called our “personal apocalypses.”

This is, perhaps, a bit melodramatic, and it is certainly not all that could be said on the matter, or all that should be said. But Franzen is telling us something about what it feels like to be alive these days. It’s true, Franzen is not the best public face for those who are marginalized and swept aside by the tides of technological change, tides which do not lift all boats, tides which may, in fact, sink a great many. But there are such people, and we do well to temper our enthusiasm long enough to enter, so far as it is possible, into their experience. In fact, precisely because we do not have a common culture to fall back on, we must work extraordinarily hard to understand one another.

Franzen is still working on the assumption that these little personal apocalypses are a generational phenomenon. I’d argue that he’s underestimated the situation. The rate of change may be such that the apocalypses are now intra-generational. It is not simply that my world is not my parents’ world; it is that my world now is not what my world was a decade ago. We are all exiles now, displaced from a world we cannot reach because it fades away just as its contours begin to materialize. This explains why, as I wrote earlier this year, nostalgia is not so much a desire for a place or a time as it is a desire for some lost version of ourselves. We are like Margaret who, in Hopkins’s poem, laments the passing of the seasons, Margaret to whom the poet’s voice says kindly, “It is Margaret you mourn for.”

Although I do believe that certain kinds of change ought to be resisted—I’d be a fool not to—none of what I’ve been trying to get at in this post is about resisting change in itself. Rather, I think all I’ve been trying to say is this: we must learn to take account of how differently we experience the changing world so that we might best help one another as we live through the change that must come. That is all.

