Privacy Is Not Dying, We’re Killing It

I’ve been thinking again about how the virtues we claim to desire are so often undermined by the technologically mediated practices in which we persistently engage. Put very simply, what we say we value is frequently undermined by what we actually do. We often fail to recognize this because we tend to forget that how we do something (and what we do it with) can be as important, in the long run anyway, as what we do or even why we do it. This is one way of understanding the truth captured by McLuhan’s dictum, “the medium is the message.”

The immediate catalyst for this line of thought was a recent Kara Swisher op-ed about Jeff Bezos’s leaked texts and the “death of privacy.” In it, Swisher writes, “The sweet nothings that Mr. Bezos was sending to one person should not have turned into tweets for the entire world to see and, worse, that most everyone assumed were O.K. to see.”

She concludes her piece with these reflections:

Of course, we also do this to ourselves. Not to blame the victim (and there are plenty of those in this case, including Mr. Bezos), but we choose to put ourselves on display. Post your photos from your vacation to Aruba on the ever-changing wall of the performative museum that is Instagram? Sure! Write a long soliloquy about your fights with neighbors and great-uncles on Facebook? Sign me up! Tweet about a dishwasher mishap with a big box retailer on Twitter? Wait, that’s me.

I think you get my point here. We are both the fodder for and the creators of the noise pollution that is mucking up so much, including the national discourse. That national discourse just gave us Mr. Bezos as one day’s entertainment.

Are you not entertained? I am, and I am also ashamed.

It was refreshing to read that, frankly. More often than not in these cases, we tend to focus on the depredations of the tech companies involved or the technological accident or the bad actors, and, of course, there’s often plenty to focus on in each case. But this line of analysis sometimes lets us off the hook a bit too easily. Privacy may not be dead, but it is morphing, and it is doing so in part because of how we habitually conduct ourselves and how our tools mediate our perception.

So we say we value privacy, but we hardly understand what we mean by it. Privacy flourishes in the attention economy to the same degree that contentment flourishes in the consumer economy, which is to say not at all. Quietly and without acknowledging as much, we’ve turned the old virtue into a vice.

We want to live in public but also control what happens to the slices of life we publicize. Or we recoil at the thought of our foibles being turned into one day’s entertainment on Twitter but, as Swisher notes, we nonchalantly consume such entertainment when someone else is the victim.

Swisher’s “Are you not entertained?” obviously recalls Maximus’s demand of the audience in the film Gladiator. It may seem like an absurd question, but let us at least consider it for a moment: how different is a Twitter mob from the ancient audiences of the gladiatorial spectacle? Do we believe that such mobs cannot spill over into “real world” violence or that they cannot otherwise destroy a life? One difference of consequence, I suppose, is that at least the ancient audience did not bathe itself in self-righteousness.

Naturally, all of this is just an extension of what used to be the case with celebrities in the age of electronic media and print tabloids. Digital media simply democratizes both the publicity and its consequences. Celebrity is now virality, and it could happen, potentially, to anyone, whether they want it to or not. The fifteen minutes of fame Warhol predicted have become whatever the life cycle of a hashtag happens to be; only the consequences linger far longer. Warhol did not anticipate the revenge of memory in the digital age.

The Discourse Is Poisoned, but You Already Knew That

It occurred to me today that one of Nathaniel Hawthorne’s short stories may indirectly teach us something about the dysfunctional dynamics of social media discourse. The story is “Young Goodman Brown”; you may remember it from a high school or undergrad English class. It is one of Hawthorne’s most frequently anthologized tales.

The story is about a seemingly ordinary young man who ventures out to the forest for an undisclosed but apparently illicit errand, leaving behind his newlywed bride, Faith. Along the way, he meets a peculiar but distinguished man, who claims to have known his father and his father’s father. The distinguished gentleman turns out to be the Devil, of course, and, by and by, reveals to young goodman Brown how all of the town’s most upstanding citizens, ministers and civic leaders included, are actually his very own subjects. Brown even discovers that his dear Faith has found herself in the midst of the infernal gathering in the forest.

At the climactic point, Brown cries out, “Faith! Faith! Look up to Heaven, and resist the Wicked One!”

What happens next, Hawthorne cloaks in mystery. Brown suddenly, as if awakening from a dream, finds himself alone in the quiet forest with no trace of another person anywhere. Although he remained uncertain about what had happened to him, he emerged a changed man. Now he could not look on anyone without seeing vice and hypocrisy. He trusted no one and imagined everyone, no matter how saintly, to be a vile hypocrite. “A stern, a sad, a darkly meditative, a distrustful, if not a desperate man, did he become, from the night of that fearful dream.”

Hawthorne, who, in my view, is particularly adept in his exploration of intersubjectivity, presents us with a story about knowledge acquired and innocence lost. The knowledge Brown gains, however, is putatively about the true nature of those around him, but the story leaves the status of that knowledge in doubt. It is enough, though, that Brown believes that he knows the truth. This knowledge, of course, poisons all his relationships and ultimately destroys him.

I recalled Hawthorne’s story this morning as I read Roger Berkowitz’s discussion of a recent essay by Peter Baehr, “Against unmasking: five techniques that destroy dialogue and how to avoid them.” Berkowitz is the director of the Hannah Arendt Center for Politics and Humanities, and his comments draw on Arendt’s On Revolution:

“The human heart is a place of darkness.” Thus, the demand for purity in motives is an impossible demand that “transforms all actors into hypocrites.”

For Arendt, “the moment the display of motives begins, hypocrisy begins to poison all human relations.” And unmasking is the name that Arendt gives to the politics of virtue, the demand that all hypocrites be unmasked.

The danger in unmasking is that all people must, as people, where [sic] masks. The word “person” is from the Latin persona, which means literally to “sound-through.” A person in ancient Rome was a citizen, one who wore the mask of citizenship. “Without his persona, there would be an individual without rights and duties, perhaps a ‘natural man’ … but certainly a politically irrelevant being.”

For his part, Baehr wrote,

Unmasking inverts people’s statements and makes them look foolish. It reduces a concept or a theory to the supposed ideological position of the writer. It trades on a mistaken concept of illusion. And, more generally, it burdens enquiry with a radical agenda of emancipation that people of different views have no reason to accept as valid. In politics, writers who adopt the unmasking style repeatedly treat other people not as fellow citizens with rival views of the good but as villains. In some revolutionary situations unmasking weaponization is the rhetoric of mass murder. And beyond these extremes, unmasking stokes mutual contempt.

In Hawthorne’s story, Brown ends up an exemplar of just such an unmasking style. He believes that he has seen all of his fellow citizens unmasked, and he treats them accordingly.

If you come to believe that you know the truth about someone or that you know their “real” self, then there is nothing else to learn. There is no need to listen or to observe. You can assume that every argument is conducted in bad faith. There can, in fact, be no dialogue at all. Conversation is altogether precluded.

In a recent essay that was widely discussed, philosopher Justin E. H. Smith explores these dynamics by way of that bingo card meme listing a set of predictable comments you can expect from a person who has been ideologically pigeonholed. The presumption is that such a person’s view “is not actually a considered view at all, but only an algorithmically predictable bit of output from the particular program he is running.”

“But something’s wrong here,” he goes on to say, commenting on what an appreciation of William Burroughs does and does not mean:

Burroughs does not in fact entail the others, and the strong and mature part of the self—that is to say the part that resists the forces that would beat all human subjectivity down into an algorithm—knows this. But human subjects are vanishingly small beneath the tsunami of likes, views, clicks and other metrics that is currently transforming selves into financialized vectors of data. This financialization is complete, one might suppose, when the algorithms make the leap from machines originally meant only to assist human subjects, into the way these human subjects constitute themselves and think about themselves, their tastes and values, and their relations with others.

“This gutting of our human subjecthood,” Smith goes on to argue, “is currently being stoked and exacerbated, and integrated into a causal loop with, the financial incentives of the tech companies. People are now speaking in a way that results directly from the recent moneyballing of all of human existence.”

I would add just one more dimension to these considerations. The unmasking style takes on a distinct quality on social media, and I’m not sure that Smith’s essay touches on this.

Imagine a young goodman Brown today venturing out into the dark forests of social media. What would be revealed to him by the distinguished gentleman is not so much the moral failure or vices of his contemporaries but rather their supposed inauthenticity. It would be revealed to him that everyone is in on the moneyballing of human subjectivity, that everyone is in on the attention racket.

Consider the Instagram egg. You know, the stock photo of an egg that quickly became the most “liked” image in Instagram’s history. Writing about the account for Wired, Louise Matsakis explained,

the creator is clearly familiar with some of the engines of online fame. They tagged a number of people and publications that regularly cover viral memes like LADBible, Mashable, BuzzFeed, the Daily Mail, and YouTube star PewDiePie in the egg photo itself. Once the account took off, they began to sell “official” merchandise, including T-shirts that say “I LIKED THE EGG,” which go for $19.50.

Moreover,

The account behind the egg, which has nearly 6 million followers, is also an incredibly valuable marketing vehicle. Brands could spend thousands of dollars to advertise with it, according to sources in the influencer marketing industry who spoke with Recode. When the egg goes out of style, its creator could also sell the account or pivot to sharing another type of content with the massive audience they’ve already attracted.

There’s more, but you get the point. The lesson, of sorts, to which the author brought the article was this: “The egg and other seemingly meaningless internet fads are beloved because they stand in contrast to the manicured, optimized content that often fills our social media feeds, especially from people like the Kardashians.”

I don’t know. It seems to me that the contrast is actually only superficial. They are all playing the game. Or rather, they are playing the game; the rest of us think we’re playing the game but, in fact, are simply getting played. But the point, for present purposes, is that we take for granted that, whether they are winning or losing, everyone is playing the attention game, which, we reason, tacitly perhaps, implies that they are by definition conducting themselves in bad faith. And that is why, to borrow the title of Smith’s essay, it’s all over.

It is one thing to believe that someone is a moral hypocrite and should not be trusted. It is another thing altogether to believe that no one is ever, whether in matters of politics or in the expression of their most inane affinities, doing anything more than posturing for attention of some kind or another. That poisons everything. And so long as we are engaging on social media, especially with people we do not otherwise know offline, it is virtually unavoidable. The structure of social media demands that we internally experience ourselves as performers calling on the attention of an audience. And we know, whether we hate ourselves for it or not, that the insidious metrics always threaten to enslave us. So, we assume, it must be with everyone else.

We are thus tempted simultaneously, and somewhat paradoxically, to believe that those we encounter online are necessarily involved in an inauthentic identity game and that we are capable of ascertaining the truth about them, even on the slimmest of evidence. (Viewed in this light, there is, in fact, a dialectical relationship between the kind of subjectivity Smith commends and the consequences of the moneyballing he decries.) Or, to put it another way, we believe we know the truth about everyone and the truth we know is that there is no truth to be known. So our public sphere takes on not a cynical quality, but a nihilistic one. That is the difference between believing that everyone is a moral hypocrite and believing that everyone is inauthentically posturing for attention.

It is difficult to see how any good thing can come out of these conditions, especially when they spill out beyond social media platforms, as they necessarily must.

Technology Is a Branch of Moral Philosophy

“Whether or not it draws on new scientific research, technology is a branch of moral philosophy, not of science.”

Or so argued Paul Goodman in a 1969 essay for the New York Review of Books titled “Can Technology Be Humane?” I first encountered the line in Postman’s Technopoly and was reminded of it a few years back in a post by Nick Carr. I commented briefly on the essay around that same time, but found myself thinking about that line again more recently. Goodman’s essay, if nothing else, warrants our consideration for how it presages and informs some of the present discussion about humane technology.

Technology, Goodman went on to write in explanation of his claim,

“aims at prudent goods for the commonweal and to provide efficient means for these goods. At present, however, ‘scientific technology’ occupies a bastard position in the universities, in funding, and in the public mind. It is half tied to the theoretical sciences and half treated as mere know-how for political and commercial purposes. It has no principles of its own.”

What to do about this? Re-organize the whole technological enterprise:

“To remedy this—so Karl Jaspers in Europe and Robert Hutchins in America have urged—technology must have its proper place on the faculty as a learned profession important in modern society, along with medicine, law, the humanities, and natural philosophy, learning from them and having something to teach them. As a moral philosopher, a technician should be able to criticize the programs given him to implement. As a professional in a community of learned professionals, a technologist must have a different kind of training and develop a different character than we see at present among technicians and engineers. He should know something of the social sciences, law, the fine arts, and medicine, as well as relevant natural sciences.”

Clearly this was a far more robust program than contemporary attempts to shoehorn an ethics class into engineering and computer science programs, however noble the intent. Equally clearly, it is almost impossible to imagine such a re-organization. That horizon of opportunity closed, if it was ever truly open.

Regarding the virtue of prudence, Goodman wrote,

“Prudence is foresight, caution, utility. Thus it is up to the technologists, not to regulatory agencies of the government, to provide for safety and to think about remote effects. This is what Ralph Nader is saying and Rachel Carson used to ask. An important aspect of caution is flexibility, to avoid the pyramiding catastrophe that occurs when something goes wrong in interlocking technologies, as in urban power failures. Naturally, to take responsibility for such things often requires standing up to the front office and urban politicians, and technologists must organize themselves in order to have power to do it.”

There’s much else to consider in the essay, even if only in the vein of imagining an alternative historical reality that might have unfolded. In fact, though, there is much that remains usefully suggestive should we care to take stock and reconsider how we relate to technology. That last line certainly calls to mind recent efforts by tech workers at various firms, including Google, to organize in opposition to projects they deemed immoral or unjust.

The Virgin and the Data Center

A reader passed along a link to a story in Motherboard about one of the world’s most powerful supercomputers (ranked 25th to be exact), which just happens to be housed in a deconsecrated church in Barcelona. Torre Girona Chapel is now part of the Polytechnic University of Catalonia, and it is home to the supercomputer known as MareNostrum 4. The site has been named the most beautiful data center in the world, which is an actual award handed out by Data Center Dynamics.

Here is one view of the exterior:

[Image: exterior view of the Torre Girona chapel, Barcelona Supercomputing Center]

And here is one view of the inside:

[Image: the MareNostrum 4 supercomputer housed inside the chapel, Barcelona Supercomputing Center]

As my correspondent put it, “the commentary writes itself.”

It’s tempting to see in the repurposed church an allegory of sorts: science and technology vanquishing faith and religion. It may be closer to the truth, however, to see instead something more akin to a displacement: science and technology assume the place of faith and religion. And, in this case, we might put the matter more pointedly: data and computing power assume the functions and roles once ascribed to the deity, as source of all knowledge and arbiter of truth, for example. Ian Bogost’s 2015 essay, “The Cathedral of Computation,” comes to mind, as do the dreams of immortality embodied in certain strands of transhumanism, strands which amount to what I like to think of as (post-)Christian fan fiction.

Relatedly, we might also see an apt illustration for the often forgotten entanglement of religion and technology in the western world, a story told well by the late David Noble in his classic work, The Religion of Technology.

“Modern technology and modern faith,” Noble argued, “are neither complements nor opposites, nor do they represent succeeding stages of human development. They are merged, and always have been, the technological enterprise being, at the same time, an essentially religious endeavor.”

“This is not meant in a merely metaphorical sense,” he continued, “to suggest that technology is similar to religion in that it evokes religious emotions of omnipotence, devotion, and awe, or that it has become a new (secular) religion in and of itself, with its own clerical caste, arcane rituals, and articles of faith. Rather it is meant literally and historically, to indicate that modern technology and religion have evolved together and that, as a result, the technological enterprise has been and remains suffused with religious belief.”

I have some reservations about certain details in Noble’s work, but the general thesis is sound so far as I can judge.

Finally, of course, I was reminded of the well-known passage from Henry Adams, who, in the third person, recounts his impressions of the dynamos assembled in the Palace of Electricity at the 1900 Exposition Universelle.

To Adams the dynamo became a symbol of infinity. As he grew accustomed to the great gallery of machines, he began to feel the forty-foot dynamos as a moral force, much as early Christians felt the Cross. The planet itself seemed less impressive, in its old-fashioned, deliberate, annual or daily revolution, than this huge wheel, revolving within arm’s-length at some vertiginous speed, and barely murmuring — scarcely humming an audible warning to stand a hair’s-breadth further for respect of power — while it would not wake the baby lying close against its frame. Before the end, one began to pray to it; inherited instinct taught the natural expression of man before silent and infinite force. Among the thousand symbols of ultimate energy, the dynamo was not so human as some, but it was the most expressive.

One should note, though, that data centers tend to be a bit louder than the dynamos Adams describes, although apparently some think it makes for good white noise.



Winners, Losers, and One-Eyed Prophets

The ur-text of media criticism is the section in Plato’s Phaedrus where Socrates tells the legend of Thamus, an Egyptian king, and Theuth, the god who invented writing and presented it as a gift to Thamus. In the story, Thamus surprises Theuth by failing to joyfully embrace the gift of writing. Here is a portion of that exchange:

But when it came to writing, Theuth declared, “Here is an accomplishment, my lord the King, which will improve both the wisdom and the memory of the Egyptians. I have discovered a sure receipt for memory and wisdom.” To this, Thamus replied, “Theuth, my paragon of inventors, the discoverer of an art is not the best judge of the good or harm which will accrue to those who practice it. So it is in this; you, who are the father of writing, have out of fondness for your off-spring attributed to it quite the opposite of its real function. Those who acquire it will cease to exercise their memory and become forgetful; they will rely on writing to bring things to their remembrance by external signs instead of by their own internal resources. What you have discovered is a receipt for recollection, not for memory. And as for wisdom, your pupils will have the reputation for it without the reality: they will receive a quantity of information without proper instruction, and in consequence be thought very knowledgeable when they are for the most part quite ignorant. And because they are filled with the conceit of wisdom instead of real wisdom they will be a burden to society.”

The story of Thamus and Theuth is discussed at length by Neil Postman in the opening chapter of his 1993 book, Technopoly: The Surrender of Culture to Technology. While Postman is ordinarily thought of as the sort of thinker who would be inclined to agree with Thamus, he first points out that Thamus has erred. “The error is not in his claim that writing will damage memory and create false wisdom,” Postman explains. “It is demonstrable that writing has had such an effect. Thamus’ error is in his believing that writing will be a burden to society and nothing but a burden. For all his wisdom, he fails to imagine what writing’s benefits might be, which, as we know, have been considerable.” Every technology, he adds, is both a blessing and a burden.  This much should be obvious, but it is not:

“[W]e are currently surrounded by throngs of zealous Theuths, one-eyed prophets who see only what new technologies can do and are incapable of imagining what they will undo … They gaze on technology as a lover does on his beloved, seeing it as without blemish and entertaining no apprehension for the future.”

Postman grants that there are one-eyed skeptics, too. They see only the new burdens of technology and fail to reckon with the blessings. The point, of course, is to see with both eyes wide open.

Further on in the opening chapter, Postman wrote of another principle to be learned from the judgment of Thamus:

“[N]ew technologies compete with old ones—for time, for attention, for money, for prestige, but mostly for dominance of their world-view. This competition is implicit once we acknowledge that a medium contains an ideological bias. And it is a fierce competition, as only ideological competitions can be. It is not merely a matter of tool against tool—the alphabet attacking ideographic writing, the printing press attacking the illuminated manuscript, the photograph attacking the art of painting, television attacking the printed word. When media make war against each other, it is a case of world-views in collision.”

Finally, here is Postman’s discussion of how there are always winners and losers in the game of what would later be called technological disruption. It is instructive to read this and remember that Postman is writing before the birth of the commercial internet and before digital media has come into its own. Some of it will sound a bit dated, but much of it resonates nonetheless.

“We have a similar situation in the development and spread of computer technology, for here too there are winners and losers. There can be no disputing that the computer has increased the power of large-scale organizations like the armed forces, or airline companies or banks or tax-collecting agencies. And it is equally clear that the computer is now indispensable to high level researchers in physics and other natural sciences. But to what extent has computer technology been an advantage to the masses of people? To steelworkers, vegetable-store owners, teachers, garage mechanics, musicians, bricklayers, dentists, and most of the rest into whose lives the computer now intrudes? Their private matters have been made more accessible to powerful institutions. They are more easily tracked and controlled; are subjected to more examinations; are increasingly mystified by the decisions made about them; are often reduced to mere numerical objects. They are inundated by junk mail. They are easy targets for advertising agencies and political organizations. The schools teach their children to operate computerized systems instead of teaching things that are more valuable to children. In a word, almost nothing that they need happens to the losers.”

I recommend Technopoly to you. It’s 25 years old, but holds up pretty well in my view as an accessible, well-written primer on thinking critically about technology.

Final word to Postman:

“New technologies alter the structure of our interests: the things we think about. They alter the character of our symbols: the things we think with. And they alter the nature of community: the arena in which thoughts develop. As Thamus spoke to Innis across the centuries, it is essential that we listen to their conversation, join in it, revitalize it. For something has happened in America that is strange and dangerous, and there is only a dull and even stupid awareness of what it is—in part because it has no name. I call it Technopoly.”