Superfluous People, the Ideology of Silicon Valley, and The Origins of Totalitarianism

There’s a passage from Arendt’s The Origins of Totalitarianism that has been cited frequently in recent months, and with good reason. It speaks to the idea that we are experiencing an epistemic crisis with disastrous cultural and political consequences:

The ideal subject of totalitarian rule is not the convinced Nazi or the convinced Communist, but people for whom the distinction between fact and fiction (i.e., the reality of experience) and the distinction between true and false (i.e., the standards of thought) no longer exist.

Jay Rosen recently tweeted that this was, for him, the quote of the year for 2017, and one can see why.

I would, however, suggest that there is another passage from the closing chapters of The Origins of Totalitarianism, or rather a cluster of passages, that we might also consider. These passages speak to a different danger: the creation of superfluous people.

“There is only one thing,” Arendt concludes, “that seems discernible: we may say that radical evil has emerged in connection with a system in which all men have become equally superfluous.”

“Totalitarianism strives not toward despotic rule over men,” Arendt further claims, “but toward a system in which men are superfluous.” She immediately adds, “Total power can be achieved and safeguarded only in a world of conditioned reflexes, of marionettes without the slightest trace of spontaneity.”

Superfluity, as Arendt uses the term, suggests some combination of thoughtless automatism, interchangeability, and expendability. A person is superfluous when they operate within a system in a completely predictable way and can, as a consequence, be easily replaced. Individuality is worse than meaningless in this context; it is a threat to the system and must be eradicated.

So just as the “ideal subject” of a totalitarian state is someone who has been overwhelmed by epistemic nihilism, Arendt describes the “model ‘citizen'” as the human person bereft of spontaneity: “Pavlov’s dog, the human specimen reduced to the most elementary reactions, the bundle of reactions that can always be liquidated and replaced by other bundles of reactions that behave in exactly the same way, is the model ‘citizen’ of a totalitarian state.”

Arendt adds that “such a citizen can be produced only imperfectly outside of the camps.” The camps, the “world of the dying” as Arendt calls them, are a world in which “men are taught they are superfluous through a way of life in which punishment is meted out without connection to crime, in which exploitation is practiced without profit, and where work is performed without product,” a place where “senselessness is daily produced anew.”

It may be obvious how Arendt’s claim regarding the inability to distinguish between truth and falsehood, fact and fiction, speaks to our present moment, but what does her discussion of superfluous people and concentration camps have to do with us?

First, I should make clear that I do not expect to see death camps anytime soon. That said, it seems that there are a number of developments which together tend toward rendering people superfluous. For example: the operant conditioning to which we submit on social media, the pursuit of ever more sophisticated forms of automation, and the drive to outsource more and more aspects of our humanity to digital tools.

“If we take totalitarian aspirations seriously and refuse to be misled by the common-sense assertion that they are utopian and unrealizable,” Arendt insisted, “it develops that the society of the dying established in the camps is the only form of society in which it is possible to dominate man entirely.”

I would suggest that the dark genius of our age may be the discovery of another form of society in which it is possible to dominate people entirely, a Huxleyan spin on an earlier Orwellian threat. I would also suggest that this achievement has traded on the expression of individuality rather than its suppression.

For example, social media appears to encourage the expression of individuality. In reality, it is a Skinner box: we are being programmed, and our so-called individuality is irrelevant ephemera as far as the network is concerned. In other words, people, insofar as they are considered as individuals, are, in fact, superfluous.

Regarding automation, it is, from my vantage point and given my lack of expertise, impossible to tell what the scale of its impact on employment will be. But it seems clear that there is cause for concern (unless you happen to live in Sweden). I have no reason to doubt that whatever jobs can be automated will be automated at the expense of workers, workers who will be rendered superfluous. The new jobs expected to arise will be of the micro-gig-economy or tend-the-machine sort. Work, that is to say, in which people qua individuals are superfluous.

As for the outsourcing of our cognitive, emotional, and ethical labor and our obsessive self-tracking and self-monitoring, it amounts to being sealed in a tomb of our revealed preferences (to borrow Rob Horning’s memorable line). Once more, spontaneous desire, serendipity, much of what Arendt classified as natality, the capacity to make a beginning at the heart of our individuality—all of it is surrendered to the urge for an equilibrium of programmed predictability.

___________________________________

“Over and above the senselessness of totalitarian society,” Arendt went on to observe, “is enthroned the ridiculous supersense of its ideological superstition.” As she analyzes the ideologies that supported the senselessness of totalitarian societies, discomforting similarities to strands of the Silicon Valley ideology emerge. Most notably, it seems to me, they share a blind adherence to a supposed Law driving human affairs, a Law whose observance frees a person from ordinary moral responsibility, raises the person above the unenlightened masses, and, indeed, generates a barely veiled misanthropy.

Consider the following analysis:

Totalitarian lawfulness, defying legality and pretending to establish the direct reign of justice on earth, executes the law of History [as understood by Communism] or of Nature [as understood by Nazism] without translating it into standards of right and wrong for individual behavior. It applies the law directly to mankind without bothering with the behavior of men. The law of Nature or the law of History, if properly executed, is expected to produce mankind as its end product; and this expectation lies behind the claim to global rule of all totalitarian governments.

Now substitute the “law of Technology” for the law of History and the law of Nature, and tell me if it does not work just as well. This law can be variously framed, but it amounts to a kind of self-serving, poorly conceived technological determinism, built upon some ostensible fact like Moore’s Law, which dictates that humanity as it exists must be left behind in order to accommodate this deep law ordering the flow of time.

“What totalitarian ideologies therefore aim at is not the transformation of the outside world or the revolutionizing of society, but the transformation of human nature itself,” Arendt recognized. And so it is with the transhumanist strains of the ideology of Silicon Valley.

As I write these words, an excerpt from Emily Chang’s forthcoming Brotopia: Breaking Up the Boys’ Club of Silicon Valley, published in Vanity Fair, is being shared widely on social media. It examines the “exclusive, drug-fueled, sex-laced parties” that some of the most powerful men in Silicon Valley regularly attend. The scandal is not the sexual license; indeed, that they believe their behavior to be somehow bravely unconventional and pioneering would be laughable were it not for its human toll. What is actually disturbing is how this behavior is an outworking of an ideology, and how this ideology generates so much more than drug-addled parties.

“[T]hey speak proudly about how they’re overturning traditions and paradigms in their private lives, just as they do in the technology world they rule,” Chang writes. “Their behavior at these high-end parties is an extension of the progressiveness and open-mindedness—the audacity, if you will—that make founders think they can change the world. And they believe that their entitlement to disrupt doesn’t stop at technology; it extends to society as well.”

“If this were just confined to personal lives it would be one thing,” Chang acknowledges. “But what happens at these sex parties—and in open relationships—unfortunately, doesn’t stay there. The freewheeling sex lives pursued by men in tech—from the elite down to the rank and file—have consequences for how business gets done in Silicon Valley.”

“When they look in the mirror,” Chang concludes, “they see individuals setting a new paradigm of behavior by pushing the boundaries of social mores and values.”

If you’re on the vanguard of the new humanity, social mores and values are for losers.

Arendt also gives us a useful way of framing the obsession with disruption.

“In the interpretation of totalitarianism, all laws have become laws of movement,” Arendt claims. That is to say that stability is the enemy of the execution of the law of History or of Nature or, I would add, of Technology: “Neither nature nor history is any longer the stabilizing source of authority for the actions of mortal men; they are movements themselves.”

Upsetting social norms, disrupting institutions, destabilizing legal conventions, all of it is a way of freeing up the inevitable unfolding of the law of Technology. Never mind that what is actually being freed up, of course, is the movement of wealth. The point is that the ideology gives cover for whatever depredations are executed in its name. It engenders, as Arendt argues elsewhere, a pernicious species of thoughtlessness that abets all manner of moral outrages.

“Terror,” she explained, “is the realization of the law of movement; its chief aim is to make it possible for the force of nature or of history to race freely through mankind, unhindered by any spontaneous human action. As such, terror seeks to ‘stabilize’ men in order to liberate the forces of nature or history.”

Here again I would argue that we are witnessing a Huxleyan variant of this earlier Orwellian dynamic. Consider once more the cumulative effect of the many manifestations of the networks of surveillance, monitoring, operant conditioning, automation, routinization, and programmed predictability in which we are enmeshed. Their effect is not enhanced freedom, individuality, spontaneity, thoughtfulness, or joy. Their effect is, in fact, to stabilize us into routine and predictable patterns of behavior and consumption. Humanity is stabilized so that the law of Technology can run its course.

Under these circumstances, Arendt goes on to add, “Guilt or innocence become senseless notions; ‘guilty’ is he who stands in the way of the natural or historical process which has passed judgment over ‘inferior races,’ over individuals ‘unfit to live,’ over ‘dying classes and decadent peoples.’” All the recent calls for reform of the tech industry, then, may very well fall not necessarily on deaf ears but on uncomprehending or indifferent ears tuned only to greater ideological “truths.”

___________________________________

“Totalitarian solutions may well survive the fall of totalitarian regimes in the form of strong temptations which will come up whenever it seems impossible to alleviate political, social, or economic misery in a manner worthy of man.”

Perhaps that, too, is an apt passage for our times.

From two other observations Arendt makes in the closing pages of Origins, we may gather enough light to hold off the darkness. She writes of loneliness as the “common ground for terror, the essence of totalitarian government, and for ideology and logicality, the preparation of its executioners and victims.” This loneliness is “closely connected with uprootedness and superfluousness which have been the curse of the modern masses since the beginning of the industrial revolution […].”

“Ideologies are never interested in the miracle of being,” she also observes.

Perhaps, then, we might think of the cultivation of wonder and friendship as inoculating measures, a way of sustaining the light.

The Enchanted World We Might Learn to See

In The Enchantment of Modern Life, Jane Bennett challenges the received wisdom regarding the disenchantment of modernity. She questions “whether the very characterization of the world as disenchanted ignores and then discourages affective attachment to the world.” “The question is important,” she adds, “because the mood of enchantment may be valuable for ethical life.”

I’m reading Bennett as part of my ongoing interest in the story we tell about disenchanted modernity and my hunch that we are, in fact, not so much disenchanted as differently enchanted: technologically enchanted.

Bennett believes that “the contemporary world retains the power to enchant humans and that humans can cultivate themselves so as to experience more of that effect.” “To be enchanted,” she suggests, “is to be struck and shaken by the extraordinary that lives amid the familiar and everyday.” She also relates enchantment to “moments of joy,” a joy that can “propel ethics.”

Bennett goes on to explain that enchantment, in her view, “entails a state of wonder, and one of the distinctions of this state is the temporary suspension of chronological time and bodily movement.” She further describes this experience by likening it to what Philip Fisher, in Wonder, the Rainbow, and the Aesthetics of Rare Experiences, called moments of “pure presence.”

“The moment of pure presence within wonder,” Fisher wrote,

lies in the object’s difference and uniqueness being so striking to the mind that it does not remind us of anything and we find ourselves delaying in its presence for a time in which the mind does not move on by association to something else.

Thoughts and body are “brought to rest,” Bennett elaborates,

even as the senses continue to operate, indeed, in high gear. You notice new colors, discern details previously ignored, hear extraordinary sounds, as familiar landscapes of sense sharpen and intensify. The world comes alive as a collection of singularities. Enchantment includes, then, a condition of exhilaration or acute sensory activity. To be simultaneously transfixed in wonder and transported by sense, to be both caught up and carried away—enchantment is marked by this odd combination of somatic effects.

I’m not yet sure what to make of Bennett’s overall thesis, or how it will relate to the questions in which I’m most interested, but I found this early discussion of enchantment and wonder poignant.

I do believe the world has something to offer us. How we understand that something is, of course, a contentious matter, but let us assume for a moment that the world offers something of value if only we are able to properly attend to it. The problem, it seems to me, is that we do not, in fact, ordinarily attend to the world very well.

There are certainly a variety of reasons for this state of affairs. Among Bennett’s more intriguing propositions is that buying into disenchantment talk becomes something like a self-fulfilling prophecy. This seems plausible enough. If we are talking about a peculiar kind of seeing (or hearing, etc.) and if this seeing requires a peculiar kind of attentiveness, then it makes sense that we wouldn’t bother with the attentiveness if we didn’t think there was anything to see.

I’ve suggested before that angst about digital distraction will not amount to much if we don’t also consider what, in fact, we ought to direct our attention toward. We should not, however, think about attention merely as a faculty that we discipline so that we might purposefully direct it. We do not, after all, always know what it is that we should be looking for. Somehow, then, attention must involve not only purposeful directedness, but also a purposeful openness or receptivity. In truth, it’s a matter of becoming a certain kind of person, and, as Bennett hopefully suggests, it may be possible to “cultivate” ourselves in order to do so.

Not surprisingly, I’m less than sanguine about how digital tools tend to enter into this work. It is abundantly clear that the devices, services, platforms, and apps that structure so much of our experience are more likely to erode the sort of attentiveness that Bennett and Fisher have in mind than they are to sustain and encourage it. In fact, it is increasingly clear that they were consciously designed to divide and conquer our attention with consequences that spill out into the whole of our experience.

“Enchantment is something that we encounter, that hits us,” Bennett writes, “but it is also a comportment that can be fostered through deliberate strategies.” Among those strategies, Bennett mentions three: (1) giving greater expression to the sense of play, (2) honing sensory receptivity to the marvelous specificity of things, and (3) resisting the story of the disenchantment of modernity.

We would do well to add a fourth: recovering the virtue of temperance, particularly with regard to our use of digital media.

Whether or not we speak of it as enchantment, the world before us, though it often appears cruel and bleak, nonetheless offers beauty, wonder, and joy to those with eyes to see and ears to hear. Among all that we might resolve to do and to be in the year ahead, it seems to me that we could do far worse than resolve to be better stewards of our attention, a precious resource that, well-tended, can yield sometimes modest, sometimes deeply meaningful rewards.

The Surveilled Student

Here are but two paragraphs from one installment of Audrey Watters’s year-end review of the top education technology stories.

“Helicopter parenting” – or at least parental anxiety – might not be a new phenomenon, but it is now increasingly able to enlist new technologies to monitor children’s activities. A story this summer in New York Magazine is, no doubt, an extreme example of this: “Armed with Nest Cams and 24/7 surveillance, one company promises to fix even the most dysfunctional child – for a price.” But many, many technology products boast features that allow parents to check up on what their kids are doing – what they’re reading, what they’re watching, what they’re clicking on, what they’re saying, who they’re talking to, how many steps they’re taking, where they’re at at any given moment, and so on. It’s all under the auspices, of course, of keeping kids safe.

This all dovetails quite neatly, as I noted in the article on education data, with the ways in which schools too are quite willing to surveil students. The New York Times family section cautioned in August about “The Downside of Checking Kids’ Grades Constantly.” But the expectation of many ed-tech products (and increasingly school policy) is that parents will do just this – participate in the constant monitoring of student data.

I pass this along to you for a couple of reasons. First, these paragraphs, and the links they contain, touch on an important dimension of what it means to be a parent in the digital age, a topic I’ve been thinking a lot about over the last few months.

I also pass it along in the event that you’ve not yet come across Audrey Watters’s work. She is a terrific guide to all things related to education and technology. If you are at all interested in how technology is brought to bear on the task of educating our children, and really all of us should be at least somewhat interested, then you would do well to keep up with Watters. You should definitely read “Education Technology and the New Behaviorism.”

The Shape of Our Tools, The Shape of Our Souls

When, a few weeks ago, I suggested that the dystopia is already here, just not evenly distributed, I cited a story about the strange and disturbing world of children’s YouTube. Today, you can read more about YouTubers who made hundreds of thousands of dollars creating and posting these videos until YouTube started shutting them down.

There’s one line in this story to which I’ll draw your attention. One prominent “content creator” described his method this way: “We learned to fuel it and do whatever it took to please the algorithm.”

I submit to you that this line is as good a slogan for our emerging social reality as any you’re likely to find: do whatever it takes to please the algorithm.

It reminded me of the bracingly honest and cheery slogan associated with the 1933 World’s Fair in Chicago: “Science Finds, Industry Applies, Man Conforms.” Or Carlyle’s complaint that people were becoming “mechanical in head and heart.”

It is true, of course, that we will bend to the shape of any number of external realities: natural, social, and technological.* To be human is to both shape and be shaped by the world we inhabit.

But what is the shape to which our tools and devices encourage us to conform?

Who or what do we seek to please by the way we use them?

Do they sustain our humanity or erode it?

These are the questions we do well to ask.


* It is also true that the boundaries between those categories are blurry.

A Note for Readers

A few months ago, I set up an account with Patreon, a platform that allows supporters to make monthly pledges to writers, artists, etc. I did so as part of my ongoing effort to make my way as what one might call an independent scholar. I wasn’t exactly making a killing with Patreon, but a few of you generous souls have seen fit over the last few months to pay the writer. I’ve been and remain deeply appreciative.

It turns out that recent changes to Patreon’s fee structure have made giving small amounts rather onerous for donors, creating quite the backlash from users. Given Patreon’s changes, and emails from readers about a few related frustrations with the service, I’ve decided to set up an alternative. Alongside the Patreon link, you’ll now find a link to a Paypal.me page. Unfortunately, this does not provide a way to set up recurring donations, but it is pretty easy to use and they take less from both of us, as far as I can make out. It also makes it simple to give a one-time donation if, for example, you found a particular post especially helpful or brilliant or life-changing, etc.

For now, I’ll be leaving the Patreon account open. If you’re one of the happy few who have donated through Patreon, please do feel free to suspend your donations if you find it a poor use of your resources.

Also, I’m always grateful for feedback. I sometimes envision my work here as an attempt to occupy a space somewhere between academic and popular writing on technology: a little more accessible than the former and offering a bit more depth than the latter. Let me know how I’m doing. I’m curious, too, about which sorts of posts readers find useful or what you may want to see more of (or less of). You can find my email address on the About page.

Cheers!