Fraying Into the Future

Whatever else I might say of 2018, I can at least claim to have written more. In the end-of-the-year-ICYMI spirit, here’s a quick recap.

At The New Atlantis, I published two pieces: a critical consideration of the so-called tech backlash and a review of Siva Vaidhyanathan’s Antisocial Media. At Real Life, I wrote about personal panopticons. I also reviewed Cardinal Sarah’s The Power of Silence.

I managed, as well, to keep up The Convivial Society at a mostly monthly clip.

And on this blog, which will be entering its ninth calendar year, I’ve posted more than any year since 2014.

Most read posts included The World Will Be Our Skinner Box, Technology After the Great War, and Attention and Memory in the Age of the Disciplinary Spectacle, which all got a Hacker News front page bump.

Other notables included Superfluous People, the Ideology of Silicon Valley, and The Origins of Totalitarianism; Social Media and Loneliness; Cyborg Discourse Is Useless; and Why We Can’t Have Humane Technology.

Looking forward to maintaining that momentum heading into the new year. Thanks for reading. My apologies if you’ve commented and not received a reply; I’ll do better next year … maybe. As always, your support is welcomed and appreciated; I’m currently living off my words.

Best wishes to you and yours in this new year.

Here’s one of my favorite poems and poets, “Year’s End” by Richard Wilbur, as we brave the new year together.

Now winter downs the dying of the year,
And night is all a settlement of snow;
From the soft street the rooms of houses show
A gathered light, a shapen atmosphere,
Like frozen-over lakes whose ice is thin
And still allows some stirring down within.

I’ve known the wind by water banks to shake
The late leaves down, which frozen where they fell
And held in ice as dancers in a spell
Fluttered all winter long into a lake;
Graved on the dark in gestures of descent,
They seemed their own most perfect monument.

There was perfection in the death of ferns
Which laid their fragile cheeks against the stone
A million years. Great mammoths overthrown
Composedly have made their long sojourns,
Like palaces of patience, in the gray
And changeless lands of ice. And at Pompeii

The little dog lay curled and did not rise
But slept the deeper as the ashes rose
And found the people incomplete, and froze
The random hands, the loose unready eyes
Of men expecting yet another sun
To do the shapely thing they had not done.

These sudden ends of time must give us pause.
We fray into the future, rarely wrought
Save in the tapestries of afterthought.
More time, more time. Barrages of applause
Come muffled from a buried radio.
The New-year bells are wrangling with the snow.

The Losing Game of Time

The morning after Christmas, I raised a cup of coffee, turned to my wife, and said, “Congratulations, we survived Christmas.” It was a light-hearted but not entirely factitious declaration. It did feel like “surviving,” which is not at all what it should feel like, of course. As the words were leaving my mouth, I realized that they regrettably contained more truth than they ought to. (I mean this in a strictly “first-world” sense, of course. There are too many people for whom life as survival is an all-too-real and deadly reality, and it would be obscene to compare my situation to theirs.)

But what’s to be done? It’s not just “the holidays,” rather it seems to be the nature of life as it has come to be for many of us. The problem may feel especially pointed around this time of year because, in one way or another, the contrast between how we believe the season ought to feel and how it actually does is stark to the point of despair if one dwells on it for very long.

I realize it’s a long-standing Christmas tradition to decry the commercialization of the holiday or to bemoan how it has become a consumerist wasteland, etc., etc. For example, a week or two ago (who can sort the days and data out anymore?) a story about the joys of a gift-less Christmas was making the rounds. I caught the author on NPR around the same time, too. This year’s leading entry in the genre, I suppose. Which is fine, but it somehow misses the point.

The question—what is to be done?—maybe that is the problem, or at least it seemed so to me just then as I continued to think about why it must always feel like survival. Without implying that the line between the two is hard and fixed, it may be better to ask “How are we to be?” rather than “What are we to do?” The latter implies a program of action, a method for greater efficiency or productivity, yet another layer of technique and management, a continuing effort to, in one memorable rendering of the language of Ecclesiastes, sculpt the mist: to double down, it seems, on the very things that have played no small role in generating the situation we’re trying somehow to overcome. At the very least, it seems that we should be able to ask both questions.

There are, undoubtedly, what we think of as structural factors—economic and social and, yes, technological—near the root of our harried, just-in-time way of life. And these structural factors come to a point in each of us; our orientation toward our own experience of the world is calibrated by the combined pressures generated by these structural conditions.

The obvious but obviously difficult thing to do is to somehow reorder these out-of-whack structures, but this cannot finally happen without there also being some fundamental reordering of our own orientation to the world, perhaps especially our experience of time in the world.

Essentially, this is a story about chronopolitics: the imperatives, conditions, and powers that structure the experience of time for societies and individuals. There are any number of ways to think about this. Among the more obvious is to consider how power and wealth tend to yield greater autonomy over our experience of time. But I’m interested not only in how we allot our time, thinking of time as a resource, but also in our internal experience of time, which I’ve described as “the speed at which we feel ourselves moving across the temporal dimension.”

It seems now that it might be better to put it this way: the speed at which we feel ourselves wanting to move across the temporal dimension. What generates this inner sense that we must rush, speed, dash, careen onward through our days, weeks, months, years? What makes it seem as if we are only just “surviving” each day? Perhaps it is some prior imperative to master time, a game at which we can only ever lose. And one that was both suggested and sustained by tools to measure, divide, save, freeze, and otherwise control time. As Lewis Mumford famously observed, “the clock, not the steam-engine, is the key machine of the modern industrial age.”

In his study of place, philosopher Edward Casey first considered the triumph of time in the modern world: “‘Time will tell’: so we say, and so we believe, in the modern era. This era, extending from Galileo and Descartes to Husserl and Heidegger, belongs to time, the time when Time came into its own.”

“Scheduled and overscheduled,” he added, “we look to the clock or the calendar for guidance and solace, even judgment! But such time-telling offers precious little guidance, no solace whatsoever, and a predominantly negative judgment (‘it’s too late now’) … We are lost because our conviction that time, not only the world’s time but our time, the only time we have, is always running down.”

More from Casey:

“The pandemic obsession with time from which so many moderns have suffered — and from which so many postmoderns still suffer — is exacerbated by the vertiginous sense that time and the world-order, together constituting the terra firma of modernist solidity, are subject to dissolution. Not surprisingly, we objectify time and pay handsome rewards … to those who can tie time down in improved chronometry. Although the modern period has succeeded brilliantly in this very regard, it has also fallen into the schizoid state of having made objective, as clock-time and world-time, what is in fact most diaphanous and ephemeral, most ‘obscure’ in human experience. We end by obsessing about what is no object at all. We feel obligated to tell time in an objective manner; but in fact we have only obliged ourselves to do so by our own sub rosa subreptions, becoming thereby our own pawns in the losing game of time.”


If you’ve appreciated what you’ve read, consider supporting or tipping the writer.

Just Delete It

Facebook’s had an awful year. Frankly, it would be tedious to recount the details. If you are so inclined, you can read Wired’s summary of “the 21 (and counting) biggest Facebook Scandals of 2018.”

The latest round of bad news for Facebook came when the Times reported that the platform granted over 150 companies special access to user data, apparently circumventing stated privacy guidelines and giving the lie once again to their purported desire to give users complete control over their data.

I’m not sure whether these stories even register anymore with those outside the tech journalism/tech criticism/privacy scholarship world. I do see, however, through my small Twitter-sized window into that world, that this story has occasioned another round of debate about the efficacy of deleting Facebook.

More than one tech journalist has retweeted a link to Siva Vaidhyanathan’s Times op-ed from March of this year urging us not to delete Facebook but, rather, to do something about it. To which claim my inner monologue immediately rejoins: quitting is doing something. Moreover, how difficult is it to imagine quitting Facebook and doing something about it in the way that these kinds of arguments suggest? You know, quit and call your representative or whatever.

The underlying idea here seems to be that there is no future world where Facebook doesn’t exist. We must stay on because leaving is a luxury and we would be abandoning all of those who do not have such a luxury. This assumes that the platform can sustain good faith efforts to speak and defend the truth, etc. That seems, at best, wildly optimistic.

Alternatively, the assumption may also be that what Facebook provides is, generally speaking, a good thing, but that, unfortunately, this “good” service is provided by a “bad” company. The point, then, is to somehow preserve the service while making the company better by means to which companies tend to be more responsive (regulation, lawsuits, etc.).

I’ve written a bit about Facebook over the years—too much, frankly. At just this moment, I’m rather annoyed that I’ve given Facebook as much attention as I have. This summer I wrote a review of Vaidhyanathan’s fine book on Facebook for The New Atlantis, and I commented here on the #DeleteFacebook debate from earlier this year.

In the review, I argued that we should consider the possibility that the service Facebook provides, even if it were delivered by a “good” company, would still be an individually and politically debilitating reality. In the post, I tried to make the case that whatever other action is taken with regard to Facebook, the choice of some to delete their accounts should not be derided or discouraged.

Maybe the lesson we are to take from the last two years is not simply that surveillance capitalism is bad news but also that the kind of ubiquitous connectivity upon which it is built is also bad news. This, it seems, is somehow unthinkable to us. To some damning degree, we seem to agree with Zuckerberg’s ideology of connection, most stridently articulated in the infamous Bosworth memo (also this year!). We’ve bought into the idea that digitally connecting people is somehow an unalloyed and innocent good.

This recalls Alan Jacobs’ point from a couple of years back: “So there is a relationship between distraction and addiction, but we are not addicted to devices. As Brooke’s Snapchat story demonstrates, we are addicted to one another, to the affirmation of our value—our very being—that comes from other human beings. We are addicted to being validated by our peers.” (Although, as we now understand more clearly, the design of the devices and platforms is not irrelevant either.)

I would also suggest that the very stickiness of the platform, the very way it leads us to generate nuanced and dubious arguments as a rationale for remaining, should by itself impel us to cut our ties.

So, look, just delete it. At the very least, let’s not give anyone grief for doing so. They are doing something. And I’m not convinced that what they’re doing isn’t, in fact, the most efficacious action we can take. They are willing to believe what we should all consider more seriously, that we can make do just fine in a world without Facebook.



A Withering Light

I read a tweet the other day from a writer, a very good writer, introducing a thread with links to their work throughout the year. The tweet was all irony and self-deprecation, and I thought to myself, this is how the world ends.

More precisely, this is how a world keeps from coming into being. I don’t know when or where I first encountered the line from Gramsci about how the old is dying and the new cannot be born, but it has remained with me ever since as one of those fragments that somehow manages to illuminate the present. I think of it often.

The old is dying. The new cannot be born. The morbid symptoms are all around us. The new cannot be born because there is no darkness within which it can take shape and grow. As Arendt observed, “Everything that lives, not vegetative life alone, emerges from darkness and, however strong its natural tendency to thrust itself into the light, it nevertheless needs the security of darkness to grow at all.”

There is no darkness within which a world can come into being. All is bathed in the light of reflexive self-consciousness. “If you tell people how they can sublimate,” the psychologist Harry Stack Sullivan once observed, “they can’t sublimate.” “Life needs the protection of nonawareness,” Guardini writes. “There is no path to take us outside of ourselves.” “Our action is constantly interrupted by reflection on it,” he adds. “Thus all our life bears the distinctive character of what is interrupted, broken. It does not have the great line that is sure of itself, the confident movement deriving from the self.”

But of what are we conscious when our self-consciousness is mediated by the internet? The knowledge we gain is not knowledge of the self, which has always been elusive. “I have become a question to myself,” St. Augustine declares. It is rather knowledge of the self playing itself to an audience of indeterminate size and presence. Or, to put it another way, we do not become aware of ourselves, we become aware of the relation of the self to itself. Our piety does not consist, as Socrates and Euthyphro contemplated, of a therapy of the gods but rather of a therapy of the self, and this piety is sustained by some of our most sophisticated tools.

Paradoxically, the self is, in some important respect, a gift of the other before whom it appears; it is received rather than made. We come to know ourselves as we become aware of who we are not, through our attentiveness to others. Moreover, insofar as the self is constituted by its desires, it is, like our desires, an intersubjective reality. When the other before whom we appear is the amorphous audience mediated to us by tools designed to induce the self into its own self-management, then the self is positioned in such a way that it cannot receive itself as gift.

The trajectory is long—writing heightens consciousness, Ong observed—but the internet has brought us through a critical threshold. It has been, among much else, a machine for the generation of obsessive self-documentation and self-awareness at both a global and a personal scale. It brings a withering light to bear upon our inner lives and suffuses all our public acts with doubt, half-heartedness, and hesitation. How can it be otherwise? The self cannot sustain itself. Or else it fuels our public acts with a petty or violent rage, as if the only way we could prove to ourselves that we are real is by the measure of the suffering we inflict upon ourselves or others (this is the lesson of the Underground Man). Or, as Yeats put it, “the best lack all conviction, while the worst are full of passionate intensity.” He speaks as well, interestingly enough, of belated birth, of the rough beast slouching toward Bethlehem to be born.



Digital Media and Our Experience of Time

Early in the life of this site, which is to say about eight years ago, I commented briefly on a story about five scientists who embarked on a rafting trip down the San Juan River in southern Utah in an effort to understand how “heavy use of digital devices” was affecting us. The trip was the brainchild of a professor of psychology at the University of Utah, David Strayer, and Matt Richtel wrote about it for the Times.

I remember this story chiefly because it introduced me to what Strayer called “third day syndrome,” the tendency, after about three days of being “unplugged,” to find oneself more at ease, more calm, more focused, and more rested. Over the past few years, Strayer’s observation has been reinforced by my own experience and by similar unsolicited accounts I’ve heard from several others.

As I noted back in 2010, the rafting trip led one of the scientists to wonder out loud: “If we can find out that people are walking around fatigued and not realizing their cognitive potential … What can we do to get us back to our full potential?”

I’m sure the idea that we are walking around fatigued will strike most as entirely plausible. That we’re not realizing our full cognitive potential, well, yes, that resonates pretty well, too. But, that’s not what mostly concerns me at the moment.

What mostly concerns me has more to do with what I’d call the internalized pace at which we experience the world. I’m not sure that’s the most elegant formulation for what I’m trying to get at. I have in mind something like our inner experience of time, but that’s not exactly right either. It’s more like the speed at which we feel ourselves moving across the temporal dimension.

Perhaps the best metaphor I can think of is that of walking on those long motorized walkways we might find at an airport, for example. If you’re not in a hurry, maybe you stand while the walkway carries you along. If you are in a hurry or don’t like standing, you walk, briskly perhaps. Then you step off at the end of the walkway and find yourself awkwardly hurtling forward for a few steps before you resume a more standard gait.

So that’s our experience of time, no? Within ordinary time, modernity has built the walkways, and we jump on and hurry ourselves along. Then, for whatever reason, we get dumped out into ordinary time and our first experience is to find ourselves disoriented and somehow still feeling ourselves propelled forward by some internalized temporal inertia.

I feel this most pronouncedly when I take my two toddlers out to the park, usually in the late afternoons after a day of work. I delight in the time; it is a joy and not at all a chore. Yet I frequently find something inside me rushing me along. I’ve entered into ordinary time along with two others who’ve never known any other kind of time, but I’ve been dumped into it after running along the walkway all day, day in and day out, for weeks and months and years. On the worst days, it takes a significant effort of the will to overcome the inner temporal inertia, and that only for a few moments at a time.

This state of affairs is not entirely novel. As Hartmut Rosa notes in Social Acceleration, Georg Simmel in 1897 had already remarked on how “one often hears about the ‘pace of life,’ that it varies in different historical epochs, in the regions of the contemporary world, even in the same country and among individuals of the same social circle.”

Then I recall, too, Patrick Leigh Fermor’s experience boarding at a monastery in the late 1950s. Fermor relates the experience in A Time to Keep Silence:

“The most remarkable preliminary symptoms were the variations of my need of sleep. After initial spells of insomnia, nightmare and falling asleep by day, I found that my capacity for sleep was becoming more and more remarkable: till the hours I spent in or on my bed vastly outnumbered the hours I spent awake; and my sleep was so profound that I might have been under the influence of some hypnotic drug. For two days, meals and the offices in the church — Mass, Vespers and Compline — were almost my only lucid moments. Then began an extraordinary transformation: this extreme lassitude dwindled to nothing; night shrank to five hours of light, dreamless and perfect sleep, followed by awakenings full of energy and limpid freshness. The explanation is simple enough: the desire for talk, movements and nervous expression that I had transported from Paris found, in this silent place, no response or foil, evoked no single echo; after miserably gesticulating for a while in a vacuum, it languished and finally died for lack of any stimulus or nourishment. Then the tremendous accumulation of tiredness, which must be the common property of all our contemporaries, broke loose and swamped everything. No demands, once I had emerged from that flood of sleep, were made upon my nervous energy: there were no automatic drains, such as conversation at meals, small talk, catching trains, or the hundred anxious trivialities that poison everyday life. Even the major causes of guilt and anxiety had slid away into some distant limbo and not only failed to emerge in the small hours as tormentors but appeared to have lost their dragonish validity.”

I read this, in part, as the description of someone who was re-entering ordinary time, someone whose own internal pacing was being resynchronized with ordinary time.

So I don’t think digital media has created this phenomenon, but I do think it has been a powerful accelerating agent. How one experiences time is a complex matter, and I’m not Henri Bergson, but one way to account for the experience of time that I’m trying to describe here is to consider the frequency with which we encounter certain kinds of stimuli, the sort of stimuli that assault our attention and redirect it, the kinds of stimuli digital media is most adept at delivering. I suppose the normal English word for what I’ve just awkwardly described is distraction. Having become accustomed to a certain high-frequency rate of distraction, our inner temporal drive runs at a commensurate speed. The temporal inertia I’ve been trying to describe, then, may also be characterized as a withdrawal symptom once we’re deprived of the stimuli or their rate dramatically decreases. The total effect is both cognitive and affective: the product of distraction is distractedness but also agitation and anxiety.

Along these same lines, then, we might say that our experience of time is also structured by desire. Of course, we knew this from the time we were children: the more you await something, the slower time seems to pass. Deprived of stimuli, we crave them and so grow impatient. We find ourselves in a “frenetic standstill,” to borrow a phrase from Paul Virilio. In this state, we are unable to attend to others or to the world as we should, so the temporal disorder is revealed to have moral as well as cognitive and affective dimensions.

It’s worth mentioning, too, how digital tools theoretically allow us to plan our days and months in fine-grained detail but how they have also allowed us to forgo planning. Constant accessibility means that we don’t have to structure our days or weeks ahead of time. We can fiddle with plans right up to the last second, and frequently do so. The fact that the speed of commerce and communication has dramatically increased also means that I have less reason to plan ahead and am more likely to make just-in-time purchases and just-in-time arrangements. Consequently, our experience of time amounts to the experience of frequently repeated mini-dashes to beat a deadline, and there are so many deadlines.

“From the perspective of everyday life in modern societies,” Hartmut Rosa writes, “as everyone knows from their own experience, time appears as fundamentally paradoxical insofar as it is saved in ever greater quantities through the ever more refined deployment of modern technology and organizational planning in almost all everyday practices, while it does not lose its scarce character at all.” Rosa cites two researchers who conclude, “American society is starving,” not for food, of course, “but for the ultimate scarcity of the postmodern world, time.” “Starving for time,” they add, “does not result in death, but rather, as ancient Athenian philosophers observed, in never beginning to live.”

