7 Ways to Be a Better Human Online

Most of us are familiar with some version of what’s popularly known as the Golden Rule. Here is the formulation attributed to Jesus in Matthew’s Gospel: “So whatever you wish that others would do to you, do also to them” (ESV). However familiar we might be with this moral principle in the abstract, we seem to be unaware of all the particular situations to which it could be applied.

For instance, consider this less than elegant imperative that I offer my students: read and write as you would want to be read and written to. The underlying principles are fairly straightforward: read the work of others with the same generosity of spirit with which you would want your own writing considered. Likewise, write with the same care that you would have others take when they write what they would have you read.

Along similar lines, I’ll attempt to put these principles, and a few others like them, a bit more succinctly and apply them to our online interactions. These principles are chiefly concerned with the internet as a space for public discourse, and the premise in each case is that online we are or ought to be our brother and sister’s keeper. I grant at the outset that this may all be hopelessly naive, and, more importantly, that much more could be said.

1. Read generously.

In A History of Reading, Alberto Manguel wrote, “All writing depends upon the generosity of the reader.” This can hardly be improved upon. The point is not that we should simply accept all that we read. Rather, the point is that understanding should ordinarily precede criticism, and understanding requires a measure of sympathy. If not sympathy, then at least a genuine openness of mind, a willingness to enter into the mental world of the writer. Knowing how much bias we ordinarily bring to the work of understanding, I would go so far as to say that we should strive to read our enemies as if they were our friends and our friends as if they were our enemies.

2. Write carefully.

By which I mean, quite literally, that our writing should be full of care, particularly for those who would read it. The philosopher Stephen Toulmin once observed, “The effort the writer does not put into writing, the reader has to put into reading.” Before we complain about being misunderstood, we should ask ourselves if we’ve done all we can to make ourselves understood. This is no easy task, to be sure. Communicating with nothing but these precious little marks on a screen is a precarious business. But if we are going to attempt it, let us at least do it with as much care as we can muster.

3. Cite abundantly.

Give credit where credit is due. For instance, I’ll note here that I owe my knowledge of the line by Manguel to Alan Jacobs’s fine little book, The Pleasures of Reading in an Age of Distraction. When we have learned something from others, we should do them the courtesy of acknowledging their labors.

4. Link mindfully.

If we are going to be a relay in the network, let us at least do so with a view to making our little corner of the network a bit more truthful. So much of what comes across our feeds is inaccurate, misleading, or worse. We cannot all be full-time fact checkers, of course. But it doesn’t take too much digging to verify a statistic, a historical claim, or the attribution of authorship. If something raises a red flag and you don’t have the time or desire to scrutinize the claims, then don’t pass it along, even if (especially if) it supports your “side” of things. We can all do better than embracing misinformation as we fight for our causes. Remember, too, that knowingly floating misleading half-truths is the work of serpents and witches. “Tell the truth, and shame the devil!” Shakespeare’s Hotspur declares. Let us do likewise. Link to the truth, and shame the devil!

5. Share sparingly.

I am thinking here about the claim our sharing makes upon our neighbor’s attention. Our attention is a precious and limited resource; we should guard it vigilantly and we should take some care to keep from taxing the attention of others. It is true that our action alone may not make much of a difference; this could be an ineffectual strategy. And, perhaps, it is also true that those to whom we are connected online may not even see matters as we do. Somehow, though, I think we should learn to take some responsibility for the demands we make of the attention of others.

6. Correct graciously …

… if you must. Sometimes it is enough that we simply do not perpetuate a falsehood; sometimes more may be required of us. Let me be clear about what I mean. I’m not exactly a big fan of the xkcd comics, but this old classic, “Duty Calls,” is still valuable.


Yes, someone is always wrong on the Internet, and I’m not suggesting that we go on a crusade against error. Letting it die upon reaching us is often good enough. Occasionally, however, it may be that failing to engage online is a form of morally questionable silence. Use your judgment, lay aside all self-righteousness, and proceed with as much grace as the situation allows. And, of course, let us hear the criticisms others present to us in a similar spirit, remembering that we need not always make reply. Silence in the face of evil may be a moral failure; silence before our critics is sometimes the better part of wisdom.

7. Respect, always.

Respect the privacy of others, respect their time, respect the dignity inherent in their humanity. Do not give offense needlessly. Do not be reckless with another’s reputation. Treat the other human beings with whom you interact as ends in themselves. On the other side of every book we read, I remind my students, is another human being. That they may be dead does not alter the fact. On the other side of every post, status update, avatar, and online profile is another human being.* Let us resist the temptation, then, to take them up as mere means in our projects of self-promotion and identity performance.

*Yes, bots and fake accounts complicate this picture, but use your judgment accordingly.

Presidential Debates and Social Media, or Neil Postman Was Right

I’ve chosen to take in the debates on Twitter. I’ve done so mostly in the interest of exploring what difference it might make to follow the debates on social media rather than on television.

Of course, the first thing to know is that the first televised debate, the famous 1960 Kennedy/Nixon debate, is something of a canonical case study in media studies. Most of you, I suspect, have heard at some point about how polls conducted after the debate found that those who listened on the radio were inclined to think that Nixon had gotten the better of Kennedy while those who watched the debate on television were inclined to think that Kennedy had won the day.

As it turns out, this is something like a political urban legend. At the very least, it is fair to say that the facts of the case are somewhat more complicated. Media scholar W. Joseph Campbell of American University, leaning heavily on a 1987 article by David L. Vancil and Sue D. Pendell, has shown that the evidence for viewer-listener disagreement is surprisingly scant and suspect. What little empirical evidence did point to a disparity between viewers and listeners depended on less than rigorous methodology.

Campbell, who’s written a book on media myths, is mostly interested in debunking the idea that viewer-listener disagreement was responsible for the outcome of the election. His point, well-taken, is simply that the truth of the matter is more complicated. With this we can, of course, agree. It would be a mistake, however, to write off the consequences over time of the shift in popular media. We may, for instance, take the first Clinton/Trump debate and contrast it to the Kennedy/Nixon debate and also to the famous Lincoln/Douglas debates. It would be hard to maintain that nothing has changed. But what is the cause of that change?

Does the evolution of media technology alone account for it? Probably not, if only because in the realm of human affairs we are unlikely ever to encounter singular causes. The emergence of new media itself, for instance, requires explanation, which would lead us to consider economic, scientific, and political factors. However, it would be impossible to discount how new media shape, if nothing else, the conditions under which political discourse evolves.

Not surprisingly, I turned to the late Neil Postman for some further insight. Indeed, I’ve taken of late to suggesting that the hashtag for 2016, should we want one, ought to be #NeilPostmanWasRight. This was a sentiment that I initially encountered in a fine post by Adam Elkus on the Internet culture wars. During the course of his analysis, Elkus wrote, “And at this point you accept that Neil Postman was right and that you were wrong.”

I confess that I rather agreed with Postman all along, and on another occasion I might take the time to write about how well Postman’s writing about technology holds up. Here, I’ll only cite this statement of his argument in Amusing Ourselves to Death:

“My argument is limited to saying that a major new medium changes the structure of discourse; it does so by encouraging certain uses of the intellect, by favoring certain definitions of intelligence and wisdom, and by demanding a certain kind of content–in a phrase, by creating new forms of truth-telling.”

This is the argument Postman presents in a chapter aptly titled “Media as Epistemology.” Postman went on to add, admirably, “I am no relativist in this matter, and that I believe the epistemology created by television not only is inferior to a print-based epistemology but is dangerous and absurdist.”

Let us make a couple of supporting observations in passing, neither of which is original or particularly profound. First, what is it that we remember about the televised debates prior to the age of social media? Do any of us, old enough to remember, recall anything other than an adroitly delivered one-liner? And you know exactly which ones I have in mind already. Go ahead, before reading any further, call to mind your top three debate memories. Then tell me if at least one of the following is not among them.

Reagan, when asked about his age, joking that he would not make an issue out of his opponent’s youth and inexperience.

Sen. Bentsen reminding Dan Quayle that he is no Jack Kennedy.

Admiral Stockdale, seemingly lost on stage, wondering, “Who am I? Why am I here?”

So how did we do? Did we have at least one of those in common? Here’s my point: what was memorable, and what counted as “winning” or “losing” a debate in the age of television, had precious little to do with the substance of an argument. It had everything to do with style and image. Again, I claim no great insight in saying as much. In fact, this is, I presume, conventional wisdom by now.

(By the way, Postman gets all the more credit if your favorite presidential debate memories involved an SNL cast member, say Dana Carvey, for example.)

Consider as well an example fresh from the first Clinton/Trump debate: Chuck Todd’s suggestion that Clinton had been, if anything, “over-prepared.”

You tell me what “over-prepared” could possibly mean. Moreover, you tell me if that is a charge you can even begin to imagine being leveled against Lincoln or Douglas or, for that matter, Nixon or Kennedy.

Let’s let Marshall McLuhan take a shot at explaining what Mr. Todd might possibly have meant.

I know, you’re not going to watch the whole thing. Who’s got the time? [#NeilPostmanWasRight] But if you did, you would hear McLuhan explaining why the 1976 Carter/Ford debate was an “atrocious misuse of the TV medium” and “the most stupid arrangement of any debate in the history of debating.” Chiefly, the content and the medium were mismatched. The style of debating both candidates embodied was ill-suited to what television prized: something approaching casual ease, warmth, and informality. Being unable to achieve that style meant “losing” the debate regardless of how well you knew your stuff. As McLuhan tells Tom Brokaw, “You’re assuming that what these people say is important. All that matters is that they hold that audience on their image.”

Incidentally, writing in Slate about this clip in 2011, David Haglund wrote, “What seems most incredible to me about this cultural artifact is that there was ever a time when The Today Show would spend ten uninterrupted minutes talking about the presidential debates with a media theorist.” [#NeilPostmanWasRight]

So where does this leave us? Does social media, like television, present us with what Postman calls a new epistemology? Perhaps. We keep hearing a lot of talk about post-factual politics. If that describes our political climate, and I have little reason to doubt as much, then we did not suddenly land here after the advent of social media or the Internet. Facts, or simply the truth, have been fighting a rear-guard action for some time now.

I will make one passing observation, though, about the dynamics of following a debate on Twitter. While the entertainment on offer in the era of television was the thrill of hearing the perfect zinger, social media encourages each of us to become part of the action. Reading tweet after tweet of running commentary on the debate, from left, right, and center, I was struck by the near unanimity of tone: either snark or righteous indignation. Or, better, the near unanimity of apparent intent. No one, it seems to me, was trying to persuade anybody of anything. Insofar as I could discern a motivating factor, I might suggest, on the one hand, something like catharsis, a satisfying expunging of emotions. On the other, the desire to land the zinger ourselves: to compose that perfect tweet that would suddenly go viral and garner thousands of retweets. I saw more than a few cross my timeline–some from accounts with thousands and thousands of followers and others from accounts with a meager few hundred–and I felt that it was not unlike watching someone hit the jackpot in the slot machine next to me. Just enough incentive to keep me playing.

A citizen may have attended a Lincoln/Douglas debate to be informed and also, in part, to be entertained. The consumer of the television era tuned in to a debate ostensibly to be informed, but in reality to be entertained. The prosumer of the digital age aspires to do the entertaining.


How to Think About Memory and Technology

I suppose it is the case that we derive some pleasure from imagining ourselves to be part of a beleaguered but noble minority. This may explain why a techno-enthusiast finds it necessary to attack dystopian science fiction on the grounds that it is making us all fear technology, while I find that same notion ludicrous.

Likewise, Salma Noreen closes her discussion of the internet’s effect on memory with the following counsel: “Rather than worrying about what we have lost, perhaps we need to focus on what we have gained.” I find that a curious note on which to close because I tend to think that we are not sufficiently concerned about what we have lost or what we may be losing as we steam full speed ahead into our technological futures. But perhaps I also am not immune to the consolations of belonging to an imagined beleaguered community of my own.

So which is it? Are we a society of techno-skeptics with brave, intrepid techno-enthusiasts on the fringes stiffening our resolve to embrace the happy technological future that can be ours for the taking? Or are we a society of techno-enthusiasts for whom the warnings of the few techno-skeptics are nothing more than a distant echo from an ever-receding past?

I suspect the latter is closer to the truth, but you can tell me how things look from where you’re standing.

My main concern is to look more closely at Noreen’s discussion of memory, which is a topic of abiding interest to me. “What anthropologists distinguish as ‘cultures,’” Ivan Illich wrote, “the historian of mental spaces might distinguish as different ‘memories.'” And I rather think he was right. Along similar lines, and in the early 1970s, George Steiner lamented, “The catastrophic decline of memorization in our own modern education and adult resources is one of the crucial, though as yet little understood, symptoms of an afterculture.” We’ll come back to more of what Steiner had to say a bit further on, but first let’s consider Noreen’s article.

She mentions two studies as a foil to her eventual conclusion: the first suggests that “the internet is leading to ‘digital amnesia’, where individuals are no longer able to retain information as a result of storing information on a digital device,” and the second that “relying on digital devices to remember information is impairing our own memory systems.”

“But,” Noreen counsels her readers, “before we mourn this apparent loss of memory, more recent studies suggest that we may be adapting.” And in what, exactly, does this adaptation consist? Noreen summarizes it this way: “Technology has changed the way we organise information so that we only remember details which are no longer available, and prioritise the location of information over the content itself.”

This conclusion seems to me banal, which is not to say that it is incorrect. It amounts to saying that we will not remember what we do not believe we need to remember and that, when we have outsourced our memory, we will take some care to learn how we might access it in the future.

Of course, when the Google myth dominates a society, will we believe that there is anything at all that we ought to commit to memory? The Google myth in this case is the belief that every conceivable bit of knowledge that we could ever possibly desire is just a Google search away.

The sort of analysis Noreen offers, which is not uncommon, is based on an assumption we should examine more closely and also leaves a critical consideration unaddressed.

The assumption is that there are no distinctions within the category of memory. All memories are assumed to be discrete facts of the sort one would need to know in order to do well on Jeopardy. But this assumption ignores the diversity of what we call memories and the diversity of functions to which memory is put. Here is how I framed the matter some years back:

All of this leads me to ask, What assumptions are at play that make it immediately plausible for so many to believe that we can move from internalized memory to externalized memory without remainder? It would seem, at least, that the ground was prepared by an earlier reduction of knowledge to information or data. Only when we view knowledge as the mere aggregation of discrete bits of data can we then believe that it makes little difference whether that data is stored in the mind or in a database.

We seem to be approaching knowledge as if life were a game of Jeopardy which is played well by merely being able to access trivial knowledge at random.  What is lost is the associational dimension of knowledge which constructs meaning and understanding by relating one thing to another and not merely by aggregating data.  This form of knowledge, which we might call metaphorical or analogical, allows us to experience life with the ability to “understand in light of,” to perceive through a rich store of knowledge and experience that allows us to see and make connections that richly texture and layer our experience of reality.

But this understanding of memory seems largely absent from the sorts of studies that are frequently cited in discussions of offloaded or outsourced memory. I’ll add another relevant consideration I’ve previously articulated: a silent equivocation slips into these discussions. The notion of memory we tend to assume is our current understanding of memory, derived by comparison to computer memory, which is essentially storage.

Having first identified a computer’s storage capacity as “memory,” a metaphor dependent upon the human capacity we call “memory,” we have now come to reverse the direction of the metaphor by understanding human “memory” in light of a computer’s storage capacity. In other words, we’ve reduced our understanding of memory to the mere storage of information. And now we read all discussions of memory in light of this reductive understanding.

As for the unaddressed critical consideration, if we grant that we must all outsource or externalize some of our memory, and that it may even be admittedly advantageous to do so, how do we make qualitative judgments about the memory that we can outsource to our benefit and the memory we should on principle internalize (if we even allow for the latter possibility)?

Here we might take a cue from the religious practices of Jews, Christians, and Muslims, who have long made the memorization of Scripture a central component of their respective forms of piety. Here’s a bit more from Steiner commenting on what can be known about early modern literacy:

Scriptural and, in a wider sense, religious literacy ran strong, particularly in Protestant lands. The Authorized Version and Luther’s Bible carried in their wake a rich tradition of symbolic, allusive, and syntactic awareness. Absorbed in childhood, the Book of Common Prayer, the Lutheran hymnal and psalmody cannot but have marked a broad compass of mental life with their exact, stylized articulateness and music of thought. Habits of communication and schooling, moreover, sprang directly from the concentration of memory. So much was learned and known by heart — a term beautifully apposite to the organic, inward presentness of meaning and spoken being within the individual spirit.

Learned by heart–a beautifully apt phrase, indeed. Interestingly, this is an aspect of religious practice that, while remaining relatively consistent across the transition from oral to literate society, appears to be succumbing to the pressures of the Google myth, at least among Protestants. If I have an app that lets me instantly access any passage of my sacred text, in any of a hundred different translations, why would I bother to memorize any of it?

The answer, of course, best and perhaps only learned by personal experience, is that there is a qualitative difference between the “organic, inward presentness of meaning” that Steiner describes and merely knowing that I know how to find a text if I were inclined to find it. But the Google myth, and the studies that examine it, seem to know nothing of that qualitative difference, or, at least, they choose to bracket it.

I should note in passing that much of what I have recently written about attention is also relevant here. Distraction is the natural state of someone who has no goal that might otherwise command or direct their attention. Likewise, forgetfulness is the natural state of someone who has no compelling reason to commit something to memory. At the heart of both states may be the liberated individual will yielded by modernity. Distraction and forgetfulness seem both to stem from a refusal to acknowledge an order of knowing that is outside of and independent of the solitary self. To discipline our attention and to learn something by heart is, in no small measure, to submit the self to something beyond its own whims and prerogatives.

So, then, we might say that one of the enduring consequences of new forms of externalized memory is not only that they alter the quantity of what is committed to memory but also that they reconfigure the meaning and value we assign both to the work of remembering and to what is remembered. In this way we begin to see why Illich believed that changing memories amounted to changing cultures. This is also why we should consider that Plato’s Socrates was on to something more than critics give him credit for when he criticized writing for how it would affect memory, which was for Plato much more than merely the ability to recall discrete bits of data.

This last point brings me, finally, to an excellent discussion of these matters by John Danaher. Danaher is always clear and meticulous in his writing and I commend his blog, Philosophical Disquisitions, to you. In this post, he explores the externalization of memory via a discussion of a helpful distinction offered by David Krakauer of the Santa Fe Institute. Here is Danaher’s summary of the distinction between two different types of cognitive artifacts, or artifacts we think with:

Complementary Cognitive Artifacts: These are artifacts that complement human intelligence in such a way that their use amplifies and improves our ability to perform cognitive tasks and once the user has mastered the physical artifact they can use a virtual/mental equivalent to perform the same cognitive task at a similar level of skill, e.g. an abacus.

Competitive Cognitive Artifacts: These are artifacts that amplify and improve our abilities to perform cognitive tasks when we have use of the artifact but when we take away the artifact we are no better (and possibly worse) at performing the cognitive task than we were before.

Danaher critically interacts with Krakauer’s distinction, but finds it useful. It is useful because, like Albert Borgmann’s work, it offers us concepts and categories by which we might begin to evaluate the sorts of trade-offs we must make when deciding what technologies we will use and how.

Also of interest is Danaher’s discussion of cognitive ecology. Invoking earlier work by Donald Norman, Danaher explains that “competitive cognitive artifacts don’t just replace or undermine one cognitive task. They change the cognitive ecology, i.e. the social and physical environment in which we must perform cognitive tasks.” His critical consideration of the concept of cognitive ecology brings him around to the wonderful work Evan Selinger has been doing on the problem of technological outsourcing, work that I’ve cited here on more than a few occasions. I commend to you Danaher’s post for both its content and its method. It will be more useful to you than the vast majority of commentary you might otherwise encounter on this subject.

I’ll leave you with the following observation by the filmmaker Luis Buñuel: “Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing.” Let us take some care and give some thought, then, to how our tools shape our remembering.


A Man Walks Into A Bank

I walked into my bank a few days ago and found that the lobby had a different look. The space had been rearranged to highlight a new addition: an automated teller. While I was being helped, I overheard an exchange between a customer in line behind me and a bank worker whose new role appeared to be determining which customers could be served by the automated teller and steering them toward it.

She was upbeat about the automated teller and how it would speed things up for customers. The young man talking with her posed a question that occurred to me as I listened but that I’m not sure I would have had the temerity to raise: “Aren’t you afraid that pretty soon they’re not going to need you guys anymore?”

The bank employee was entirely unperturbed, or at least she pretended to be. “No, I’m not worried about that,” she said. “I know they’re going to keep us around.”

I hope they do, but I don’t share her optimism. I was reminded of a passage from Neil Postman’s Technopoly: The Surrender of Culture to Technology. Writing in the early ’90s about the impact of television on education, Postman commented on teachers who enthusiastically embraced the transformations wrought by television. Believing the modern school system, and thus the teacher’s career, to be the product of print culture, Postman wrote,

[…] surely, there is something perverse about schoolteachers’ being enthusiastic about what is happening. Such enthusiasm always calls to my mind an image of some turn-of-the-century blacksmith who not only sings the praises of the automobile but also believes that his business will be enhanced by it. We know now that his business was not enhanced by it; it was rendered obsolete by it, as perhaps the clearheaded blacksmiths knew. What could they have done? Weep, if nothing else.

We might find it in us to weep, too, or at least acknowledge the losses, even when the gains are real and important, which they are not always. Perhaps we might also refuse a degree of personal convenience from time to time, or every time if we find it in us to do so, in order to embody principles that might at least, if nothing else, demonstrate a degree of solidarity with those who will not be the winners in the emerging digital economy.

Postman believed that computer technology created a similar situation to that of the blacksmiths, “for here too we have winners and losers.”

“There can be no disputing that the computer has increased the power of large-scale organizations like the armed forces, or airline companies or banks or tax-collecting agencies. And it is equally clear that the computer is now indispensable to high-level researchers in physics and other natural sciences. But to what extent has computer technology been an advantage to the masses of people? To steelworkers, vegetable-store owners, teachers, garage mechanics, musicians, bricklayers, dentists, and most of the rest into whose lives the computer now intrudes? Their private matters have been made more accessible to powerful institutions. They are more easily tracked and controlled; are subjected to more examinations; are increasingly mystified by the decisions made about them; are often reduced to mere numerical objects. They are inundated by junk mail. They are easy targets for advertising agencies …. In a word, almost nothing that they need happens to the losers. Which is why they are the losers.

It is to be expected that the winners will encourage the losers to be enthusiastic about computer technology. That is the way of winners … They also tell them that their lives will be conducted more efficiently. But discreetly they neglect to say from whose point of view the efficiency is warranted or what might be its costs.”

The religion of technology is a secular faith, and as such it should, at least, have the decency of striking a tragic note.

Attention and the Moral Life

I’ve continued to think about a question raised by Frank Furedi in an otherwise lackluster essay about distraction and digital devices. Furedi set out to debunk the claim that digital devices are undermining our attention and our memory. I don’t think he succeeded, but he left us with a question worth considering: “The question that is rarely posed by advocates of the distraction thesis is: what are people distracted from?”

In an earlier post, I suggested that this question can be usefully set alongside a mid-20th century observation by Hannah Arendt. Considering the advent of automation, Arendt feared “the prospect of a society of laborers without labor, that is, without the only activity left to them.” “Surely, nothing could be worse,” she added.

The connection might not have been as clear as I imagined it, so let me explain. Arendt believed that labor is the “only activity left” to the laborer because the glorification of labor in modern society had eclipsed the older ends and goods to which labor had been subordinated and for the sake of which we might have sought freedom from labor.

To put it as directly as I can, Arendt believed that if we indeed found ourselves liberated from the need to labor, we would not know what to do with ourselves. We would not know what to do with ourselves because, in the modern world, laboring had become the ordering principle of our lives.

Recalling Arendt’s fear, I wondered whether we were not in a similar situation with regards to attention. If we were able to successfully challenge the regime of digital distraction, to what would we give the attention that we would have fought so hard to achieve? Would we be like the laborers in Arendt’s analysis, finally free but without anything to do with our freedom? I wondered, as well, if it were not harder to combat distraction, if we were inclined to do so, precisely because we had no telos for the sake of which we might undertake the struggle.

Interestingly, then, while the link between Arendt’s comments about labor and the question about the purpose of attention was initially only suggestive, I soon realized the two were more closely connected. They were connected by the idea of leisure.

We tend to think of leisure merely as an occasional break from work. That is not, however, how leisure was understood in either classical or medieval culture. Josef Pieper, a Catholic philosopher and theologian, was thinking about the cultural ascendancy of labor or work and the eclipse of leisure around the same time that Arendt was articulating her fears of a society of laborers without labor. In many respects, their analyses overlap. (I should note, though, that Arendt distinguishes between labor and work in a way that Pieper does not. Work for Pieper is roughly analogous to labor in Arendt’s taxonomy.)

For her part, Arendt believed nothing could be worse than liberating laborers from labor at this stage in our cultural evolution, and this is why:

“The modern age has carried with it a theoretical glorification of labor and has resulted in a factual transformation of the whole of society into a laboring society.  The fulfillment of the wish, therefore, like the fulfillment of wishes in fairy tales, comes at a moment when it can only be self-defeating. It is a society of laborers which is about to be liberated from the fetters of labor, and this society does no longer know of those other higher and more meaningful activities for the sake of which this freedom would deserve to be won. Within this society, which is egalitarian because this is labor’s way of making men live together, there is no class left, no aristocracy of either a political or spiritual nature from which a restoration of the other capacities of man could start anew.”

To say that there is “no aristocracy of either a political or spiritual nature” is another way of saying that there is no leisured class in the older sense of the word. This older ideal of leisure did not entail freedom from labor for the sake of endless poolside lounging while sipping Coronas. It was freedom from labor for the sake of intellectual, political, moral, or spiritual aims, the achievement of which may very well require arduous discipline. We might say that it was freedom from the work of the body that made it possible for someone to take up the work of the soul or the mind. Thus Pieper can claim that leisure is “a condition of the soul.” But, we should also note, it was not necessarily a solitary endeavor, or, better, it was not an endeavor that had only the good of the individual in mind. It often involved service to the political or spiritual community.

Pieper further defines leisure as “a form of that stillness that is the necessary preparation for accepting reality; only the person who is still can hear, and whoever is not still cannot hear.” He makes clear, though, that the stillness he has in mind “is not mere soundlessness or a dead muteness; it means, rather, that the soul’s power, as real, of responding to the real – a co-respondence, eternally established in nature – has not yet descended into words.” Thus, leisure “is the disposition of receptive understanding, of contemplative beholding, and immersion – in the real.”

Pieper also claims that leisure “is only possible on the assumption that man is not only in harmony with himself, whereas idleness is rooted in the denial of this harmony, but also that he is in agreement with the world and its meaning. Leisure lives on affirmation.” The passing comment on idleness is especially useful to us.

In our view, leisure and idleness are nearly indistinguishable. But on the older view, idleness is not leisure; indeed, it is the enemy of leisure. Idleness, on the older view, may even take the shape of frenzied activity undertaken for the sake of, yes, distracting us from the absence of harmony or agreement with ourselves and the world.

We are now inevitably within the orbit of Blaise Pascal’s analysis of the restlessness of the human condition. Because we are not at peace with ourselves or our world, we crave distraction or what he called diversions. “What people want,” Pascal insists, “is not the easy peaceful life that allows us to think of our unhappy condition, nor the dangers of war, nor the burdens of office, but the agitation that takes our mind off it and diverts us.” “Nothing could be more wretched,” Pascal added, “than to be intolerably depressed as soon as one is reduced to introspection with no means of diversion.”

The novelist Walker Percy, a younger contemporary of both Arendt and Pieper, described what he called the “diverted self” as follows: “In a free and affluent society, the self is free to divert itself endlessly from itself. It works in order to enjoy the diversions that the fruit of one’s labor can purchase.” For the diverted self, Percy concluded, “The pursuit of happiness becomes the pursuit of diversion.”

If leisure is a condition of the soul, as Pieper would have it, then might we also say the same of distraction? Discrete instances of being distracted, of failing to meaningfully direct our attention, would then be symptoms of a deeper disorder. Our digital devices, in this framing of distraction, are both a material cause and an effect. The absence of digital devices would not cure us of the underlying distractedness or aimlessness, but their presence preys upon, exacerbates, and amplifies this inner distractedness.

It is hard, at this point, for me not to feel that I have been speaking in another language or at least another dialect, one whose cadences and lexical peculiarities are foreign to our own idiom and, consequently, to our way of making sense of our experience. Leisure, idleness, contemplative beholding, spiritual and political aristocracies–all of this recalls to mind Alasdair MacIntyre’s observation that we use such words in much the same way that a post-apocalyptic society, picking up the scattered pieces of the modern scientific enterprise, would use “neutrino,” “mass,” and “specific gravity”: not entirely without meaning, perhaps, but certainly not as scientists do. The language I’ve employed, likewise, is the language of an older moral vision, a moral vision that we have lost.

I’m not suggesting that we ought to seek to recover the fullness of the language or the world that gave it meaning. That would not be possible, of course. But what if we, nonetheless, desired to bring a measure of order to the condition of distraction that we might experience as an affliction? What if we sought some telos to direct and sustain our attention, to at least buffer us from the forces of distraction?

If such is the case, I commend to you Simone Weil’s reflections on attention and will. Believing that the skill of paying attention cultivated in one domain was transferable to another, Weil went so far as to claim that the cultivation of attention was the real goal of education: “Although people seem to be unaware of it today, the development of the faculty of attention forms the real object and almost the sole interest of studies.”

It was Weil who wrote, “Attention is the rarest and purest form of generosity.” A beautiful sentiment grounded in a deeply moral understanding of attention. Attention, for Weil, was not merely an intellectual asset, what we require for the sake of reading long, dense novels. Rather, for Weil, attention appears to be something foundational to the moral life:

“There is something in our soul that loathes true attention much more violently than flesh loathes fatigue. That something is much closer to evil than flesh is. That is why, every time we truly give our attention, we destroy some evil in ourselves.”

Ultimately, Weil understood attention to be a critical component of the religious life as well. “Attention, taken to its highest degree,” Weil wrote, “is the same thing as prayer. It presupposes faith and love.” “If we turn our mind toward the good,” she added, “it is impossible that little by little the whole soul will not be attracted thereto in spite of itself.” And this is because, in her view, “We have to try to cure our faults by attention and not by will.”

So here we have, if we wanted it, something to animate our desire to discipline the distracted self, something at which to direct our attention. Weil’s counsel was echoed closer to our own time by David Foster Wallace, who also located the goal of education in the cultivation of attention.

“Learning how to think really means learning how to exercise some control over how and what you think,” Wallace explained in his now famous commencement address at Kenyon College. “It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience.”

“The really important kind of freedom,” Wallace added, “involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day. That is real freedom. That is being educated, and understanding how to think.” Each day the truth of this claim impresses itself more and more deeply upon my mind and heart.

Finally, and briefly, we should be wary of imagining the work of cultivating attention as merely a matter of learning how to consciously choose what we will attend to at any given moment. That is part of it, to be sure, but Weil and Pieper both knew that attention also involved an openness to what is, a capacity to experience the world as gift. Cultivating our attention in this sense is not a matter of focusing upon an object of attention for our own reasons, however noble those may be. It is also a matter of setting to one side our projects and aspirations that we might be surprised by what is there. “We do not obtain the most precious gifts by going in search of them,” Weil wrote, “but by waiting for them.” In this way, we prepare for “some dim dazzling trick of grace,” to borrow a felicitous phrase from Walker Percy, that may illumine our minds and enliven our hearts.

It is these considerations, then, that I would offer in response to Furedi’s question, What are we distracted from?