Attention and the Moral Life

I’ve continued to think about a question raised by Frank Furedi in an otherwise lackluster essay about distraction and digital devices. Furedi set out to debunk the claim that digital devices are undermining our attention and our memory. I don’t think he succeeded, but he left us with a question worth considering: “The question that is rarely posed by advocates of the distraction thesis is: what are people distracted from?”

In an earlier post, I suggested that this question can be usefully set alongside a mid-20th century observation by Hannah Arendt. Considering the advent of automation, Arendt feared “the prospect of a society of laborers without labor, that is, without the only activity left to them.” “Surely, nothing could be worse,” she added.

The connection might not have been as clear as I imagined it, so let me explain. Arendt believed that labor is the “only activity left” to the laborer because the glorification of labor in modern society had eclipsed the older ends and goods to which labor had been subordinated and for the sake of which we might have sought freedom from labor.

To put it as directly as I can, Arendt believed that if we indeed found ourselves liberated from the need to labor, we would not know what to do with ourselves. We would not know what to do with ourselves because, in the modern world, laboring had become the ordering principle of our lives.

Recalling Arendt’s fear, I wondered whether we were not in a similar situation with regard to attention. If we were able to successfully challenge the regime of digital distraction, to what would we give the attention that we would have fought so hard to achieve? Would we be like the laborers in Arendt’s analysis, finally free but without anything to do with our freedom? I wondered, as well, if it were not harder to combat distraction, if we were inclined to do so, precisely because we had no telos for the sake of which we might undertake the struggle.

Interestingly, then, while the link between Arendt’s comments about labor and the question about the purpose of attention was initially only suggestive, I soon realized the two were more closely connected. They were connected by the idea of leisure.

We tend to think of leisure merely as an occasional break from work. That is not, however, how leisure was understood in either classical or medieval culture. Josef Pieper, a Catholic philosopher and theologian, was thinking about the cultural ascendancy of labor or work and the eclipse of leisure around the same time that Arendt was articulating her fears of a society of laborers without labor. In many respects, their analyses overlap. (I should note, though, that Arendt distinguishes between labor and work in a way that Pieper does not. Work for Pieper is roughly analogous to labor in Arendt’s taxonomy.)

For her part, Arendt believed nothing could be worse than liberating laborers from labor at this stage in our cultural evolution, and this is why:

“The modern age has carried with it a theoretical glorification of labor and has resulted in a factual transformation of the whole of society into a laboring society.  The fulfillment of the wish, therefore, like the fulfillment of wishes in fairy tales, comes at a moment when it can only be self-defeating. It is a society of laborers which is about to be liberated from the fetters of labor, and this society does no longer know of those other higher and more meaningful activities for the sake of which this freedom would deserve to be won. Within this society, which is egalitarian because this is labor’s way of making men live together, there is no class left, no aristocracy of either a political or spiritual nature from which a restoration of the other capacities of man could start anew.”

To say that there is “no aristocracy of either a political or spiritual nature” is another way of saying that there is no leisured class in the older sense of the word. This older ideal of leisure did not entail freedom from labor for the sake of endless poolside lounging while sipping Coronas. It was freedom from labor for the sake of intellectual, political, moral, or spiritual aims, the achievement of which may very well require arduous discipline. We might say that it was freedom from the work of the body that made it possible for someone to take up the work of the soul or the mind. Thus Pieper can claim that leisure is “a condition of the soul.” But, we should also note, it was not necessarily a solitary endeavor, or, better, it was not an endeavor that had only the good of the individual in mind. It often involved service to the political or spiritual community.

Pieper further defines leisure as “a form of that stillness that is the necessary preparation for accepting reality; only the person who is still can hear, and whoever is not still cannot hear.” He makes clear, though, that the stillness he has in mind “is not mere soundlessness or a dead muteness; it means, rather, that the soul’s power, as real, of responding to the real – a co-respondence, eternally established in nature – has not yet descended into words.” Thus, leisure “is the disposition of receptive understanding, of contemplative beholding, and immersion – in the real.”

Pieper also claims that leisure “is only possible on the assumption that man is not only in harmony with himself, whereas idleness is rooted in the denial of this harmony, but also that he is in agreement with the world and its meaning. Leisure lives on affirmation.” The passing comment on idleness is especially useful to us.

In our view, leisure and idleness are nearly indistinguishable. But on the older view, idleness is not leisure; indeed, it is the enemy of leisure. Idleness, on the older view, may even take the shape of frenzied activity undertaken for the sake of, yes, distracting us from the absence of harmony or agreement with ourselves and the world.

We are now inevitably within the orbit of Blaise Pascal’s analysis of the restlessness of the human condition. Because we are not at peace with ourselves or our world, we crave distraction or what he called diversions. “What people want,” Pascal insists, “is not the easy peaceful life that allows us to think of our unhappy condition, nor the dangers of war, nor the burdens of office, but the agitation that takes our mind off it and diverts us.” “Nothing could be more wretched,” Pascal added, “than to be intolerably depressed as soon as one is reduced to introspection with no means of diversion.”

The novelist Walker Percy, a younger contemporary of both Arendt and Pieper, described what he called the “diverted self” as follows: “In a free and affluent society, the self is free to divert itself endlessly from itself. It works in order to enjoy the diversions that the fruit of one’s labor can purchase.” For the diverted self, Percy concluded, “The pursuit of happiness becomes the pursuit of diversion.”

If leisure is a condition of the soul, as Pieper would have it, then might we also say the same of distraction? Discrete instances of being distracted, of failing to meaningfully direct our attention, would then be symptoms of a deeper disorder. Our digital devices, in this framing of distraction, are both a material cause and an effect. The absence of digital devices would not cure us of the underlying distractedness or aimlessness, but their presence preys upon, exacerbates, and amplifies this inner distractedness.

It is hard, at this point, for me not to feel that I have been speaking in another language or at least another dialect, one whose cadences and lexical peculiarities are foreign to our own idiom and, consequently, to our way of making sense of our experience. Leisure, idleness, contemplative beholding, spiritual and political aristocracies–all of this recalls to mind Alasdair MacIntyre’s observation that we use such words in much the same way that a post-apocalyptic society, picking up the scattered pieces of the modern scientific enterprise, would use “neutrino,” “mass,” and “specific gravity”: not entirely without meaning, perhaps, but certainly not as scientists. The language I’ve employed, likewise, is the language of an older moral vision, a moral vision that we have lost.

I’m not suggesting that we ought to seek to recover the fullness of the language or the world that gave it meaning. That would not be possible, of course. But what if we, nonetheless, desired to bring a measure of order to the condition of distraction that we might experience as an affliction? What if we sought some telos to direct and sustain our attention, to at least buffer us from the forces of distraction?

If such is the case, I commend to you Simone Weil’s reflections on attention and will. Believing that the skill of paying attention cultivated in one domain was transferable to another, Weil went so far as to claim that the cultivation of attention was the real goal of education: “Although people seem to be unaware of it today, the development of the faculty of attention forms the real object and almost the sole interest of studies.”

It was Weil who wrote, “Attention is the rarest and purest form of generosity.” A beautiful sentiment grounded in a deeply moral understanding of attention. Attention, for Weil, was not merely an intellectual asset, what we require for the sake of reading long, dense novels. Rather, for Weil, attention appears to be something foundational to the moral life:

“There is something in our soul that loathes true attention much more violently than flesh loathes fatigue. That something is much closer to evil than flesh is. That is why, every time we truly give our attention, we destroy some evil in ourselves.”

Ultimately, Weil understood attention to be a critical component of the religious life as well. “Attention, taken to its highest degree,” Weil wrote, “is the same thing as prayer. It presupposes faith and love.” “If we turn our mind toward the good,” she added, “it is impossible that little by little the whole soul will not be attracted thereto in spite of itself.” And this is because, in her view, “We have to try to cure our faults by attention and not by will.”

So here we have, if we wanted it, something to animate our desire to discipline the distracted self, something at which to direct our attention. Weil’s counsel was echoed closer to our own time by David Foster Wallace, who also located the goal of education in the cultivation of attention.

“Learning how to think really means learning how to exercise some control over how and what you think,” Wallace explained in his now famous commencement address at Kenyon College. “It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience.”

“The really important kind of freedom,” Wallace added, “involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day. That is real freedom. That is being educated, and understanding how to think.” Each day the truth of this claim impresses itself more and more deeply upon my mind and heart.

Finally, and briefly, we should be wary of imagining the work of cultivating attention as merely a matter of learning how to consciously choose what we will attend to at any given moment. That is part of it to be sure, but Weil and Pieper both knew that attention also involved an openness to what is, a capacity to experience the world as gift. Cultivating our attention in this sense is not a matter of focusing upon an object of attention for our own reasons, however noble those may be. It is also a matter of setting to one side our projects and aspirations that we might be surprised by what is there. “We do not obtain the most precious gifts by going in search of them,” Weil wrote, “but by waiting for them.” In this way, we prepare for “some dim dazzling trick of grace,” to borrow a felicitous phrase from Walker Percy, that may illumine our minds and enliven our hearts.

It is these considerations, then, that I would offer in response to Furedi’s question: what are we distracted from?

Freedom From Authenticity

Last night I listened to a recording of David Foster Wallace’s Kenyon College commencement address. I know, I know. Wallace is one of these people around whom personality cults form, and it’s hard to take those people seriously. If it helps, there’s this one guy who is really ticked at Wallace for what must have been some horrible thing Wallace did to him, like having had the temerity to be alive at the same time as he. I also know that Wallace could at times be a rather nasty human being, or so some have reported. That said, the man said some really important and true things which need to be heard again and again.

These things, as it turns out, or as I hear them now, in this particular frame of mind that I am in, have everything to do with authenticity. This is not because Wallace is talking directly about authenticity and its discontents, but because he understands, intimately it seems, what it feels like to be the sort of person for whom authenticity is likely to become a problem, and, without intending to propose a solution to this problem of authenticity, he does.

Authenticity becomes a problem the second it becomes a question. As William Deresiewicz put it, “the search for authenticity is futile. If you have to look for it, you’re not going to find it.” Authenticity, like happiness and love and probably everything that is truly significant in life, partakes of this dynamic whereby the sought-after thing can be attained only by not consciously seeking after it. Think of it, and now it is a problem; seek it, and you will not find it; focus on it, and it becomes elusive.

So authenticity is the sort of thing that vanishes the moment you become conscious of it. It’s what you have only when you’re not thinking of it. And what you’re not thinking of when you have it is yourself. Authenticity is a crisis of self provoked by a hyper-self-awareness that makes it impossible not to think of oneself. And I don’t think this is a matter of being a horribly selfish or arrogant person. No, in fact, I think this kind of hyper-self-awareness is more often than not burdened with insecurity and fear and anxiety. It’s a voice most people want to shut up, and hence the self-defeating quest for authenticity.

What does Wallace have to say about any of this? Well, first, there’s this: “Here is just one example of the total wrongness of something I tend to be automatically sure of: everything in my own immediate experience supports my deep belief that I am the absolute centre of the universe; the realest, most vivid and important person in existence. We rarely think about this sort of natural, basic self-centredness because it’s so socially repulsive.”

This is what he calls our default setting. Our default setting is to think about the world as if we were its center, to process every situation through the grid of our own experience, to assume “that my immediate needs and feelings are what should determine the world’s priorities.” This is our default setting in part because from the perspective of our own experience, the only perspective to which we have immediate access, we are literally the center of the universe.

Wallace also issued this warning: “Worship power, you will end up feeling weak and afraid, and you will need ever more power over others to numb you to your own fear. Worship your intellect, being seen as smart, you will end up feeling stupid, a fraud, always on the verge of being found out.”

So then, worship authenticity and … 

But, Wallace also tells us, it doesn’t have to be this way. The point of a liberal arts education — this is a commencement address after all — is to teach us how to exercise choice over what we think and what we pay attention to. And Wallace urges us to pay attention to something other than the monologue inside our head. Getting out of our own heads, what Wallace called our “skull-sized kingdoms” — this is the only answer to the question of authenticity.

And so this makes me think again of the possibility that certain kinds of practices might help us do just this. They can so focus our attention on themselves that we stop, for a time, paying attention to ourselves. Serendipitously, I stumbled on this video about glass-blowing in which a glass-blower is talking about his craft when he says this: “When you’re blowing glass, there really isn’t time to have your mind elsewhere – you have to be 100% engaged.” There it is.

Now, I know, we can’t all run off and take up glass blowing. That would be silly and potentially dangerous. The point is that this practice has the magical side effect of taking a person out of their own head by acutely focusing their attention. The leap I want to make now is to say that this skill is transferable. Learn the mental discipline of so focusing your attention in one particular context and you will be better able to deploy it in other circumstances.

It’s like the ascetic practice of fasting. The point is not that food is bad or that denying yourself food is somehow virtuous or meritorious. It’s about training the will and learning how to temper desire so as to direct and deploy it toward more noble ends. You train your will with food so that you can exercise it meaningfully in other, more serious contexts.

In any case, Wallace is right. It’s hard work not yielding to our default self-centeredness. “The really important kind of freedom,” Wallace explained, “involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day.” I know I’ve cited that line before and not that long ago, but the point it makes is crucial.

Freedom is not about being able to do whatever we want, when we want. It has nothing to do with listening to our heart or following our dreams or whatever else we put on greeting cards and bumper stickers. Real freedom comes from learning to get out of our “skull-sized kingdoms” long enough to pay attention to the human being next to us so that we might treat them with decency and kindness and respect. Then perhaps we’ll have our authenticity, but we’ll have it because we’ve stopped caring about it.


A transcript of Wallace’s address is available here.

Mindfulness Is Not Merely Subtraction

Mindfulness is not merely negation, subtraction, or reduction.

This was the thought that occurred to me as I read Miranda Ward’s reflections on her inadvertent break from the Internet, which concluded with the following observation:

“Why can’t we at least acknowledge that, with or without the internet, we still have to work hard, fight distraction, fight depression, and succumb, every once in awhile, to paralysing self-doubt? So it was nice, while I was on holiday, not to have any mobile phone reception. It’s also nice to be able to video chat with my 86-year-old grandmother in California. Disconnected, connected, whatever: I’m still fallible.”

Indeed, we are all fallible. If we assume that merely withdrawing from certain facets of digital life will by itself render us supremely attentive and mindful individuals, then we are certainly in for a rather disheartening disappointment.

That said, I do think the little word merely is essential. Mindfulness is more, not less, than what I’ve called attentional austerity. To put it otherwise, attentional austerity is a necessary but not sufficient condition of mindfulness. It’s not a matter of starving attention, but of training and directing it.

Ordinarily, mindfulness is a habituated response, not a spontaneous reaction. Habituated responses arise out of our practices. If our online practices undermine mindfulness, then moderating these practices becomes part of the solution.

Learning to establish and abide by certain limits is, after all, an indispensable discipline. But imposing limits for their own sake is at best unhelpful and at worst destructive. Limits, as Wendell Berry has written, are best understood as “inducements to formal elaboration and elegance, to fullness of relationship and meaning.” They are for something. 

Mindfulness must be for something. It is about fostering a certain kind of attention and learning to deploy it toward certain ends and not others. 

While doing whatever we call the Twitter equivalent of eavesdropping on an exchange centered on David Foster Wallace and the idea of mindfulness, I was reminded of Wallace’s Kenyon College commencement address in which he makes the following observation:

“The really important kind of freedom involves attention, and awareness, and discipline, and effort, and being able truly to care about other people and to sacrifice for them, over and over, in myriad petty little unsexy ways, every day. That is real freedom. The alternative is unconsciousness, the default-setting, the ‘rat race’ — the constant gnawing sense of having had and lost some infinite thing.”

Mindfulness, in Wallace’s view, is about redirecting our attention toward others; and not only toward others, but toward others as ends in themselves (to put a Kantian spin on it). This latter qualification is necessary because we very often direct our attention upon others, but only for the sake of having ourselves reflected back to us.

There are, of course, other legitimate ends toward which mindfulness may aspire. The point is this: We ought not to be for or against the Internet in itself. We ought to be for the kind of loving mindfulness Wallace advocates — to take one example — and then we ought to measure our practices, all of them, online or off, by how well they support such loving mindfulness.

Too Visible to Be Seen

The late David Foster Wallace opened his well-regarded Kenyon College commencement address of 2005 with a joke*:

“There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says, ‘Morning, boys. How’s the water?’ And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes, ‘What the hell is water?’”

The point, of course, is that we tend to lose sight of the most pervasive realities. Or, as Wallace put it, “The immediate point of the fish story is that the most obvious, ubiquitous, important realities are often the ones that are the hardest to see and talk about.” This is, as Wallace went on to say, a rather banal observation to make. And yet, it’s not. Or at least, it is an observation that we must make over and over again because, by its very nature, it slips unnoticed from consciousness.

In “The Machine Stops,” an early work of science fiction by E. M. Forster, the Machine drones on incessantly, but the noise is never noticed because it is never not present. In a very different context, C. S. Lewis wrote, “The music which is too familiar to be heard enfolds us day and night and in all ages.” Too familiar to be heard. Too familiar to be seen. Too familiar to be noticed. The most pervasive forms of visibility fade into invisibility. And so it is with all of our senses. There is a paradoxical threshold past which a sensation is too pronounced to be any longer noticed. My understanding is that Hegel made a similar observation about the invisibility of the familiar, but I don’t pretend to be conversant with Hegel.

In any case, the point is simply this: we tend to be disconcertingly unaware of the realities which most profoundly make us the sort of people we are and that give shape to our day-to-day existence.

Sociologist Arnold Gehlen divided culture into background and foreground. He understood this in terms of the choices that present themselves to us. We experience the foreground of culture as a realm in which choices are before us. The background appears to us as a realm in which choices are foreclosed. In reality, we do have choices in both cases; but the background elements of culture present themselves with such taken-for-granted force that the choice remains veiled.

In the classic example, we chose what clothes to wear this morning (foreground), but whether or not to wear clothes at all did not present itself to us as a choice (background). Again, ubiquity and pervasiveness serve to blind us. Now, putting it that way is unnecessarily pejorative. In fact, we probably couldn’t get very far as individuals or as a society if certain decisions had not moved into the background of culture.

I bring all of this up to register a corollary point regarding technology. Ubiquitous technologies that recede into the realm of shadowy familiarity are perhaps best positioned to exercise a formative influence over us precisely because we have stopped thinking about them.

So take a look around. What technologies have worked their way into the background of our lives, ever present and unnoticed? What choices do they veil? What assumptions do they engender? What patterns of life do they facilitate? What have they led us to take for granted?

These will all be difficult questions to answer — thinking about them is not unlike trying to jump over your own shadow — but we’d better try and keep trying if we’re to live well-ordered lives.


* I was reminded of this little story while reading a fine essay titled “Organic, Locally-Grown Technology.”

Teaching What It Feels Like To Be Alive

… it’s the stuff that’s about what it feels like to live. Instead of being a relief from what it feels like to live.

That is how David Foster Wallace, in Although of Course You End Up Becoming Yourself, contrasted traditional literature, with its coherent narrative and satisfying sense of closure, with experimental or avant-garde literature, which typically exhibits neither. I’ve been thinking about that contrast since I posted the passage a few weeks ago. Writing that is experienced as a relief from what it feels like to be alive and writing that reflects what it feels like to be alive: I’m wondering if that same distinction could also be usefully applied to teaching. Can teaching, in the same way, reflect what it feels like to be alive, rather than be a relief from it?

Literature and teaching are both components of the ongoing, ramshackle project we call our education. When I am most hopeful about what a teacher can do, I see it as not unlike what a very good book might also accomplish. We might describe it as the opening up of new and multiple vistas into both the world and ourselves. A good book offers a challenging engagement with reality, rather than the mere escapism that some literature proffers instead. To borrow a line from Bridge to Terabithia, good teaching, likewise, pushes students to see beyond their own secret countries, to see and to feel what lies beyond and within. Of course, on my less hopeful (read, more curmudgeonly) days, I feel that convincing students that a book can work in that way is itself the necessary task.

What, then, might it mean to teach so as to reflect what it feels like to be alive?

For one thing, it involves feeling; it is affective. It reaches beyond the transfer of information to the mind, and seeks to move the heart as well. This matters principally because while we go about the work and play of living we tend to lead with our hearts and not with our minds (for better and/or for worse).

But in order to move the heart, the heart must be susceptible to being moved. The numbness that threatens always to settle on us, as wave upon wave of stimulation washes over us, gently massaging us into a state of mildly amused indifference to reality, must be overcome. This numbness itself might be self-protective, but, while self-knowledge has a distinguished place in the history of education, self-preservation seems a less noble aspiration. Teaching that leads to feeling must find a way to break through this self-protective numbness. Of course, that numbness is itself part of what it feels like to be alive, but it is the part that must first be encountered, acknowledged, and transcended in order to feel all the rest.

Like the artist in Wallace’s view, the teacher has the license and the responsibility

to sit, clench their fists, and make themselves be excruciatingly aware of the stuff that we’re mostly aware of only on a certain level.  And that if the writer [or teacher] does his job right, what he basically does is remind the reader [or student] of how smart the reader [or student] is.

The teacher, like the writer, must themselves be sensitive to what it feels like to be alive so as to teach to that feeling and help students understand it, understand themselves.  Perhaps it is precisely here that teaching has failed students, in the inability to enter into the student’s world so as to speak meaningfully into it.

The trick, of course, is also to do so without falling into the equivalent of what Wallace calls “shitty avant garde,” literature that tries too hard and ignores the reader in its effort to be profound. Trying too hard to achieve this effect without authenticity is fatal. Likewise with teaching. Watching Lean on Me or Dead Poets Society one too many times will likely do more harm than good.

Good writing and good teaching are both grounded in a deep respect for the reader and the student, not in an inordinate desire to be inspiring. This is what finally struck me most forcefully in Wallace’s comments. His work, his estimation of what literature could do, flowed from a remarkable confidence in the reader. Perhaps, then, this is also where good teaching must begin, with an equal respect for and confidence in the student.