Notes Toward Understanding Our Technologically Enchanted World

Three years ago, reflecting on developments in the realm of “smart” technology, I suggested that it might be best to understand modernity not as a disenchanted realm but rather as an alternatively enchanted realm. I’ve continued to think on and off about this claim since then, and I remain convinced of its usefulness. I’ll be posting about it occasionally in the coming weeks, sometimes just to present a few relevant excerpts or notes on the topic. Below are some selections from Lee Worth Bailey’s The Enchantments of Technology. Bailey develops the idea of enchantment in a way that is useful, although I’ll ultimately take the term in slightly different directions. For Bailey, technologies are enchanted insomuch as they cannot be understood apart from acknowledged and unacknowledged human desires, passions, aspirations, etc.

“Enchantments,” in Bailey’s understanding, “are common, ever-present factors of consciousness, whether mild or strong, denied or obvious, positive or negative.” He goes on to add, “Enchantments introduce certain meanings into cultural life that take on a serious, rational tone but have a deep undercurrent of emotional and imaginative power.”

“Just below the surface, apparently ‘pure’ rationality is in bed with enchantments.”

“When we examine enchantments we go deeper still, into the unconscious depths that shape our motives, values, and decisions in the dark basement of the soul. Then we see that our machinery is not only a utilitarian necessity, or an autonomous realm of deterministic forces, but rather enchanted technologies designed to slake our endless thirst for speed, comfort, pleasure, power, and even transcendence.”

Max Weber, quoted by Bailey: “The fate of our times is characterized by rationalization and intellectualization and, above all, by the ‘disenchantment of the world.’ Precisely the ultimate and most sublime values have retreated from public life either into the transcendental realm or mystic life or into the brotherliness of direct and personal human relation.” Note that Weber does not here characterize disenchantment merely as a matter of subtraction or deletion. Rather, it is a matter of retreat or migration, specifically out of the public realm into varieties of private experience.

Apparent disenchantment “is a strong surface phenomenon, and many valuable benefits have come out of it. But underneath surges a vast sea of unacknowledged, influential desires, passions, and quests for spirituality.”

“Technology does not inhabit a neutral world of pure space, time, causation, and reason. Rather, technology’s lifeworld is imbued with imagination, purpose, ethics, motivation, and meaning.”

“How many soldiers using gunpowder against opponents with spears resisted the desire to feel absolutely powerful?”

Next in the series: Technological Enchantments and the End of Modernity and Notes Toward An Understanding of Our Technologically Enchanted World, 2.


If you’ve appreciated what you’ve read, consider supporting the writer.

The Technological Origins of Protestantism, or the Martin Luther Tech Myth

[Caveat lector: the first part of the title is a bit too grandiose for what follows. Also, this post addresses the relationship between technology and religion, more specifically, the relationship between technology and Protestant Christianity. This may narrow the audience, but I suspect there is something of interest here for most readers. Finally, big generalizations ahead. Carry on.]

This year marks the 500th anniversary of the start of the Protestant Reformation. The traditional date marking the beginning of the Reformation is October 31, 1517. It was on that day, All Hallows’ (or All Saints’) Eve, that Martin Luther posted his famous Ninety-five Theses on a church door in Wittenberg. It is fair to say that no one then present, including Luther, had any idea of the magnitude of what was to follow.

Owing to the anniversary, you might encounter a slew of new books about Luther, the Reformation(s), and its consequences. You might stumble across a commemorative Martin Luther Playmobil set. You might even learn about a church in Wittenberg which has deployed a robot named … wait for it … BlessU-2 to dispense blessings and Bible verses to visitors (free of charge, Luther would have been glad to know).

Then, of course, there are the essays and articles in popular journals and websites, and, inevitably, the clever takes that link Luther to contemporary events. Take, for example, this piece in Foreign Policy arguing that Luther was the Donald Trump of 1517. The subtitle goes on to claim that “if the leader of the reformation could have tweeted the 95 theses, he would have.” I’ll get back to the subtitle in just a moment, but, let’s be clear, the comparison is ultimately absurd. Sure, there are some very superficial parallels one might draw, but even the author of the article understands their superficiality. Throughout the essay, he walks back and qualifies his claims. “But in the end Luther was a man of conscience,” he concedes, and that pretty much undermines the whole case.

But back to that line about tweeting the 95 theses. It is perhaps the most plausible claim in the whole piece, but, oddly, the author never elaborates further. I say that it is plausible not only because the theses are relatively short statements – roughly half of them could actually be tweeted (in their English translation, anyway) – and one might say that in their day they went viral, but also because it trades on an influential myth that continues to inform how many Protestants view technology.

The myth, briefly stated in intentionally anachronistic terms, runs something like this. Martin Luther’s success was owed to his visionary embrace of a cutting-edge media technology, the printing press. While the Catholic church reacted with a moral panic about the religious and social consequences of easily accessible information and their inability to control it, Luther and his followers understood that information wanted to be free and institutions needed to be disrupted. And history testifies to the rightness of Luther’s attitude toward new technology.

In calling this story a myth, I don’t mean to suggest that it is altogether untrue. While the full account is more complicated, it is nonetheless true that Luther did embrace printing and appears to have understood its power. Indeed, under Luther’s auspices Wittenberg, an otherwise unremarkable university town, became one of the leading centers of printing in Europe. A good account of these matters can be found in Andrew Pettegree’s Brand Luther. “After Luther, print and public communication would never be the same again,” Pettegree rightly concludes. And it is probably safe to also conclude that apart from printing the Reformation does not happen.

Instead, I use the word myth to mean a story, particularly a story of origins, which takes on a powerful explanatory and normative role in the life of a tradition or community. It is in this sense that we might speak of the Luther Tech Myth.

The problem with this myth is simple: it sanctions, indeed encourages, uncritical and unreflective adoption of technology. I might add that it also heightens the plausibility of Borg Complex claims: “churches* that do not adopt and adapt to new media will not survive,” etc.

For those who subscribe to the myth, intentionally or tacitly, this is not really a problem because the myth sustains and is sustained by certain unspoken premises regarding the nature of technology, particularly media technology: chiefly, that it is fundamentally neutral. They imagine that new media merely propagate the same message, only more effectively. It rarely occurs to them that new media may transform the message in subtle but not inconsequential ways, that new media may smuggle another sort of message (or effect) along with it, and that these may reconfigure the nature of the community, the practices of piety, and the content of the faith in ways they did not anticipate.

Let’s get back to Luther for a moment and take a closer look at the relationship between printing and Protestantism.

In The Reformation: A History, Oxford historian Diarmaid MacCulloch makes some instructive observations about printing. What is most notable about MacCulloch’s discussion is that it deals with the preparatory effect of printing in the years leading up to 1517. For example, citing historian Bernard Cottret, MacCulloch notes that “the increase in Bibles [in the half century prior to 1517] created the Reformation rather than being created by it.” A thesis that will certainly surprise many Protestants today, if there are any left. (More on that last, seemingly absurd clause shortly.)

A little further on, MacCulloch correctly observes that the “effect of printing was more profound than simply making more books available more quickly.” For one thing, it “affected western Europe’s assumptions about knowledge and originality of thought.” Manuscript culture is “conscious of the fragility of knowledge, and the need to preserve it,” fostering “an attitude that guards rather than spreads knowledge.” Manuscript culture is thus cautious, conservative, and pessimistic. On the other hand, the propensity toward decay is “much less obvious in the print medium: Optimism may be the mood rather than pessimism.” (A point on which MacCulloch cites the pioneering work of Elizabeth Eisenstein.) In other words, printing fostered a more daring cultural spirit that was conducive to the outbreak of a revolutionary movement of reform.

Finally, printing had already made it possible for reading to become “a more prominent part of religion for the laity.” Again, MacCulloch is not talking about the consequences of the Reformation; he is talking about the half century or so leading up to Luther’s break with Rome. Where reading became a more prominent feature of personal piety, “a more inward-looking, personalized devotion,” which is to say, anachronistically, a more characteristically Protestant devotion, emerged. “For someone who really delighted in reading,” MacCulloch adds, “religion might retreat out of the sphere of public ritual into the world of the mind and the imagination.”

“So,” MacCulloch concludes, “without any hint of doctrinal deviation, a new style of piety arose in that increasingly large section of society that valued book learning for both profit and pleasure.” This increasingly large section of the population “would form a ready audience for the Protestant message, with its contempt for so much of the old ritual of worship and devotion.”

All of this, then, is to say that Protestantism is as much an effect of the technology of printing as it is a movement that seized upon the new technology to spread its message. (I suspect, as an aside, that this story, which is obviously more complicated than the sketch I’m providing here, would be an important element in Alan Jacobs’ project of uncovering the technological history of modernity.)

A few more thoughts before we wrap up; bear with me. Let’s consider the idea of “a new style of piety,” which preceded and sustained momentous doctrinal and ecclesial developments. This phrase is useful insomuch as it pairs nicely with the old maxim Lex orandi, lex credendi (the law of prayer is the law of belief). The idea is that as the church worships so it believes, or that in some sense worship precedes and constitutes belief. To put it another way, we might say that the worship of the church constitutes the plausibility structures of its faith. To speak of a “new style of piety,” then, is to speak of a set of practices for worship, both in its communal forms and in its private forms. These new practices are, accordingly, a new form of worship that may potentially reconfigure the church’s faith. This is important to our discussion insofar as practices of worship have a critical material/technological dimension. Bottom line: shifts in the material/technological artifacts and conditions of worship potentially restructure the form and practices of worship, which in turn may reconfigure what is believed.

Of course, it is not only a matter of how print prepares the ground for Protestantism, it is also a matter of how Protestantism evolves in tandem with print. Protestantism is a religion of the book. Its piety is centered on the book; the sacred text, of course, but also the tide of books that become aids to spirituality, displacing icons, crucifixes, statues, relics, and the panoply of ritual gestures that enlisted the body in the service of spiritual formation. The pastor-scholar becomes the model minister. Faith becomes both a more individual affair and a more private matter. On the whole, it takes on a more intellectualist cast. Its devotion is centered more on correct belief rather than veneration. Its instruction is traditionally catechetical. Etc.

This brings us back to the Luther Tech Myth and whether or not there are any Protestants left. The myth is misleading because it oversimplifies a more complicated history, and the oversimplification obscures the degree to which new media technology is not neutral but rather formative.

Henry Jenkins has made an observation that I come back to frequently: “I often tell students that the history of new media has been shaped again and again by four key innovative groups — evangelists, pornographers, advertisers, and politicians, each of whom is constantly looking for new ways to interface with their public.”

The evangelists Jenkins refers to are evangelical Christians in the United States, who are descended from Luther and his fellow reformers. Jenkins is right. Evangelicals have been, as a rule, quick to adopt and adapt new media technologies to spread their message. In doing so, however, they have also been transformed by the tools they have implemented and deployed, from radio to television to the Internet. The reason for this is simple: new styles of piety that arise from new media generate new assumptions about community and authority and charisma (in the theological and sociological sense), and they alter the status and content of belief.

And for this reason traditional Protestantism is an endangered species. Even within theologically conservative branches of American Protestantism, it is rare to find the practice of traditional forms of Protestant piety. Naturally, this should not necessarily be read as a lament. It is, rather, an argument about the consequences of technological change and an encouragement to think more carefully about the adoption and implementation of new technology.

 

______________________________________________________

*  I hesitate to add mosques and synagogues only because I do not believe myself to be sufficiently informed to do so and also because they are obviously not within the traditions shaped by the life and work of Martin Luther. Jewish and Muslim readers, please feel free to add your perspectives about attitudes to technology in your communities in the comments below.



A Form of Madness

Bertrand Russell (A History of Western Philosophy) adumbrating Ellul on technology:

Meanwhile science as technique was building up in practical men a quite different outlook from any that was to be found among theoretical philosophers. Technique conferred a sense of power: man is now much less at the mercy of his environment than he was in former times. But the power conferred by technique is social, not individual; an average individual wrecked on a desert island could have achieved more in the seventeenth century than he could now. Scientific technique requires the cooperation of a large number of individuals organized under a single direction. Its tendency, therefore, is against anarchism and even individualism, since it demands a well-knit social structure. Unlike religion, it is ethically neutral: it assures men that they can perform wonders but does not tell them what wonders to perform. In this way it is incomplete. In practice, the purposes to which scientific skill will be devoted depend largely on chance. The men at the head of the vast organizations which it necessitates can, within limits, turn it this way or that as they please. The power impulse thus has a scope which it never had before. The philosophies that have been inspired by scientific technique are power philosophies, and tend to regard everything non-human as mere raw material. Ends are no longer considered; only the skillfulness of the process is valued. This also is a form of madness. It is, in our day, the most dangerous form, and the one against which a sane philosophy should provide an antidote.

At least one quibble: technology/technique is not ethically neutral, in part precisely because it is not unlike religion.

How to Think About Memory and Technology

I suppose it is the case that we derive some pleasure from imagining ourselves to be part of a beleaguered but noble minority. This may explain why a techno-enthusiast finds it necessary to attack dystopian science fiction on the grounds that it is making us all fear technology, while I find that same notion ludicrous.

Likewise, Salma Noreen closes her discussion of the internet’s effect on memory with the following counsel: “Rather than worrying about what we have lost, perhaps we need to focus on what we have gained.” I find that a curious note on which to close because I tend to think that we are not sufficiently concerned about what we have lost or what we may be losing as we steam full speed ahead into our technological futures. But perhaps I also am not immune to the consolations of belonging to an imagined beleaguered community of my own.

So which is it? Are we a society of techno-skeptics with brave, intrepid techno-enthusiasts on the fringes stiffening our resolve to embrace the happy technological future that can be ours for the taking? Or are we a society of techno-enthusiasts for whom the warnings of the few techno-skeptics are nothing more than a distant echo from an ever-receding past?

I suspect the latter is closer to the truth, but you can tell me how things look from where you’re standing.

My main concern is to look more closely at Noreen’s discussion of memory, which is a topic of abiding interest to me. “What anthropologists distinguish as ‘cultures,’” Ivan Illich wrote, “the historian of mental spaces might distinguish as different ‘memories.'” And I rather think he was right. Along similar lines, and in the early 1970s, George Steiner lamented, “The catastrophic decline of memorization in our own modern education and adult resources is one of the crucial, though as yet little understood, symptoms of an afterculture.” We’ll come back to more of what Steiner had to say a bit further on, but first let’s consider Noreen’s article.

She mentions two studies as a foil to her eventual conclusion: the first suggests that “the internet is leading to ‘digital amnesia’, where individuals are no longer able to retain information as a result of storing information on a digital device,” and the other “that relying on digital devices to remember information is impairing our own memory systems.”

“But,” Noreen counsels her readers, “before we mourn this apparent loss of memory, more recent studies suggest that we may be adapting.” And in what, exactly, does this adaptation consist? Noreen summarizes it this way: “Technology has changed the way we organise information so that we only remember details which are no longer available, and prioritise the location of information over the content itself.”

This conclusion seems to me banal, which is not to say that it is incorrect. It amounts to saying that we will not remember what we do not believe we need to remember and that, when we have outsourced our memory, we will take some care to learn how we might access it in the future.

Of course, when the Google myth dominates a society, will we believe that there is anything at all that we ought to commit to memory? The Google myth in this case is the belief that every conceivable bit of knowledge we could ever possibly desire is just a Google search away.

The sort of analysis Noreen offers, which is not uncommon, is based on an assumption we should examine more closely and also leaves a critical consideration unaddressed.

The assumption is that there are no distinctions within the category of memory. All memories are assumed to be discrete facts of the sort one would need to know in order to do well on Jeopardy. But this assumption ignores the diversity of what we call memories and the diversity of functions to which memory is put. Here is how I framed the matter some years back:

All of this leads me to ask, What assumptions are at play that make it immediately plausible for so many to believe that we can move from internalized memory to externalized memory without remainder? It would seem, at least, that the ground was prepared by an earlier reduction of knowledge to information or data. Only when we view knowledge as the mere aggregation of discrete bits of data can we then believe that it makes little difference whether that data is stored in the mind or in a database.

We seem to be approaching knowledge as if life were a game of Jeopardy which is played well by merely being able to access trivial knowledge at random.  What is lost is the associational dimension of knowledge which constructs meaning and understanding by relating one thing to another and not merely by aggregating data.  This form of knowledge, which we might call metaphorical or analogical, allows us to experience life with the ability to “understand in light of,” to perceive through a rich store of knowledge and experience that allows us to see and make connections that richly texture and layer our experience of reality.

But this understanding of memory seems largely absent from the sorts of studies that are frequently cited in discussions of offloaded or outsourced memory. I’ll add another relevant consideration I’ve previously articulated: a silent equivocation slips into these discussions, for the notion of memory we tend to assume is one derived by comparison to computer memory, which is essentially storage.

Having first identified a computer’s storage capacity as “memory,” a metaphor dependent upon the human capacity we call “memory,” we have now come to reverse the direction of the metaphor by understanding human “memory” in light of a computer’s storage capacity. In other words, we’ve reduced our understanding of memory to the mere storage of information. And now we read all discussions of memory in light of this reductive understanding.

As for the unaddressed critical consideration, if we grant that we must all outsource or externalize some of our memory, and that it may even be admittedly advantageous to do so, how do we make qualitative judgments about the memory that we can outsource to our benefit and the memory we should on principle internalize (if we even allow for the latter possibility)?

Here we might take a cue from the religious practices of Jews, Christians, and Muslims, who have long made the memorization of Scripture a central component of their respective forms of piety. Here’s a bit more from Steiner commenting on what can be known about early modern literacy:

Scriptural and, in a wider sense, religious literacy ran strong, particularly in Protestant lands. The Authorized Version and Luther’s Bible carried in their wake a rich tradition of symbolic, allusive, and syntactic awareness. Absorbed in childhood, the Book of Common Prayer, the Lutheran hymnal and psalmody cannot but have marked a broad compass of mental life with their exact, stylized articulateness and music of thought. Habits of communication and schooling, moreover, sprang directly from the concentration of memory. So much was learned and known by heart — a term beautifully apposite to the organic, inward presentness of meaning and spoken being within the individual spirit.

Learned by heart–a beautifully apt phrase, indeed. Interestingly, this is an aspect of religious practice that, while remaining relatively consistent across the transition from oral to literate society, appears to be succumbing to the pressures of the Google myth, at least among Protestants. If I have an app that lets me instantly access any passage of my sacred text, in any of a hundred different translations, why would I bother to memorize any of it?

The answer, of course, best and perhaps only learned by personal experience, is that there is a qualitative difference between the “organic, inward presentness of meaning” that Steiner describes and merely knowing that I know how to find a text if I were inclined to find it. But the Google myth, and the studies that examine it, seem to know nothing of that qualitative difference, or, at least, they choose to bracket it.

I should note in passing that much of what I have recently written about attention is also relevant here. Distraction is the natural state of someone who has no goal that might otherwise command or direct their attention. Likewise, forgetfulness is the natural state of someone who has no compelling reason to commit something to memory. At the heart of both states may be the liberated individual will yielded by modernity. Distraction and forgetfulness seem both to stem from a refusal to acknowledge an order of knowing that is outside of and independent of the solitary self. To discipline our attention and to learn something by heart is, in no small measure, to submit the self to something beyond its own whims and prerogatives.

So, then, we might say that one of the enduring consequences of new forms of externalized memory is not only that they alter the quantity of what is committed to memory but that they also reconfigure the meaning and value that we assign to both the work of remembering and to what is remembered. In this way we begin to see why Illich believed that changing memories amounted to changing cultures. This is also why we should consider that Plato’s Socrates was on to something more than critics give him credit for when he criticized writing for how it would affect memory, which was for Plato much more than merely the ability to recall discrete bits of data.

This last point brings me, finally, to an excellent discussion of these matters by John Danaher. Danaher is always clear and meticulous in his writing and I commend his blog, Philosophical Disquisitions, to you. In this post, he explores the externalization of memory via a discussion of a helpful distinction offered by David Krakauer of the Santa Fe Institute. Here is Danaher’s summary of the distinction between two different types of cognitive artifacts, or artifacts we think with:

Complementary Cognitive Artifacts: These are artifacts that complement human intelligence in such a way that their use amplifies and improves our ability to perform cognitive tasks and once the user has mastered the physical artifact they can use a virtual/mental equivalent to perform the same cognitive task at a similar level of skill, e.g. an abacus.

Competitive Cognitive Artifacts: These are artifacts that amplify and improve our abilities to perform cognitive tasks when we have use of the artifact but when we take away the artifact we are no better (and possibly worse) at performing the cognitive task than we were before.

Danaher critically interacts with Krakauer’s distinction, but finds it useful. It is useful because, like Albert Borgmann’s work, it offers to us concepts and categories by which we might begin to evaluate the sorts of trade-offs we must make when deciding what technologies we will use and how.

Also of interest is Danaher’s discussion of cognitive ecology. Invoking earlier work by Donald Norman, Danaher explains that “competitive cognitive artifacts don’t just replace or undermine one cognitive task. They change the cognitive ecology, i.e. the social and physical environment in which we must perform cognitive tasks.” His critical consideration of the concept of cognitive ecology brings him around to the wonderful work Evan Selinger has been doing on the problem of technological outsourcing, work that I’ve cited here on more than a few occasions. I commend to you Danaher’s post for both its content and its method. It will be more useful to you than the vast majority of commentary you might otherwise encounter on this subject.

I’ll leave you with the following observation by the filmmaker Luis Buñuel: “Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing.” Let us take some care and give some thought, then, to how our tools shape our remembering.

 

Attention and the Moral Life

I’ve continued to think about a question raised by Frank Furedi in an otherwise lackluster essay about distraction and digital devices. Furedi set out to debunk the claim that digital devices are undermining our attention and our memory. I don’t think he succeeded, but he left us with a question worth considering: “The question that is rarely posed by advocates of the distraction thesis is: what are people distracted from?”

In an earlier post, I suggested that this question can be usefully set alongside a mid-20th century observation by Hannah Arendt. Considering the advent of automation, Arendt feared “the prospect of a society of laborers without labor, that is, without the only activity left to them.” “Surely, nothing could be worse,” she added.

The connection might not have been as clear as I imagined it, so let me explain. Arendt believed that labor is the “only activity left” to the laborer because the glorification of labor in modern society had eclipsed the older ends and goods to which labor had been subordinated and for the sake of which we might have sought freedom from labor.

To put it as directly as I can, Arendt believed that if we indeed found ourselves liberated from the need to labor, we would not know what to do with ourselves. We would not know what to do with ourselves because, in the modern world, laboring had become the ordering principle of our lives.

Recalling Arendt’s fear, I wondered whether we were not in a similar situation with regards to attention. If we were able to successfully challenge the regime of digital distraction, to what would we give the attention that we would have fought so hard to achieve? Would we be like the laborers in Arendt’s analysis, finally free but without anything to do with our freedom? I wondered, as well, if it were not harder to combat distraction, if we were inclined to do so, precisely because we had no telos for the sake of which we might undertake the struggle.

Interestingly, then, while the link between Arendt’s comments about labor and the question about the purpose of attention was initially only suggestive, I soon realized the two were more closely connected. They were connected by the idea of leisure.

We tend to think of leisure merely as an occasional break from work. That is not, however, how leisure was understood in either classical or medieval culture. Josef Pieper, a Catholic philosopher and theologian, was thinking about the cultural ascendency of labor or work and the eclipse of leisure around the same time that Arendt was articulating her fears of a society of laborers without labor. In many respects, their analyses overlap. (I should note, though, that Arendt distinguishes between labor and work in a way that Pieper does not. Work for Pieper is roughly analogous to labor in Arendt’s taxonomy.)

For her part, Arendt believed nothing could be worse than liberating laborers from labor at this stage in our cultural evolution, and this is why:

“The modern age has carried with it a theoretical glorification of labor and has resulted in a factual transformation of the whole of society into a laboring society.  The fulfillment of the wish, therefore, like the fulfillment of wishes in fairy tales, comes at a moment when it can only be self-defeating. It is a society of laborers which is about to be liberated from the fetters of labor, and this society does no longer know of those other higher and more meaningful activities for the sake of which this freedom would deserve to be won. Within this society, which is egalitarian because this is labor’s way of making men live together, there is no class left, no aristocracy of either a political or spiritual nature from which a restoration of the other capacities of man could start anew.”

To say that there is “no aristocracy of either a political or spiritual nature” is another way of saying that there is no leisured class in the older sense of the word. This older ideal of leisure did not entail freedom from labor for the sake of endless poolside lounging while sipping Coronas. It was freedom from labor for the sake of intellectual, political, moral, or spiritual aims, the achievement of which may very well require arduous discipline. We might say that it was freedom from the work of the body that made it possible for someone to take up the work of the soul or the mind. Thus Pieper can claim that leisure is “a condition of the soul.” But, we should also note, it was not necessarily a solitary endeavor, or, better, it was not an endeavor that had only the good of the individual in mind. It often involved service to the political or spiritual community.

Pieper further defines leisure as “a form of that stillness that is the necessary preparation for accepting reality; only the person who is still can hear, and whoever is not still cannot hear.” He makes clear, though, that the stillness he has in mind “is not mere soundlessness or a dead muteness; it means, rather, that the soul’s power, as real, of responding to the real – a co-respondence, eternally established in nature – has not yet descended into words.” Thus, leisure “is the disposition of receptive understanding, of contemplative beholding, and immersion – in the real.”

Pieper also claims that leisure “is only possible on the assumption that man is not only in harmony with himself, whereas idleness is rooted in the denial of this harmony, but also that he is in agreement with the world and its meaning. Leisure lives on affirmation.” The passing comment on idleness is especially useful to us.

In our view, leisure and idleness are nearly indistinguishable. But on the older view, idleness is not leisure; indeed, it is the enemy of leisure. Idleness, on the older view, may even take the shape of frenzied activity undertaken for the sake of, yes, distracting us from the absence of harmony or agreement with ourselves and the world.

We are now inevitably within the orbit of Blaise Pascal’s analysis of the restlessness of the human condition. Because we are not at peace with ourselves or our world, we crave distraction or what he called diversions. “What people want,” Pascal insists, “is not the easy peaceful life that allows us to think of our unhappy condition, nor the dangers of war, nor the burdens of office, but the agitation that takes our mind off it and diverts us.” “Nothing could be more wretched,” Pascal added, “than to be intolerably depressed as soon as one is reduced to introspection with no means of diversion.”

The novelist Walker Percy, a younger contemporary of both Arendt and Pieper, described what he called the “diverted self” as follows: “In a free and affluent society, the self is free to divert itself endlessly from itself. It works in order to enjoy the diversions that the fruit of one’s labor can purchase.” For the diverted self, Percy concluded, “The pursuit of happiness becomes the pursuit of diversion.”

If leisure is a condition of the soul, as Pieper would have it, then might we also say the same of distraction? Discrete instances of being distracted, of failing to meaningfully direct our attention, would then be symptoms of a deeper disorder. Our digital devices, in this framing of distraction, are both a material cause and an effect. The absence of digital devices would not cure us of the underlying distractedness or aimlessness, but their presence preys upon, exacerbates, and amplifies this inner distractedness.

It is hard, at this point, for me not to feel that I have been speaking in another language or at least another dialect, one whose cadences and lexical peculiarities are foreign to our own idiom and, consequently, to our way of making sense of our experience. Leisure, idleness, contemplative beholding, spiritual and political aristocracies: all of this recalls Alasdair MacIntyre’s observation that we use such words in much the same way that a post-apocalyptic society, picking up the scattered pieces of the modern scientific enterprise, would use “neutrino,” “mass,” and “specific gravity”: not entirely without meaning, perhaps, but certainly not as scientists would. The language I’ve employed, likewise, is the language of an older moral vision, a moral vision that we have lost.

I’m not suggesting that we ought to seek to recover the fullness of the language or the world that gave it meaning. That would not be possible, of course. But what if we, nonetheless, desired to bring a measure of order to the condition of distraction that we might experience as an affliction? What if we sought some telos to direct and sustain our attention, to at least buffer us from the forces of distraction?

If such is the case, I commend to you Simone Weil’s reflections on attention and will. Believing that the skill of paying attention cultivated in one domain was transferable to another, Weil went so far as to claim that the cultivation of attention was the real goal of education: “Although people seem to be unaware of it today, the development of the faculty of attention forms the real object and almost the sole interest of studies.”

It was Weil who wrote, “Attention is the rarest and purest form of generosity.” A beautiful sentiment grounded in a deeply moral understanding of attention. Attention, for Weil, was not merely an intellectual asset, what we require for the sake of reading long, dense novels. Rather, for Weil, attention appears to be something foundational to the moral life:

“There is something in our soul that loathes true attention much more violently than flesh loathes fatigue. That something is much closer to evil than flesh is. That is why, every time we truly give our attention, we destroy some evil in ourselves.”

Ultimately, Weil understood attention to be a critical component of the religious life as well. “Attention, taken to its highest degree,” Weil wrote, “is the same thing as prayer. It presupposes faith and love.” “If we turn our mind toward the good,” she added, “it is impossible that little by little the whole soul will not be attracted thereto in spite of itself.” And this is because, in her view, “We have to try to cure our faults by attention and not by will.”

So here we have, if we wanted it, something to animate our desire to discipline the distracted self, something at which to direct our attention. Weil’s counsel was echoed closer to our own time by David Foster Wallace, who also located the goal of education in the cultivation of attention.

“Learning how to think really means learning how to exercise some control over how and what you think,” Wallace explained in his now famous commencement address at Kenyon College. “It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience.”

“The really important kind of freedom,” Wallace added, “involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day. That is real freedom. That is being educated, and understanding how to think.” Each day the truth of this claim impresses itself more and more deeply upon my mind and heart.

Finally, and briefly, we should be wary of imagining the work of cultivating attention as merely a matter of learning how to consciously choose what we will attend to at any given moment. That is part of it to be sure, but Weil and Pieper both knew that attention also involved an openness to what is, a capacity to experience the world as gift. Cultivating our attention in this sense is not a matter of focusing upon an object of attention for our own reasons, however noble those may be. It is also a matter of setting to one side our projects and aspirations that we might be surprised by what is there. “We do not obtain the most precious gifts by going in search of them,” Weil wrote, “but by waiting for them.” In this way, we prepare for “some dim dazzling trick of grace,” to borrow a felicitous phrase from Walker Percy, that may illumine our minds and enliven our hearts.

It is these considerations, then, that I would offer in response to Furedi’s question, What are we distracted from?