How to Think About Memory and Technology

I suppose we derive some pleasure from imagining ourselves to be part of a beleaguered but noble minority. This may explain why a techno-enthusiast finds it necessary to attack dystopian science fiction on the grounds that it is making us all fear technology, a notion I find ludicrous.

Likewise, Salma Noreen closes her discussion of the internet’s effect on memory with the following counsel: “Rather than worrying about what we have lost, perhaps we need to focus on what we have gained.” I find that a curious note on which to close because I tend to think that we are not sufficiently concerned about what we have lost or what we may be losing as we steam full speed ahead into our technological futures. But perhaps I also am not immune to the consolations of belonging to an imagined beleaguered community of my own.

So which is it? Are we a society of techno-skeptics with brave, intrepid techno-enthusiasts on the fringes stiffening our resolve to embrace the happy technological future that can be ours for the taking? Or are we a society of techno-enthusiasts for whom the warnings of the few techno-skeptics are nothing more than a distant echo from an ever-receding past?

I suspect the latter is closer to the truth, but you can tell me how things look from where you’re standing.

My main concern is to look more closely at Noreen’s discussion of memory, which is a topic of abiding interest to me. “What anthropologists distinguish as ‘cultures,’” Ivan Illich wrote, “the historian of mental spaces might distinguish as different ‘memories.’” And I rather think he was right. Along similar lines, and in the early 1970s, George Steiner lamented, “The catastrophic decline of memorization in our own modern education and adult resources is one of the crucial, though as yet little understood, symptoms of an afterculture.” We’ll come back to more of what Steiner had to say a bit further on, but first let’s consider Noreen’s article.

She mentions two studies as a foil to her eventual conclusion. The first suggests that “the internet is leading to ‘digital amnesia’, where individuals are no longer able to retain information as a result of storing information on a digital device,” and the second suggests “that relying on digital devices to remember information is impairing our own memory systems.”

“But,” Noreen counsels her readers, “before we mourn this apparent loss of memory, more recent studies suggest that we may be adapting.” And in what, exactly, does this adaptation consist? Noreen summarizes it this way: “Technology has changed the way we organise information so that we only remember details which are no longer available, and prioritise the location of information over the content itself.”

This conclusion seems to me banal, which is not to say that it is incorrect. It amounts to saying that we will not remember what we do not believe we need to remember and that, when we have outsourced our memory, we will take some care to learn how we might access it in the future.

Of course, when the Google myth dominates a society, will we believe that there is anything at all that we ought to commit to memory? The Google myth, in this case, is the belief that every conceivable bit of knowledge we could ever possibly desire is just a Google search away.

The sort of analysis Noreen offers, which is not uncommon, is based on an assumption we should examine more closely and also leaves a critical consideration unaddressed.

The assumption is that there are no distinctions within the category of memory. All memories are assumed to be discrete facts of the sort one would need to know in order to do well on Jeopardy. But this assumption ignores the diversity of what we call memories and the diversity of functions to which memory is put. Here is how I framed the matter some years back:

All of this leads me to ask, What assumptions are at play that make it immediately plausible for so many to believe that we can move from internalized memory to externalized memory without remainder? It would seem, at least, that the ground was prepared by an earlier reduction of knowledge to information or data. Only when we view knowledge as the mere aggregation of discrete bits of data can we then believe that it makes little difference whether that data is stored in the mind or in a database.

We seem to be approaching knowledge as if life were a game of Jeopardy which is played well by merely being able to access trivial knowledge at random.  What is lost is the associational dimension of knowledge which constructs meaning and understanding by relating one thing to another and not merely by aggregating data.  This form of knowledge, which we might call metaphorical or analogical, allows us to experience life with the ability to “understand in light of,” to perceive through a rich store of knowledge and experience that allows us to see and make connections that richly texture and layer our experience of reality.

But this understanding of memory seems largely absent from the sorts of studies that are frequently cited in discussions of offloaded or outsourced memory. I’ll add another relevant consideration I’ve previously articulated: a silent equivocation slips into these discussions. The notion of memory we tend to assume is our current understanding of memory, derived by comparison to computer memory, which is essentially storage.

Having first identified a computer’s storage capacity as “memory,” a metaphor dependent upon the human capacity we call “memory,” we have now come to reverse the direction of the metaphor by understanding human “memory” in light of a computer’s storage capacity. In other words, we’ve reduced our understanding of memory to the mere storage of information. And now we read all discussions of memory in light of this reductive understanding.

As for the unaddressed critical consideration: if we grant that we must all outsource or externalize some of our memory, and that doing so may even be advantageous, how do we make qualitative judgments about which memories we can outsource to our benefit and which we should on principle internalize (if we even allow for the latter possibility)?

Here we might take a cue from the religious practices of Jews, Christians, and Muslims, who have long made the memorization of Scripture a central component of their respective forms of piety. Here’s a bit more from Steiner commenting on what can be known about early modern literacy:

Scriptural and, in a wider sense, religious literacy ran strong, particularly in Protestant lands. The Authorized Version and Luther’s Bible carried in their wake a rich tradition of symbolic, allusive, and syntactic awareness. Absorbed in childhood, the Book of Common Prayer, the Lutheran hymnal and psalmody cannot but have marked a broad compass of mental life with their exact, stylized articulateness and music of thought. Habits of communication and schooling, moreover, sprang directly from the concentration of memory. So much was learned and known by heart — a term beautifully apposite to the organic, inward presentness of meaning and spoken being within the individual spirit.

Learned by heart–a beautifully apt phrase, indeed. Interestingly, this is an aspect of religious practice that, while remaining relatively consistent across the transition from oral to literate society, appears to be succumbing to the pressures of the Google myth, at least among Protestants. If I have an app that lets me instantly access any passage of my sacred text, in any of a hundred different translations, why would I bother to memorize any of it?

The answer, of course, best and perhaps only learned by personal experience, is that there is a qualitative difference between the “organic, inward presentness of meaning” that Steiner describes and merely knowing that I know how to find a text if I were inclined to find it. But the Google myth, and the studies that examine it, seem to know nothing of that qualitative difference, or, at least, they choose to bracket it.

I should note in passing that much of what I have recently written about attention is also relevant here. Distraction is the natural state of someone who has no goal that might otherwise command or direct their attention. Likewise, forgetfulness is the natural state of someone who has no compelling reason to commit something to memory. At the heart of both states may be the liberated individual will yielded by modernity. Distraction and forgetfulness seem both to stem from a refusal to acknowledge an order of knowing that is outside of and independent of the solitary self. To discipline our attention and to learn something by heart is, in no small measure, to submit the self to something beyond its own whims and prerogatives.

So, then, we might say that one of the enduring consequences of new forms of externalized memory is not only that they alter the quantity of what is committed to memory but that they also reconfigure the meaning and value that we assign to both the work of remembering and to what is remembered. In this way we begin to see why Illich believed that changing memories amounted to changing cultures. This is also why we should consider that Plato’s Socrates was on to something more than critics give him credit for when he criticized writing for how it would affect memory, which was for Plato much more than merely the ability to recall discrete bits of data.

This last point brings me, finally, to an excellent discussion of these matters by John Danaher. Danaher is always clear and meticulous in his writing and I commend his blog, Philosophical Disquisitions, to you. In this post, he explores the externalization of memory via a discussion of a helpful distinction offered by David Krakauer of the Santa Fe Institute. Here is Danaher’s summary of the distinction between two different types of cognitive artifacts, or artifacts we think with:

Complementary Cognitive Artifacts: These are artifacts that complement human intelligence in such a way that their use amplifies and improves our ability to perform cognitive tasks and once the user has mastered the physical artifact they can use a virtual/mental equivalent to perform the same cognitive task at a similar level of skill, e.g. an abacus.

Competitive Cognitive Artifacts: These are artifacts that amplify and improve our abilities to perform cognitive tasks when we have use of the artifact but when we take away the artifact we are no better (and possibly worse) at performing the cognitive task than we were before.

Danaher critically interacts with Krakauer’s distinction, but finds it useful. It is useful because, like Albert Borgmann’s work, it offers to us concepts and categories by which we might begin to evaluate the sorts of trade-offs we must make when deciding what technologies we will use and how.

Also of interest is Danaher’s discussion of cognitive ecology. Invoking earlier work by Donald Norman, Danaher explains that “competitive cognitive artifacts don’t just replace or undermine one cognitive task. They change the cognitive ecology, i.e. the social and physical environment in which we must perform cognitive tasks.” His critical consideration of the concept of cognitive ecology brings him around to the wonderful work Evan Selinger has been doing on the problem of technological outsourcing, work that I’ve cited here on more than a few occasions. I commend to you Danaher’s post for both its content and its method. It will be more useful to you than the vast majority of commentary you might otherwise encounter on this subject.

I’ll leave you with the following observation by the filmmaker Luis Bunuel: “Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing.” Let us take some care and give some thought, then, to how our tools shape our remembering.

 

Attention and the Moral Life

I’ve continued to think about a question raised by Frank Furedi in an otherwise lackluster essay about distraction and digital devices. Furedi set out to debunk the claim that digital devices are undermining our attention and our memory. I don’t think he succeeded, but he left us with a question worth considering: “The question that is rarely posed by advocates of the distraction thesis is: what are people distracted from?”

In an earlier post, I suggested that this question can be usefully set alongside a mid-20th century observation by Hannah Arendt. Considering the advent of automation, Arendt feared “the prospect of a society of laborers without labor, that is, without the only activity left to them.” “Surely, nothing could be worse,” she added.

The connection might not have been as clear as I imagined it, so let me explain. Arendt believed that labor is the “only activity left” to the laborer because the glorification of labor in modern society had eclipsed the older ends and goods to which labor had been subordinated and for the sake of which we might have sought freedom from labor.

To put it as directly as I can, Arendt believed that if we indeed found ourselves liberated from the need to labor, we would not know what to do with ourselves. We would not know what to do with ourselves because, in the modern world, laboring had become the ordering principle of our lives.

Recalling Arendt’s fear, I wondered whether we were not in a similar situation with regard to attention. If we were able to successfully challenge the regime of digital distraction, to what would we give the attention that we would have fought so hard to achieve? Would we be like the laborers in Arendt’s analysis, finally free but without anything to do with our freedom? I wondered, as well, if it were not harder to combat distraction, if we were inclined to do so, precisely because we had no telos for the sake of which we might undertake the struggle.

Interestingly, then, while the link between Arendt’s comments about labor and the question about the purpose of attention was initially only suggestive, I soon realized the two were more closely connected. They were connected by the idea of leisure.

We tend to think of leisure merely as an occasional break from work. That is not, however, how leisure was understood in either classical or medieval culture. Josef Pieper, a Catholic philosopher and theologian, was thinking about the cultural ascendancy of labor or work and the eclipse of leisure around the same time that Arendt was articulating her fears of a society of laborers without labor. In many respects, their analysis overlaps. (I should note, though, that Arendt distinguishes between labor and work in a way that Pieper does not. Work for Pieper is roughly analogous to labor in Arendt’s taxonomy.)

For her part, Arendt believed nothing could be worse than liberating laborers from labor at this stage in our cultural evolution, and this is why:

“The modern age has carried with it a theoretical glorification of labor and has resulted in a factual transformation of the whole of society into a laboring society.  The fulfillment of the wish, therefore, like the fulfillment of wishes in fairy tales, comes at a moment when it can only be self-defeating. It is a society of laborers which is about to be liberated from the fetters of labor, and this society does no longer know of those other higher and more meaningful activities for the sake of which this freedom would deserve to be won. Within this society, which is egalitarian because this is labor’s way of making men live together, there is no class left, no aristocracy of either a political or spiritual nature from which a restoration of the other capacities of man could start anew.”

To say that there is “no aristocracy of either a political or spiritual nature” is another way of saying that there is no leisured class in the older sense of the word. This older ideal of leisure did not entail freedom from labor for the sake of endless poolside lounging while sipping Coronas. It was freedom from labor for the sake of intellectual, political, moral, or spiritual aims, the achievement of which may very well require arduous discipline. We might say that it was freedom from the work of the body that made it possible for someone to take up the work of the soul or the mind. Thus Pieper can claim that leisure is “a condition of the soul.” But, we should also note, it was not necessarily a solitary endeavor, or, better, it was not an endeavor that had only the good of the individual in mind. It often involved service to the political or spiritual community.

Pieper further defines leisure as “a form of that stillness that is the necessary preparation for accepting reality; only the person who is still can hear, and whoever is not still cannot hear.” He makes clear, though, that the stillness he has in mind “is not mere soundlessness or a dead muteness; it means, rather, that the soul’s power, as real, of responding to the real – a co-respondence, eternally established in nature – has not yet descended into words.” Thus, leisure “is the disposition of receptive understanding, of contemplative beholding, and immersion – in the real.”

Pieper also claims that leisure “is only possible on the assumption that man is not only in harmony with himself, whereas idleness is rooted in the denial of this harmony, but also that he is in agreement with the world and its meaning. Leisure lives on affirmation.” The passing comment on idleness is especially useful to us.

In our view, leisure and idleness are nearly indistinguishable. But on the older view, idleness is not leisure; indeed, it is the enemy of leisure. Idleness, on the older view, may even take the shape of frenzied activity undertaken for the sake of, yes, distracting us from the absence of harmony or agreement with ourselves and the world.

We are now inevitably within the orbit of Blaise Pascal’s analysis of the restlessness of the human condition. Because we are not at peace with ourselves or our world, we crave distraction or what he called diversions. “What people want,” Pascal insists, “is not the easy peaceful life that allows us to think of our unhappy condition, nor the dangers of war, nor the burdens of office, but the agitation that takes our mind off it and diverts us.” “Nothing could be more wretched,” Pascal added, “than to be intolerably depressed as soon as one is reduced to introspection with no means of diversion.”

The novelist Walker Percy, a younger contemporary of both Arendt and Pieper, described what he called the “diverted self” as follows: “In a free and affluent society, the self is free to divert itself endlessly from itself. It works in order to enjoy the diversions that the fruit of one’s labor can purchase.” For the diverted self, Percy concluded, “The pursuit of happiness becomes the pursuit of diversion.”

If leisure is a condition of the soul as Pieper would have it, then might we also say the same of distraction? Discrete instances of being distracted, of failing to meaningfully direct our attention, would then be symptoms of a deeper disorder. Our digital devices, in this framing of distraction, are both a material cause and an effect. The absence of digital devices would not cure us of the underlying distractedness or aimlessness, but their presence preys upon, exacerbates, and amplifies this inner distractedness.

It is hard, at this point, for me not to feel that I have been speaking in another language or at least another dialect, one whose cadences and lexical peculiarities are foreign to our own idiom and, consequently, to our way of making sense of our experience. Leisure, idleness, contemplative beholding, spiritual and political aristocracies–all of this recalls to mind Alasdair MacIntyre’s observation that we use such words in much the same way that a post-apocalyptic society, picking up the scattered pieces of the modern scientific enterprise, would use “neutrino,” “mass,” and “specific gravity”: not entirely without meaning, perhaps, but certainly not as scientists. The language I’ve employed, likewise, is the language of an older moral vision, a moral vision that we have lost.

I’m not suggesting that we ought to seek to recover the fullness of the language or the world that gave it meaning. That would not be possible, of course. But what if we, nonetheless, desired to bring a measure of order to the condition of distraction that we might experience as an affliction? What if we sought some telos to direct and sustain our attention, to at least buffer us from the forces of distraction?

If such is the case, I commend to you Simone Weil’s reflections on attention and will. Believing that the skill of paying attention cultivated in one domain was transferable to another, Weil went so far as to claim that the cultivation of attention was the real goal of education: “Although people seem to be unaware of it today, the development of the faculty of attention forms the real object and almost the sole interest of studies.”

It was Weil who wrote, “Attention is the rarest and purest form of generosity.” A beautiful sentiment grounded in a deeply moral understanding of attention. Attention, for Weil, was not merely an intellectual asset, what we require for the sake of reading long, dense novels. Rather, for Weil, attention appears to be something foundational to the moral life:

“There is something in our soul that loathes true attention much more violently than flesh loathes fatigue. That something is much closer to evil than flesh is. That is why, every time we truly give our attention, we destroy some evil in ourselves.”

Ultimately, Weil understood attention to be a critical component of the religious life as well. “Attention, taken to its highest degree,” Weil wrote, “is the same thing as prayer. It presupposes faith and love.” “If we turn our mind toward the good,” she added, “it is impossible that little by little the whole soul will not be attracted thereto in spite of itself.” And this is because, in her view, “We have to try to cure our faults by attention and not by will.”

So here we have, if we wanted it, something to animate our desire to discipline the distracted self, something at which to direct our attention. Weil’s counsel was echoed closer to our own time by David Foster Wallace, who also located the goal of education in the cultivation of attention.

“Learning how to think really means learning how to exercise some control over how and what you think,” Wallace explained in his now famous commencement address at Kenyon College. “It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience.”

“The really important kind of freedom,” Wallace added, “involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day. That is real freedom. That is being educated, and understanding how to think.” Each day the truth of this claim impresses itself more and more deeply upon my mind and heart.

Finally, and briefly, we should be wary of imagining the work of cultivating attention as merely a matter of learning how to consciously choose what we will attend to at any given moment. That is part of it to be sure, but Weil and Pieper both knew that attention also involved an openness to what is, a capacity to experience the world as gift. Cultivating our attention in this sense is not a matter of focusing upon an object of attention for our own reasons, however noble those may be. It is also a matter of setting to one side our projects and aspirations that we might be surprised by what is there. “We do not obtain the most precious gifts by going in search of them,” Weil wrote, “but by waiting for them.” In this way, we prepare for “some dim dazzling trick of grace,” to borrow a felicitous phrase from Walker Percy, that may illumine our minds and enliven our hearts.

It is these considerations, then, that I would offer in response to Furedi’s question, What are we distracted from?

The Consolations of a Technologically Re-enchanted World

Navneet Alang writes about digital culture with a rare combination of insight and eloquence. In a characteristically humane meditation on the perennial longings expressed by our use of social media and digital devices, Alang recounts a brief exchange he found himself having with Alexa, the AI assistant that accompanies Amazon Echo.

Alang had asked Alexa about the weather while he was traveling in an unfamiliar city. Alexa alerted him to the forecasted rain, and, without knowing why exactly, Alang thanked the device. “No problem,” Alexa replied.

It was Alang’s subsequent reflection on that exchange that I found especially interesting:

In retrospect, I had what was a very strange reaction: a little jolt of pleasure. Perhaps it was because I had mostly spent those two weeks alone, but Alexa’s response was close enough to the outline of human communication to elicit a feeling of relief in me. For a moment, I felt a little less lonely.

From there, Alang considers apps which allow users to anonymously publish their secrets to the world or to the void–who can tell–and little-used social media sites on which users compose surprisingly revealing messages seemingly directed at no one in particular. A reminder that, as Elizabeth Stoker Bruenig has noted, “Confession, once rooted in religious practice, has assumed a secular importance that can be difficult to describe.”

Part of what makes the effort to understand technology so fascinating and challenging is that we are not, finally, trying to understand discrete artifacts or even expansive systems; what we are really trying to understand is the human condition, alternatively and sometimes simultaneously expressed, constituted, and frustrated by our use of all that we call technology.

As Alang notes near the end of his essay, “what digital technologies do best, to our benefit and detriment, is to act as a canvas for our desires.” And, in his discussion, social media and confessional apps express “a wish to be seen, to be heard, to be apprehended as nothing less than who we imagine ourselves to be.” In the most striking paragraph of the piece, Alang expands on this point:

“Perhaps, then, that Instagram shot or confessional tweet isn’t always meant to evoke some mythical, pretend version of ourselves, but instead seeks to invoke the imagined perfect audience—the non-existent people who will see us exactly as we want to be seen. We are not curating an ideal self, but rather, an ideal Other, a fantasy in which our struggle to become ourselves is met with the utmost empathy.”

This strikes me as being rather near the mark. We might also consider the possibility that we seek this ideal Other precisely so that we might receive back from it a more coherent version of ourselves. The empathetic Other who comes to know me may then tell me what I need to know about myself. A trajectory begins to come into focus taking up both the confessional booth and the therapist’s office. Perhaps this presses the point too far, I don’t know. It is, in any case, a promise implicit in the rhetoric of Big Data, that it is the Other that knows us better than we know ourselves. If, to borrow St. Augustine’s formulation, we have become a question to ourselves, then the purveyors of Big Data proffer to us the answer.

It also strikes me that the yearning Alang describes, in another era, would have been understood chiefly as a deeply religious longing. We may see it as fantasy, or, as C.S. Lewis once put it, we may see it as “the truest index of our real situation.”

Interestingly, the paragraph from which that line is taken may bring us back to where we started: with Alang deriving a “little jolt of pleasure” from his exchange with Alexa. Here is the rest of it:

“Apparently, then, our lifelong nostalgia, our longing to be reunited with something in the universe from which we now feel cut off, to be on the inside of some door which we have always seen from the outside, is no mere neurotic fancy, but the truest index of our real situation.”

For some time now, I’ve entertained the idea that the combination of technologies that promises to animate our mute and unresponsive material environment–think Internet of Things, autonomous machines, augmented reality, AI–entice us with a re-enchanted world: the human-built world, technologically enchanted. Which is to say a material world that flatters us by appearing to be responsive to our wishes and desires, even speaking to us when spoken to–in short, noting us and thereby marginally assuaging the loneliness for which our social media posts are just another sort of therapy.

Maybe the Kids Aren’t Alright

Consider the following statements regarding the place of digital media in the lives of a cohort of thirteen-year-olds:

“One teenager, Fesse, was usually late – partly because he played Xbox till late into the night ….”

“We witnessed a fair number of struggles to make the technology work, or sometimes to engage pupils with digital media in the classroom.”

“Homework was often accompanied by Facebook, partly as a distraction and partly for summoning help from friends. Some became quickly absorbed in computer games.”

“Adam [played] with people from the online multi-player game in which he could adopt an identity he felt was truly himself.”

“Megan worked on creating her private online space in Tumblr – hours passing by unnoticed.”

“Each found themselves drawn, to varying degrees, into their parents’ efforts to gather as a family, at supper, through shared hobbies, looking after pets, or simply chatting in front of the television – albeit each with phones or tablets at the ready – before peeling off in separate directions.”

“Digital devices and the uses they put them to have become teenagers’ way of asserting their agency – a shield from bossy parents or annoying younger siblings or seemingly critical teachers, a means to connect with sympathetic friends or catching up with ongoing peer ‘drama.'”

Okay, now what would be your initial thoughts about the state of affairs described by these statements? Generally speaking, presented with these observations about the lives of thirteen-year-olds, I think we might be forgiven a bit of concern. Sure, some of this describes the generally recognizable behavior of “teenagers” writ large, and nothing here necessarily suggests life-or-death matters, but, nonetheless, it seems to me that we might wish things were a touch different in some respects. At least, we might want a little more information about how these factors play out over the long run.

But the author framed these statements with these sorts of interpretative comments:

“… the more we know about teenagers’ lives the clearer it becomes that young people are no more interested in being constantly plugged in than are the adults around them.”

“As adults and parents, we might spend less time worrying about what they get up to as teenagers and more time with them, discussing the challenges that lie ahead for them as adults in an increasingly connected world.”

Couple that with the opening paragraph, which begins thus: “With each generation the public consciousness conjures up a new fear for our youth ….” There is no quicker way to signal that you are not at all concerned about something than by leading with “each generation, blah, blah, blah.”

When I first read this piece, I felt a certain dissonance, and I couldn’t quite figure out its source. After thinking about it a bit more, I realized that the dissonance arose from the incongruity between the cheery, “the kids are alright” tone of the article and what the article actually reported.

(I might add that part of my unease also regards methodology. Why would we think that the students were any more transparent with this adult researcher in their midst than they were with the teachers whose halting attempts to connect with them via digital media they hold in apparent contempt? Mind you, this may very well be addressed in a perfectly adequate manner by the author in the book that this article introduces.)

Let me be clear, I’m not calling for what is conventionally and dismissively referred to as a “moral panic.” But I don’t think our only options are “everything is going to hell” and “we live in a digital paradise, quit complaining.” And what is reported in this article suggests to me that we should not be altogether unconcerned about how digital media floods every aspect of our lives and the lives of our children.

To the author’s point that “the more we know about teenagers’ lives the clearer it becomes that young people are no more interested in being constantly plugged in than are the adults around them,” I reply, that’s a damnably low bar and, thus, little comfort.

And when the author preaches “As adults and parents, we might spend less time worrying about what they get up to as teenagers and more time with them, discussing the challenges that lie ahead for them as adults in an increasingly connected world,” I reply, that’s exactly what many adults and parents are trying to do, but many of them feel as if they are fighting a losing battle against the very thing you don’t want them to worry about.

One last thought: we are deeply invested in the comforting notion that “the kids are alright,” aren’t we? I’m not saying they are not or that they will not be alright, necessarily. I’m just not sure. Maybe some will and some won’t. Some of the very stories linked by the website to the article in question suggest that there are at least some troubling dimensions to the place of digital media in the lives of teens. I’ve spent the better part of the last fifteen years teaching teens in multiple contexts. In my experience, with a much larger data set, mind you, there are indeed reasons to be hopeful, but there are also reasons to be concerned. But never mind that; we really want to believe that they will be just fine regardless.

That desire to believe the “kids are alright” couples all too well with the desire to hold our technology innocent of all wrong. My technological habits are no different, maybe they’re worse, so if the kids are alright then so am I. Perhaps the deeper desire underlying these tendencies is the desire to hold ourselves blameless and deflect responsibility for our own actions. If the “kids are alright” no matter what we do or how badly we screw up, then I’ve got nothing to worry about as an adult and a parent. And if the technologies that I’ve allowed to colonize my life and theirs are never, ever to blame, then I can indulge in them to my heart’s content without so much as a twinge of compunction. I get a pass either way, and who doesn’t want that? But maybe the kids are not altogether alright, and maybe it is not altogether their fault but ours.

Finally, one last thought occurred to me. Do we even know what it would mean to be alright anymore? Sometimes I think all we’re aiming at is something like a never-ending and exhausting management of perpetual chaos. Maybe we’ve forgotten how our lives might be alternatively ordered. Maybe our social and cultural context inhibits us from pursuing a better ordered life. Perhaps out of resignation, perhaps for lack of imagination, perhaps because we lack the will, we dare not ask what might be the root causes of our disorders. If we did, we might find that some cherished and unquestioned value, like our own obsession with unbridled individual autonomy, might be complicit. Easier to go on telling ourselves that everything will be alright.

Digital Devices and Learning to Grow Up

Last week the NY Times ran the sort of op-ed on digital culture that the cultured despisers love to ridicule. In it, Jane Brody made a host of claims about the detrimental consequences of digital media consumption on children, especially the very young. She had the temerity, for example, to call texting the “next national epidemic.” Consider as well the following paragraphs:

“Two of my grandsons, ages 10 and 13, seem destined to suffer some of the negative effects of video-game overuse. The 10-year-old gets up half an hour earlier on school days to play computer games, and he and his brother stay plugged into their hand-held devices on the ride to and from school. ‘There’s no conversation anymore,’ said their grandfather, who often picks them up. When the family dines out, the boys use their devices before the meal arrives and as soon as they finish eating.

‘If kids are allowed to play ‘Candy Crush’ on the way to school, the car ride will be quiet, but that’s not what kids need,’ Dr. Steiner-Adair said in an interview. ‘They need time to daydream, deal with anxieties, process their thoughts and share them with parents, who can provide reassurance.’

Technology is a poor substitute for personal interaction.”

Poor lady, I thought, and a grandmother no less. She was in for the kind of thrashing from the digital sophisticates that is usually reserved for Sherry Turkle.

In truth, I didn’t catch too many reactions to the piece, but one did stand out. At The Awl, John Hermann summed up the critical responses with admirable brevity:

“But the argument presented in the first installment is also proudly unsophisticated, and doesn’t attempt to preempt obvious criticism. Lines like ‘technology is a poor substitute for personal interaction,’ and non-sequitur quotes from a grab-bag of experts, tee up the most common and effective response to fears of Screen Addiction: that what’s happening on all these screens is not, as the writer suggests, an endless braindead Candy Crush session, but a rich social experience of its own. That screen is full of friends, and its distraction is no less valuable or valid than the distraction of a room full of buddies or a playground full of fellow students. Screen Addiction is, in this view, nonsensical: you can no more be addicted to a screen than to windows, sounds, or the written word.”

But Hermann does not quite leave it at that: “This is an argument worth making, probably. But tell it to an anxious parent or an alienated grandparent and you will sense that it is inadequate.” The argument may be correct, but, Hermann explains, “Screen Addiction is a generational complaint, and generational complaints, taken individually, are rarely what they claim to be. They are fresh expressions of horrible and timeless anxieties.”

Hermann goes on to make the following poignant observations:

“The grandparent who is persuaded that screens are not destroying human interaction, but are instead new tools for enabling fresh and flawed modes of human interaction, is left facing a grimmer reality. Your grandchildren don’t look up from their phones because the experiences and friendships they enjoy there seem more interesting than what’s in front of them (you). Those experiences, from the outside, seem insultingly lame: text notifications, Emoji, selfies of other bratty little kids you’ve never met. But they’re urgent and real. What’s different is that they’re also right here, always, even when you thought you had an attentional claim. The moments of social captivity that gave parents power, or that gave grandparents precious access, are now compromised. The TV doesn’t turn off. The friends never go home. The grandkids can do the things they really want to be doing whenever they want, even while they’re sitting five feet away from grandma, alone, in a moving soundproof pod.”

To see a more celebratory presentation of these dynamics, recall this Facebook ad from 2013:

Hermann, of course, is less sanguine.

“Screen Addiction is a new way for kids to be blithe and oblivious; in this sense, it is empowering to the children, who have been terrible all along. The new grandparent’s dilemma, then, is both real and horribly modern. How, without coming out and saying it, do you tell that kid that you have things you want to say to them, or to give them, and that you’re going to die someday, and that they’re going to wish they’d gotten to know you better? Is there some kind of curiosity gap trick for adults who have become suddenly conscious of their mortality?”

“A new technology can be enriching and exciting for one group of people and create alienation for another;” Hermann concludes, “you don’t have to think the world is doomed to recognize that the present can be a little cruel.”

Well put.

I’m tempted to leave it at that, but I’m left wondering about the whole “generational complaint” business.

To say that something is a generational complaint suggests that we are dealing with old men yelling, “Get off my lawn!” It conjures up the image of hapless adults hopelessly out of sync with the brilliant exuberance of the young. It is, in other words, to dismiss whatever claim is being made. Granted, Hermann has given us a more sensitive and nuanced discussion of the matter, but even in his account too much ground is ceded to this kind of framing.

If we are dealing with a generational complaint, what exactly do we mean by that? Ostensibly that the old are lodging a predictable kind of complaint against the young, a complaint that amounts to little more than an unwillingness to comprehend the new or a desperate clinging to the familiar. Looked at this way, the framing implies that the old, by virtue of their age, are the ones out of step with reality.

But what if the generational complaint is framed rather as a function of coming into responsible adulthood? Hermann approaches this perspective when he writes, “Screen Addiction is a new way for kids to be blithe and oblivious; in this sense, it is empowering to the children, who have been terrible all along.” So when a person complains that they are being ignored by someone enthralled by their device, are they showing their age or merely demanding a basic degree of decency?

Yes, children are wont to be blithe and oblivious, often cruelly indifferent to the needs of others. Traditionally, we have sought to remedy that obliviousness and self-centeredness. Indeed, coming into adulthood more or less entails gaining some measure of control over our naturally self-centered impulses for our own good and for the sake of others. In this light, asking a child–whether age seven or thirty-seven–to lay their device aside long enough to acknowledge the presence of another human being is simply to ask them to grow up.

Others have taken a different tack in response to Brody and Hermann. Jason Kottke arrives at this conclusion:

“People on smartphones are not anti-social. They’re super-social. Phones allow people to be with the people they love the most all the time, which is the way humans probably used to be, until technology allowed for greater freedom of movement around the globe. People spending time on their phones in the presence of others aren’t necessarily rude because rudeness is a social contract about appropriate behavior and, as Hermann points out, social norms can vary widely between age groups. Playing Minecraft all day isn’t necessarily a waste of time. The real world and the virtual world each have their own strengths and weaknesses, so it’s wise to spend time in both.”

Of course. But how do we allocate the time we spend in each–that’s the question. Also, I’m not quite sure what to make of his claim about rudeness and the social contract, except that it seems to suggest that behavior isn’t rude if you decide you don’t like the terms of the social contract that renders it so. Sorry, Grandma, I don’t recognize the social contract by which I’m supposed to acknowledge your presence and render to you a modicum of my attention and affection.

Yes, digital devices have given us the power to decide who is worthy of our attention minute by minute. Advocates of this constant connectivity–many of them, like Facebook, acting out of obvious self-interest–want us to believe this is an unmitigated good and that we should exercise this power with impunity. But–how to say this without sounding alarmist–encouraging people to habitually render other human beings unworthy of their attention seems like a poor way to build a just and equitable society.