Attentional Austerity

A couple of weeks ago I read Cheri Lucas’ “Instapaper and My Ideal Intellectual State” with a certain empathetic resignation. Lucas was finding that a new work situation made it increasingly difficult to keep up with the daily torrent of online information coming through all the usual channels — Twitter, RSS feeds, etc. She looked to Instapaper as a way of keeping up a semblance of keeping up, but to no avail. Instapaper quickly became a repository of what might have been read in some ideal world. A site of aspirational knowledge, a kind of Pinterest for the mind (without all the graphical flair).

I get it. This is where I now live too. I haven’t posted in over two weeks. For those of you who have recently started following The Frailest Thing thanks to the whole toilet paper thing — well, first of all, welcome and thank you. Secondly, I have ordinarily kept up a better clip. Right now it seems to me that the best I’ll be able to do is something like a post each week. Perhaps as things get a bit more routinized, I’ll be able to pick up the pace.

Or maybe not. I’m beginning to think that perhaps a post per week is a pretty good pace to aim for. I’ve been impressed again by the preciousness of attention. Because I have less time to devote to the Sisyphean task of keeping up with the daily digital deluge, I’m becoming increasingly draconian in deciding what deserves my attention. I’m ruthlessly ignoring whole swaths of Twitter-time and savagely gutting my RSS feed.

(As Nick Carr pointed out some time ago, the problem isn’t filter failure. My filters work wonderfully. Every day they collect swaths of interesting, stimulating, entertaining material. It’s just too much.)

I told some students recently that the most important skill they may ever learn is that of wisely deploying their attention. For the most part we seem to do so carelessly, hearkening to every call upon our attention with Pavlovian alacrity. It’s a ruinous habit; better to be misers with our attention.

In other words, we need to impose a regime of attentional austerity to counter continuous partial attention, the default mode arising out of our media environment.

It’s sometimes assumed that in the world the Internet created, those who excel at multi-tasking and endlessly partitioning their attention will have the advantage. I’m not so sure. It rather seems like we are turning our digital devices into horcruxes of the mind. Instead, I’m betting the advantage will go to the person who is able to cancel out the noise and focus with ferocity.

What Would Thoreau Do?

Yesterday, July 12th, was Henry David Thoreau’s 195th birthday, or 195th anniversary of his birth, or however that is best put when the person in question is no longer alive. In any case, Thoreau is best remembered for two things. The first is his experiment in living simply and in greater communion with nature in a cabin on the outskirts of Concord, Massachusetts. The cabin was situated on Walden Pond and Thoreau’s reflections on his “experiment” were later published as Walden.

Thoreau is also remembered for making a better pencil. It seems that Thoreau is actually not generally remembered for this, but it is nonetheless true. His family owned a pencil factory at which Thoreau worked on and off throughout his life. Thanks to his study of German pencil making techniques, Thoreau helped design the best American pencil of its day. Apparently, in the early 19th century, there remained significant technical challenges to the making of a durable pencil, mostly having to do with the sturdiness of the graphite shaft and fitting it into a casing. Among Thoreau’s many accomplishments was the development of a process of manufacturing the pencil that solved these engineering problems.

I thought of Thoreau yesterday not only because it was the anniversary of his birth, but also because I had come across an article titled, “Tweets From the Trail: Technology Can Enhance Your Wilderness Experiences” (h/t to Nathan Jurgenson). The author, novelist Walter Kirn of Montana, had the temerity to suggest that maybe there is something to be gained by bringing your technology out into nature with you, rather than venturing into nature in order to escape technology. As you might imagine, many of Kirn’s Montana nature-enthusiast friends were less than pleased.

Now, we should note that these distinctions we make — nature/technology, for example — are a bit complicated. To illustrate, here is the opening of a recent, relevant post by Nick Carr:

A couple of cavemen are walking through the woods. One sighs happily and says to the other, “I’m telling you, there’s nothing like being out in nature.” The other pauses and says, “What’s nature?”

It’s 1972. A pair of lovers go camping in a wilderness area in a national park. They’re sitting by a campfire, taking in the evening breezes. “Honey,” says the woman, “I have to confess I really love being offline.” The guy looks at her and says, “What’s offline?”

You see the point. Our idea of “nature” owes something to the advance of technology just as our idea of “offline” necessitates the emergence of online. But back to Kirn’s article. He discovered that his writing flourished when he set up a work station on an old wooden telephone wire spool under the big, blue Montana sky with badgers and gophers scampering all about. Subsequently he made a habit of screening movies on his iPad in “natural” settings such as the seaside or the shores of a river. Finally, he confesses to the manner in which being out in the wilderness inspires fits of creativity that he feels compelled to tweet and post. And here is his eloquent conclusion:

“To sever our experience of wilderness from our use of technology now seems to me an unnatural act, shortsighted and unimaginative. No one appreciates a ringing cell phone while they float on a muddy river through western badlands or stand in the saddle between two massive mountain ranges, but short of such rude interruptions of heavenly moments, technology has a mysterious way, at times, of providing the perfect contrast, the happy counterpoint to scenes and experiences and settings that are easy to take for granted or grow numb to. Along with harmony, contrast is one of the two great rules of art. It wakes the senses, jars the tired mind, breaks up routines that threaten to grow mechanical. If you don’t believe me, try it. Travel to that secluded spot you keep returning to, the one where you go to leave the world behind, and turn on some music, play a movie, capture a passing thought and send it onward, out of the forest, out into society, and then wait, while the wind blows and the treetops sway and the clouds pile up a mile above your head, for someone, some faraway stranger, to reply. Even when we’re alone, we’re not alone, this proves, and in the deepest heart of every wilderness lurks a miracle, often, the human mind.”

I can’t help but wonder, what would Thoreau think? I can’t pretend to know Thoreau well enough to answer that question. I suspect that present-day technophiles would suggest that Thoreau ought to approve; after all, he took his pencil to Walden, and that was a technology. Well, yes, but he didn’t string a telegraph wire to the cabin.

I wouldn’t discount the dynamic Kirn describes, particularly since it is measured (let’s do without the ringing cell phone) and it still recognizes the contrast. The juxtaposition of unlike things can be creatively stimulating, and if that is what you are after, then Kirn’s formula may indeed yield something for you.

But what if your aims are different? What if you’re seeking only to listen and not to speak? What if your goal is not to be inspired toward yet another act of self-expression? We may carry technology with us into nature, in fact, we may carry it within us. But this does not mean that we ought always to answer to its prerogatives. Nor does it mean that we should always assume the posture toward reality that technology enables and the frame of mind that it encourages. And, of course, different technologies enable and encourage differently. It is the difference between the pencil and the telegraph and the smartphone.

I am not against human civilization (which is a silly thing to have to say), and the human mind, as Kirn puts it, is a “miracle” indeed. But the miracle of the human mind lies not only in its ability to create and to build and to express itself and impose its own symbolic order on the world. The miracle lies also in its ability to listen and to receive and to contemplate and to be itself re-ordered; to be taken in by the world as well as to take the world in. Perceiving the value of such a stance draws us into an awareness of the various ethical or philosophical frames that inform our evaluations. I cannot sort all of those out, but I can acknowledge that for a wide array of people the point would not be to speak, but to be spoken to. Or perhaps, even to find that we are not addressed at all.

An even greater array of people would likely agree that our posture toward this world ought to be more than merely instrumental. Human civilization must advance, but it does so best when it abandons Promethean aspirations and acknowledges its finitude along with its power.

I suppose all of this is a way of saying that beauty resides not only in what we make and say, but also in what we find and encounter. But shouldn’t this found beauty be shared? Maybe. But perhaps not before it has done its work on us. Perhaps not before we have allowed it to speak to us and to transform us. The space in which beauty can do its work is precious, and it would seem that the logic of our technologies would have us collapse that space in the service of sharing, commodification, self-expression, capturing, publicizing, and the like.

I don’t want to speak for Thoreau, but I would venture to guess that he might have us preserve that precious space where beauty has its way.

“Is the Internet Driving Us Mad?”: Attempting a Sane Response

It’s Newsweek’s turn. Perhaps feeling a bit of title envy, the magazine has shamelessly ripped off The Atlantic’s formula with a new cover story alternatively titled “Is the Internet Driving Us Mad?” or “Is the Internet Making Us Crazy?”

You can probably guess what follows already. Start with a rather dramatic and telltale anecdote — in this case Jason Russell’s “reactive psychosis” in the wake of his “Kony 2012” fame — and proceed to cite a number of studies and further anecdotes painting a worrisome picture that appears somehow correlated to heavy Internet use. You’re also likely to guess that Sherry Turkle is featured prominently. For the record, I intend no slight by that last observation; I rather appreciate Turkle’s work.

Before the day is out, this article will be “Liked” and tweeted thousands of times, I’m sure. It will also be torn apart relentlessly and ridiculed. Unfortunately, the title will be responsible for both responses. A more measured title would likely have elicited a more sympathetic reading, but also less traffic.

There is much in the article that strikes an alarmist note and many of the anecdotes, while no doubt instances of real human suffering, seem to describe behaviors that are not characteristic of most people using the Internet. I’m also not one to go in for explanations that localize this or that behavior in this or that region of the brain. I’m not qualified to evaluate such conclusions, but they strike me as a tad reductionistic. That said, this does ring true:

“We may appear to be choosing to use this technology, but in fact we are being dragged to it by the potential of short-term rewards. Every ping could be social, sexual, or professional opportunity, and we get a mini-reward, a squirt of dopamine, for answering the bell. ‘These rewards serve as jolts of energy that recharge the compulsion engine, much like the frisson a gambler receives as a new card hits the table,’ MIT media scholar Judith Donath recently told Scientific American. ‘Cumulatively, the effect is potent and hard to resist.’”

Of course, it all hinges on correlation and causation. The article itself suggests as much right at the end. “So what do we do about it?” the author, Tony Dokoupil, asks. “Some would say nothing,” he continues, “since even the best research is tangled in the timeless conundrum of what comes first.” He adds:

“Does the medium break normal people with its unrelenting presence, endless distractions, and threat of public ridicule for missteps? Or does it attract broken souls?

“But in a way, it doesn’t matter whether our digital intensity is causing mental illness, or simply encouraging it along, as long as people are suffering.”

Dokoupil concludes on a note of fateful optimism: “The Internet is still ours to shape. Our minds are in the balance.” In truth, whatever we make of the neuroscience involved, there is something to be said for the acknowledgement that we have choices to make about our relationship to the Internet.

After we cut through the hype and debates about causation, it would seem that there is a rather commonsensical way of approaching all of this.

If the article resonates with you because it seems to describe your experience, then it is probably worth taking it as a warning of sorts and an invitation to make some changes. So, for example, if you are sitting down to play video games for multiple hours without interruption and thus putting yourself in danger of developing a fatal blood clot … you may want to rethink your gaming habits.

If you are immediately reaching for your Internet device of choice before you have so much as opened your eyes in the morning, you may want to consider allowing yourself a little time before diving into the stream. Pour some coffee, have a nice breakfast. Give yourself a little time with your thoughts, or your loved ones. If you find that your thoughts are all incessantly begging you to get online and your loved ones’ faces are turning disconcertingly into Facebook icons, then perhaps you have all the more reason to restore a little balance to your Internet use.

If you take an honest inventory of your emotions and feelings, and you find that you’re anchoring a great deal of your self-worth on the replies you get to your status updates, tweets, or blog posts, then perhaps, just maybe, you might want to consider doing a little soul searching and re-prioritizing.

If “‘the computer is like electronic cocaine,’ fueling cycles of mania followed by depressive stretches” somehow resonates with your own experience, then it may be time to talk to your friends about getting a better grip on the way you’re ordering your life.

None of this is rocket science (or neuroscience). I’d like to think that most of us have a pretty good sense of when certain activities are tending toward the counterproductive and unhealthy end of the spectrum. There’s no need to get all apocalyptic about it; we can leave that to the media. Instead, take inventory of what is truly important — you know, the same old sappy but precious and indispensable stuff — and then ask yourself how your Internet use relates to these things, and take action accordingly. If you find that you are unable to implement the action you know you need to take, then certainly get some help.

In Search of the Real

While advancing age is no guarantee of advancing self-knowledge, I have found that growing up a bit can be enlightening. Looking back, it now seems pretty clear to me that I have always been temperamentally Arcadian – and I’m grateful to W. H. Auden for helping me come to this self-diagnosis. In the late 1940s, Auden wrote an essay distinguishing the Arcadian and Utopian personalities. The former looks instinctively to the past for truth, goodness, and beauty; the latter searches for those same things in the unrealized future.

Along with Auden, but in much less distinguished fashion, I am an Arcadian; there is little use denying it. When I was on the cusp of adolescence, I distinctly recall lamenting with my cousin the passing of what we called the “good old days.” Believe it; it is sadly true. The “good old days” incidentally were the summer vacations we enjoyed not more than two or three years earlier. If I am not careful, I risk writing the grocery list elegiacally. I believe, in fact, that my first word was a sigh. This last is not true, alas, but it would not have been out of character.

So you can see that this presents a problem of sorts for someone who writes about technology. The temptation to criticize is ever present and often difficult to resist. With so many Utopians about, one can hardly be blamed. In truth, though, there are plenty of Arcadians about as well. The Arcadian is the critic of technology, the one whose first instinct is to mourn what is lost rather than celebrate what is gained. It is with this crowd that I instinctively run. They are my kindred spirits.

But Auden knew enough to turn his critical powers upon his own Arcadianism. As Alan Jacobs put it in his Introduction to Auden’s “The Age of Anxiety,” “Arcadianism may have contributed much to Auden’s mirror, but he knew that it had its own way of warping reflections.” And so do I, at least in my better moments.

I acknowledge my Arcadianism by way of self-disclosure leading into a discussion of Nathan Jurgenson’s provocative essay in The New Inquiry, “The IRL Fetish.” IRL here stands for “in real life,” offline experience as opposed to the online or virtual, and Jurgenson takes aim at those who fetishize offline experience. I can’t be certain if he had Marx, Freud, or Lacan in view when he chose to describe the obsession with offline experience as a fetish. I suspect it was simply a rather suggestive term that connoted something of the irrational and esoteric. But it does seem clear that he views this obsession/fetish as woefully misguided at best and this because it is built on an erroneous conceptualization of the relationship between the online and the offline.

The first part of Jurgenson’s piece describes the state of affairs that has given rise to the IRL Fetish. It is an incisive diagnosis written with verve. He captures the degree to which the digital has penetrated our experience with clarity and vigor. Here is a sampling:

“Hanging out with friends and family increasingly means also hanging out with their technology. While eating, defecating, or resting in our beds, we are rubbing on our glowing rectangles, seemingly lost within the infostream.” [There is more than one potentially Freudian theme running through this piece.]

“The power of ‘social’ is not just a matter of the time we’re spending checking apps, nor is it the data that for-profit media companies are gathering; it’s also that the logic of the sites has burrowed far into our consciousness.”

“Twitter lips and Instagram eyes: Social media is part of ourselves; the Facebook source code becomes our own code.”

True. True. And, true.

From here Jurgenson sums up the “predictable” response from critics: “the masses have traded real connection for the virtual,” “human friends, for Facebook friends.” Laments are sounded for “the loss of a sense of disconnection,” “boredom,” and “sensory peace.” The equally predictable solution, then, is to log off and re-engage the “real” world.

Now it does not seem to me that Jurgenson thinks this is necessarily bad counsel as far as it goes. He acknowledges that “many of us, indeed, have been quite happy to occasionally log-off …” The real problem according to Jurgenson, the thing that is “new” in the chorus of critics, is arrogant self-righteousness. Those are my words, but I think they do justice to Jurgenson’s evaluation. “Immense self-satisfaction,” “patting ourselves on the back,” boasting, “self-congratulatory consensus,” constructing “their own personal time-outs as more special” – these are his words.

This is a point I think some of Jurgenson’s critics have overlooked. At this juncture, his complaint is targeted rather precisely, at least as I read it, at the self-righteousness implicit in certain valorizations of the offline. Now, of course, deciding who is in fact guilty of self-righteous arrogance may involve making judgment calls that more often than not necessitate access to a person’s opaque intentions, and there is, as of yet, no app for that. (Please don’t tell me if there is.) But, insofar as we are able to reasonably identify the attitudes Jurgenson takes to task, then there is nothing particularly controversial about calling them out.

In the last third of the essay, Jurgenson pivots on the following question: “How have we come to make the error of collectively mourning the loss of that which is proliferating?” Response: “In great part, the reason is that we have been taught to mistakenly view online as meaning not offline.”

At this point, I do want to register a few reservations. Let me begin with the question above and the claim that “offline experience” is proliferating. What I suspect Jurgenson means here is that awareness of offline experience and a certain posture toward offline experience is proliferating. And this does seem to be the case. Semantically, it would have to be. The notion of the offline as “real” depends on the notion of the online; it would not have emerged apart from the advent of the online. The online and the offline are mutually constitutive as concepts; as one advances, the other follows.

It remains the case, however, that “offline,” only recently constituted as a concept, describes an experience that paradoxically recedes as it comes into view. Consequently, Jurgenson’s later assertion – “There was and is no offline … it has always been a phantom.” – is only partially true. In the sense that there was no concept of the offline apart from the online and that the online, once it appears, always penetrates the offline, then yes, it is true enough. However, this does not negate the fact that while there was no concept of the offline prior to the appearance of the online, there did exist a form of life that we can retrospectively label as offline. There was, therefore, an offline experience (even if it wasn’t known as such) realized in the past against which present online/offline experience can be compared.

What the comparison reveals is that a form of consciousness, a mode of human experience is being lost. It is not unreasonable to mourn its passing, and perhaps even to resist it. It seems to me that Jurgenson would not necessarily be opposed to this sort of rear-guard action if it were carried out without an attendant self-righteousness or aura of smug superiority. But he does appear to be claiming that there is no need for such rear-guard actions because, in fact, offline experience is as prominent and vital as it ever was. Here is a representative passage:

“Nothing has contributed more to our collective appreciation for being logged off and technologically disconnected than the very technologies of connection. The ease of digital distraction has made us appreciate solitude with a new intensity. We savor being face-to-face with a small group of friends or family in one place and one time far more thanks to the digital sociality that so fluidly rearranges the rules of time and space. In short, we’ve never cherished being alone, valued introspection, and treasured information disconnection more than we do now.”

It is one thing, however, to value a kind of experience, and quite another to actually experience it. It seems to me, in fact, that one portion of Jurgenson’s argument may undercut the other. Here are his two central claims, as I understand them:

1. Offline experience is proliferating; we enjoy it more than ever before.

2. Online experience permeates offline experience; the distinction is untenable.

But if the online now permeates the offline – and I think Jurgenson is right about this – then it cannot also be the case that offline experience is proliferating. The confusion lies in failing to distinguish between “offline” as a concept that emerges only after the online appears, and “offline” as a mode of experience unrecognized as such that predates the online. Let us call the former the theoretical offline and the latter the absolute offline.

Given the validity of claim 2 above, claim 1 only holds for the theoretical offline, not the absolute offline. And it is the passing of the absolute offline that critics mourn. The theoretical offline makes for a poor substitute.

The real strength of Jurgenson’s piece lies in his description of the immense interpenetration of the digital and material (another binary that does not quite hold up, actually). According to Jurgenson, “Smartphones and their symbiotic social media give us a surfeit of options to tell the truth about who we are and what we are doing, and an audience for it all, reshaping norms around mass exhibitionism and voyeurism.” To put it this way is to mark the emergence of a ubiquitous, unavoidable self-consciousness.

I would not say as Jurgenson does at one point, “Facebook is real life.” The point, of course, is that every aspect of life is real. There is no non-being in being. Perhaps it is better to speak of the real not as the opposite of the virtual, but as that which is beyond our manipulation, what cannot be otherwise. In this sense, the pervasive self-consciousness that emerges alongside the socially keyed online is the real. It is like an incontrovertible law that cannot be broken. It is a law haunted by the loss its appearance announces, and it has no power to remedy that loss. It is a law without a gospel.

Once self-consciousness takes its place as the incontrovertibly real, it paradoxically generates a search for something other than itself, something more real. This is perhaps the source of what Jurgenson has called the IRL fetish, and in this sense it has something in common with the Marxian and Freudian fetish: it does not know what it seeks. The disconnection, the unplugging, the logging off are pursued as if they were the sought after object. But they are not. The true object of desire is a state of pre-digital innocence that, like all states of innocence, once lost can never be recovered.

Perhaps I spoke better than I knew when I was a child, of those pleasant summers. After all, I am of that generation for which the passing from childhood into adulthood roughly coincided with the passage into the Digital Age. There is a metaphor in that observation. To pass from childhood into adulthood is to come into self-awareness; it is to leave naivety and innocence behind. The passage into the Digital Age is also a coming into a pervasive form of self-awareness that now precludes the possibility of naïve experience.

All in all, it would seem that I have stumbled into my Arcadianism yet again.

Arts of Memory 2.0: The Sherlock Update

I’m a little behind the times, I know, but I just finished watching the second episode in the second season of Sherlock, “The Hounds of Baskerville.” If you’ve watched the episode, you’ll remember a scene in which Sherlock asks to be left alone so that he may enter his “Mind Palace.” (If you’ve not been watching Sherlock, you really ought to.) The “Mind Palace” in question turns out to be what has traditionally been called a memory palace or memory theater. It is the mental construct at the heart of the ancient ars memoriae, or arts of memory.

Longtime (and long-suffering) readers will remember a handful of posts discussing this ancient art and also likening Facebook to something like a materialized memory palace (here, here, and here). To sum up:

“… the basic idea is that one constructs an imagined space in the mind (similar to the work of the Architect in the film Inception, only you’re awake) and then populates the space with images that stand in for certain ideas, people, words, or whatever else you want to remember.  The theory is that we remember images and places better than we do abstract ideas or concepts.”

And here is the story of origins:

“… the founding myth of what Frances Yates has called the ‘art of memory’ as recounted by Cicero in his De oratore. According to the story, the poet Simonides of Ceos was contracted by Scopas, a Thessalian nobleman, to compose a poem in his honor. To the nobleman’s chagrin, Simonides devoted half of his oration to the praise of the gods Castor and Pollux. Feeling himself cheated out of half of the honor, Scopas brusquely paid Simonides only half the agreed upon fee and told him to seek the rest from the twin gods. Not long afterward that same evening, Simonides was summoned from the banqueting table by news that two young men were calling for him at the door. Simonides sought the two callers, but found no one. While he was out of the house, however, the roof caved in killing all of those gathered around the table including Scopas. As Yates puts it, ‘The invisible callers, Castor and Pollux, had handsomely paid for their share in the panegyric by drawing Simonides away from the banquet just before the crash.’

The bodies of the victims were so disfigured by the manner of death that they were rendered unidentifiable even by family and friends.  Simonides, however, found that he was able to recall where each person was seated around the table and in this way he identified each body.  This led Simonides to the realization that place and image were the keys to memory, and in this case, also a means of preserving identity through the calamity of death.”

The most interesting thing about the manner in which Sherlock presents the memory palace is that it has been conceived on the model of something like a touchscreen interface. You can watch the clip below to see what I mean. In explaining what Holmes is doing to a third party, Watson describes something like a traditional memory palace (not in the video clip). But what we see him doing is quite different. Rather than mentally walking through an architectural space, Holmes swipes at images (visualized for the audience) organized into something like an alphabetic multi-media database.

Surprisingly, though, this stripped down structure does have a precedent in the medieval practice of the arts of memory. Ivan Illich describes what 12th century scholar Hugh of St. Victor required of his pupils:

“… Hugh asks his pupils to acquire an imaginary inner space … and tells them how to proceed in its construction. He asks the pupil to imagine a sequence of whole numbers, to step on the originating point of their run and let the row reach the horizon. Once these highways are well impressed upon the fantasy of the child, the exercise consists in mentally ‘visiting’ these numbers at random. In his imagination the student is to dart back and forth to each of the spots he has marked by a roman numeral.”

This flat and bare schematic was the foundation for more elaborate, three-dimensional memory palaces to be built later.

The update to the memory theater is certainly not out of keeping with the spirit of the tradition, which always looked to familiar spaces as a model. What more familiar space can we conceive of these days than the architecture of our databases? Thought experiment: Visualize your Facebook page. Can you do it? Can you scroll through it? Can you mentally click and visualize new pages? Can you scroll through your friends? Might you even be able to mentally scroll through your pictures? Well, there you have it; you have a memory palace and you didn’t even know it.