Freedom From Authenticity

Last night I listened to a recording of David Foster Wallace’s Kenyon College commencement address. I know, I know. Wallace is one of those people around whom personality cults form, and it’s hard to take those people seriously. If it helps, there’s this one guy who is really ticked at Wallace for what must have been some horrible thing Wallace did to him, like having had the temerity to be alive at the same time as he was. I also know that Wallace could at times be a rather nasty human being, or so some have reported. That said, the man said some really important and true things which need to be heard again and again.

These things, as it turns out, or as I hear them now, in this particular frame of mind, have everything to do with authenticity. This is not because Wallace is talking directly about authenticity and its discontents, but because he understands, intimately it seems, what it feels like to be the sort of person for whom authenticity is likely to become a problem, and without intending to propose a solution to this problem of authenticity, he does.

Authenticity becomes a problem the second it becomes a question. As William Deresiewicz put it, “the search for authenticity is futile. If you have to look for it, you’re not going to find it.” Authenticity, like happiness and love and probably everything that is truly significant in life, partakes of this dynamic whereby the sought-after thing can be attained only by not consciously seeking it. Think of it, and now it is a problem; seek it, and you will not find it; focus on it, and it becomes elusive.

So authenticity is the sort of thing that vanishes the moment you become conscious of it. It’s what you have only when you’re not thinking of it. And what you’re not thinking of when you have it is yourself. Authenticity is a crisis of self invoked by a hyper-self-awareness that makes it impossible not to think of oneself. And I don’t think this is a matter of being a horribly selfish or arrogant person. No, in fact, I think this kind of hyper-self-awareness is more often than not burdened with insecurity and fear and anxiety. It’s a voice most people want to shut up; hence the self-defeating quest for authenticity.

What does Wallace have to say about any of this? Well, first, there’s this: “Here is just one example of the total wrongness of something I tend to be automatically sure of: everything in my own immediate experience supports my deep belief that I am the absolute centre of the universe; the realest, most vivid and important person in existence. We rarely think about this sort of natural, basic self-centredness because it’s so socially repulsive.”

This is what he calls our default setting. Our default setting is to think about the world as if we were its center, to process every situation through the grid of our own experience, to assume “that my immediate needs and feelings are what should determine the world’s priorities.” This is our default setting in part because from the perspective of our own experience, the only perspective to which we have immediate access, we are literally the center of the universe.

Wallace also issued this warning: “Worship power, you will end up feeling weak and afraid, and you will need ever more power over others to numb you to your own fear. Worship your intellect, being seen as smart, you will end up feeling stupid, a fraud, always on the verge of being found out.”

So then, worship authenticity and … 

But, Wallace also tells us, it doesn’t have to be this way. The point of a liberal arts education — this is a commencement address after all — is to teach us how to exercise choice over what we think and what we pay attention to. And Wallace urges us to pay attention to something other than the monologue inside our head. Getting out of our own heads, what Wallace called our “skull-sized kingdoms” — this is the only answer to the question of authenticity.

And so this makes me think again of the possibility that certain kinds of practices can help us do just this. They can so focus our attention on themselves that we stop, for a time, paying attention to ourselves. Serendipitously, I stumbled on this video about glass-blowing in which a glass-blower, talking about his craft, says this: “When you’re blowing glass, there really isn’t time to have your mind elsewhere – you have to be 100% engaged.” There it is.

Now, I know, we can’t all run off and take up glass-blowing. That would be silly and potentially dangerous. The point is that this practice has the magical side effect of taking us out of our own heads by acutely focusing our attention. The leap I want to make now is to say that this skill is transferable. Learn the mental discipline of so focusing your attention in one particular context and you will be better able to deploy it in other circumstances.

It’s like the ascetic practice of fasting. The point is not that food is bad or that denying yourself food is somehow virtuous or meritorious. It’s about training the will and learning how to temper desire so as to direct and deploy it toward more noble ends. You train your will with food so that you can exercise it meaningfully in other, more serious contexts.

In any case, Wallace is right. It’s hard work not yielding to our default self-centeredness. “The really important kind of freedom,” Wallace explained, “involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day.” I know I’ve cited that line before and not that long ago, but the point it makes is crucial.

Freedom is not about being able to do whatever we want, when we want. It has nothing to do with listening to our heart or following our dreams or whatever else we put on greeting cards and bumper stickers. Real freedom comes from learning to get out of our “skull-sized kingdoms” long enough to pay attention to the human being next to us so that we might treat them with decency and kindness and respect. Then perhaps we’ll have our authenticity, but we’ll have it because we’ve stopped caring about it.

_______________________________________

A transcript of Wallace’s address is available here.

The Road Makes All the Difference

“There is a subtle but fundamental difference between finding direction … and ambiently knowing direction.” This was Evgeny Morozov’s recent encapsulation of Tristan Gooley’s The Natural Navigator. According to Gooley, natural navigation, that is navigation by the signs nature yields, is a dying art, and its passing constitutes a genuine loss.

Even if you’ve not read The Natural Navigator, or Morozov’s review, your thoughts have probably already wandered toward the GPS devices we rely on to help us find direction. It’s a commonplace to note how GPS has changed our relationship to travel and place. In “GPS and the End of the Road,” Ari Schulman writes of the GPS user: “he checks first with the device to find out where he is, and only second with the place in front of him to find out what here is.” One may even be tempted to conclude that GPS has done to our awareness of place what cell phones did to our recall of phone numbers.

(Of course, this is not a brief against GPS, much less against maps and compasses and street signs. These have their place, of course. And there are a number of other qualifications which could be offered, but I’ll trust to your generosity as readers and assume that these are understood.)

From the perspective of natural navigation, however, GPS is just one of many technologies designed to help us find our way that simultaneously undermine the possibility that we might also come to know our way or, we might add, our place. Navigational devices, after all, enter into our phenomenological experience of place and do so in a way that is not without consequence.

When I plug an address into a GPS device, I expect one thing: an efficient and unambiguous set of directions to get me from where I am to where I want to go. My attention is distributed between the device and the features of the place. The journey is eclipsed; the places in between become merely space traversed. In this respect, GPS is a sign of an age that struggles to consider anything but the accomplishment of ends, with little regard for the means by which those ends are accomplished. We seem to forget, in other words, that there are goods attending the particular path by which a goal is pursued that are independent of the accomplishment of that goal.

It’s a frame of mind demanded, or to put it in less tech-deterministic terms, encouraged by myths of technological progress. If I am to enthusiastically embrace every new technology, I must first come to believe that means or media matter little so long as the end is accomplished. A letter is a call is an email is a text message. Consider this late-nineteenth-century French cartoon imagining the year 2000 (via Explore):

From our vantage point, of course, this seems silly … but only in execution, not intent. Our more sophisticated dream replaces the odd contraption with the Google chip. Both err in believing that education is reducible to the transfer of data. The means are inconsequential and interchangeable, the end is all that matters, and it is a vastly diminished end at that.

Natural navigation and GPS may both get us where we want to go, but how they accomplish this end makes all the difference. With the former, we come to know the place as place and are drawn into more intimate relationship with it. We become more attentive to the particulars of our environment. We might even find that a certain affection insinuates itself into our experience of the place. Affective aspects of our being are awakened in response. When a new technology promises to deliver greater efficiency by eliminating some heretofore essential element of human involvement, I would bet that it also effects an analogous alienation. In such cases, then, we have been carelessly trading away rich and rewarding aspects of human experience.

Sometimes it is the road itself that makes all the difference.

The Wisdom of Gandalf for the Information Age

Tolkien is in the air again. In December of this year, the eagerly awaited first part of Peter Jackson’s The Hobbit will be released. The trailers for the film have kindled a great deal of excitement, and the film promises to be a delight for fans of one of the most beloved stories ever written. By some estimates it is the fourth best-selling book ever. Ahead of it are A Tale of Two Cities in the top spot, The Little Prince, and Tolkien’s own The Lord of the Rings trilogy.

I happily count myself among the Tolkien devotees, and I’ve declared this my very own Year of Tolkien. Basically this means that when I’m not reading something I must read, I’ll be reading through Tolkien’s works and a book or two about Tolkien.

Reading The Fellowship of the Ring several days ago, I was captivated once again by an exchange between the wizard Gandalf and the hobbit Frodo. If you’re familiar with the story, then the following needs no introduction. If you are not, you really ought to be, but here’s what you need to know to make sense of this passage. In The Hobbit, the main character, Bilbo, passes on an opportunity to kill a pitiable but sinister creature known as Gollum. The creature had been corrupted by the Ring of Power, which had been forged by an evil being known as Sauron. It would take too long to explain more, but in the following exchange Gandalf is retelling those events to Bilbo’s nephew Frodo.

Frodo has just proclaimed, “What a pity that Bilbo did not stab that vile creature, when he had a chance!”

Here is the ensuing exchange:

‘Pity? It was Pity that stayed his hand. Pity, and Mercy: not to strike without need …

‘I am sorry,’ said Frodo. ‘But I am frightened; and I do not feel any pity for Gollum.’

‘You have not seen him,’ Gandalf broke in.

‘No, and I don’t want to,’ said Frodo. ‘I can’t understand you. Do you mean to say that you, and the Elves, have let him live on after all those horrible deeds? Now at any rate he is as bad as an Orc, and just an enemy. He deserves death.’

‘Deserves it! I daresay he does. Many that live deserve death. And some that die deserve life. Can you give it to them? Then do not be too eager to deal out death in judgement. For even the very wise cannot see all ends. I have not much hope that Gollum can be cured before he dies, but there is a chance of it. And he is bound up with the fate of the Ring. My heart tells me that he has some part to play yet, for good or ill, before the end; and when that comes, the pity of Bilbo may rule the fate of many — yours not least …’

This is, in my estimation, among the most evocative portions of a body of writing replete with wise and often haunting passages. Among the many things that could be said about this exchange and the many applications one could draw, I will note just one.

Gandalf is renowned for his wisdom; he is, after all, a great wizard. But in what does his wisdom lie? In this particular instance, it lies not in what he knows but in his awareness of the limits of his knowledge. His wisdom issues from an awareness of his ignorance. “Even the very wise cannot see all ends,” he confesses. He does not have much hope for Gollum, but he simply does not know, and he will not take, nor commend, an action that would foreclose the possibility of Gollum’s redemption. Moreover, he does not know what part Gollum may play in the unfolding story. For these reasons, which amount to an acknowledgment of his ignorance, Gandalf’s actions and judgments are tempered and measured.

These days we are enthralled by the information at our command. We are awestruck by what we know. The data available to us constitutes an embarrassment of riches. And yet one thing we lack: an awareness of how much we nevertheless do not know. We have forgotten our ignorance and we are judging and acting without the compassion or wisdom of Gandalf because we lack his acute awareness of the limitations of his knowledge.

We do not have to search very far at all to find those who will rush to judgment and act out of a profound arrogance. I will let you supply the examples. They are too many to list in any case. More than likely we need not look further than ourselves.

I have more than once cited T. S. Eliot’s lament from “Choruses from ‘The Rock’”:

“Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?”

Our problem is that we tend to think of the passage from information to knowledge and on to wisdom as a series of aggregations. We accumulate enough information and we pass to knowledge and we accumulate enough knowledge and we pass to wisdom. The truth is that we pass to wisdom not by the aggregation of information or knowledge, both of which are available as never before; we pass to wisdom by remembering what we do not know. And this, in an age of information, seems to be the one thing we cannot keep in mind.

The Internet & the Youth of Tomorrow: Highlights from the Pew Survey

The Pew Internet & American Life Project conducted “an opt-in, online survey of a diverse but non-random sample of 1,021 technology stakeholders and critics … between August 28 and October 31, 2011.” The survey presented two scenarios for the youth of 2020, asked participants to choose which they thought more likely, and then invited elaboration.

Here are the two scenarios and the responses they garnered:

Some 55% agreed with the statement:

In 2020 the brains of multitasking teens and young adults are “wired” differently from those over age 35 and overall it yields helpful results. They do not suffer notable cognitive shortcomings as they multitask and cycle quickly through personal- and work- related tasks. Rather, they are learning more and they are more adept at finding answers to deep questions, in part because they can search effectively and access collective intelligence via the internet. In sum, the changes in learning behavior and cognition among the young generally produce positive outcomes.

Some 42% agreed with the opposite statement, which posited:

In 2020, the brains of multitasking teens and young adults are “wired” differently from those over age 35 and overall it yields baleful results. They do not retain information; they spend most of their energy sharing short social messages, being entertained, and being distracted away from deep engagement with people and knowledge. They lack deep-thinking capabilities; they lack face-to-face social skills; they depend in unhealthy ways on the internet and mobile devices to function. In sum, the changes in behavior and cognition among the young are generally negative outcomes.

However the report also noted the following:

While 55% agreed with the statement that the future for the hyperconnected will generally be positive, many who chose that view noted that it is more their hope than their best guess, and a number of people said the true outcome will be a combination of both scenarios.

In all honesty, I am somewhat surprised the results split so evenly. I would have expected the more positive scenario to perform better than it did. The most interesting aspect of the report, however, is of course the excerpts presented from the respondents’ elaborations. Here are a few, with some interspersed commentary.

A number of respondents wrote about the skills that will be valued in the emerging information ecosystem:

  • There are concerns about new social divides. “I suspect we’re going to see an increased class division around labor and skills and attention,” said media scholar danah boyd.
  • “The essential skills will be those of rapidly searching, browsing, assessing quality, and synthesizing the vast quantities of information,” wrote Jonathan Grudin, principal researcher at Microsoft. “In contrast, the ability to read one thing and think hard about it for hours will not be of no consequence, but it will be of far less consequence for most people.”

Among the more interesting excerpts was this from Amber Case, cyberanthropologist and CEO of Geoloqi:

  • “The human brain is wired to adapt to what the environment around it requires for survival. Today and in the future it will not be as important to internalize information but to elastically be able to take multiple sources of information in, synthesize them, and make rapid decisions … Memories are becoming hyperlinks to information triggered by keywords and URLs. We are becoming ‘persistent paleontologists’ of our own external memories, as our brains are storing the keywords to get back to those memories and not the full memories themselves.”

I’m still not convinced at all by the argument against internalization. (You can read why here, here, and here.) But she is certainly correct about our “becoming ‘persistent paleontologists’ of our own external memories.” And the point was memorably put as well. We are building vast repositories of external memory and revisiting those stores in ways that are historically novel. We’ve yet to register the long-term consequences.

The notion of our adaptability to new information environments was also raised frequently:

  • Cathy Cavanaugh, an associate professor of educational technology at the University of Florida, noted, “Throughout human history, human brains have elastically responded to changes in environments, society, and technology by ‘rewiring’ themselves. This is an evolutionary advantage and a way that human brains are suited to function.”

This may be true enough, but what is missing from these sorts of statements is any discussion of which environments might be better or worse for human beings. To acknowledge that we adapt is to say nothing about whether or not we ought to adapt. Or, if one insists, Borg-like, that we must adapt or die, there is little discussion about whether this adaptation leaves us on the whole better off. In other words, we ought to be asking whether the environment we are asked to adapt to is more or less conducive to human flourishing. If it is not, then all the talk of adaptation is a thinly veiled fatalism.

Some, however, did make strong and enthusiastic claims for the beneficence of the emerging media environment:

  • “The youth of 2020 will enjoy cognitive ability far beyond our estimates today based not only on their ability to embrace ADHD as a tool but also by their ability to share immediately any information with colleagues/friends and/or family, selectively and rapidly. Technology by 2020 will enable the youth to ignore political limitations, including country borders, and especially ignore time and distance as an inhibitor to communications. There will be heads-up displays in automobiles, electronic executive assistants, and cloud-based services they can access worldwide simply by walking near a portal and engaging with the required method such as an encrypted proximity reader (surely it will not be a keyboard). With or without devices on them, they will communicate with ease, waxing philosophic and joking in the same sentence. I have already seen youths of today between 20 and 35 who show all of these abilities, all driven by and/or enabled by the internet and the services/technologies that are collectively tied to and by it.”

This was one of the more techno-utopian predictions in the survey. The notion of “embracing ADHD as a tool” is itself sufficiently jarring to catch one’s attention. One gets the gist of what the respondent is foreseeing — a society in which cognitive values have been radically re-ordered. Where sustained attention is no longer prized, attention deficit begins to seem like a boon. The claims about the irrelevance of geographic and temporal limits are particularly interesting (or disconcerting). They seemingly make a virtue of disembodied rootlessness. The youth of the future will, in this scenario, be temporally and spatially homeless, virtually dispersed. (The material environment of the future imagined here also invites comparison to the dystopian vision of the film Wall-E.)

Needless to say, not all respondents were nearly so sanguine. Most interestingly, many of the youngest respondents were among the most concerned:

  • A number of the survey respondents who are young people in the under-35 age group—the central focus of this research question—shared concerns about changes in human attention and depth of discourse among those who spend most or all of their waking hours under the influence of hyper connectivity.

This resonates with my experience teaching as well. There’s a palpable unease among many of the most connected with the pace, structure, and psychic consequences of the always-on life. They appear to be discovering through experience what is eloquently put by Annette Liska:

  • Annette Liska, an emerging-technologies design expert, observed, “The idea that rapidity is a panacea for improved cognitive, behavioral, and social function is in direct conflict with topical movements that believe time serves as a critical ingredient in the ability to adapt, collaborate, create, gain perspective, and many other necessary (and desirable) qualities of life. Areas focusing on ‘sustainability’ make a strong case in point: slow food, traditional gardening, hands-on mechanical and artistic pursuits, environmental politics, those who eschew Facebook in favor of rich, active social networks in the ‘real’ world.”

One final excerpt:

  • Martin D. Owens, an attorney and author of Internet Gaming Law, wrote, “Just as with J.R.R. Tolkien’s ring of power, the internet grants power to the individual according to that individual’s wisdom and moral stature. Idiots are free to do idiotic things with it; the wise are free to acquire more wisdom. It was ever thus.”

In fact, the ring in Tolkien’s novels is a wholly corrupting force. The “wisdom and moral stature” of the wearer may only forestall its deleterious effects. The most wise avoided using it at all. I won’t go so far as to suggest that the same applies to the Internet, but I certainly couldn’t let Tolkien be appropriated in the service of a misguided view of technological neutrality.

Teachers, Resistance is Futile

Related to the last post, a friend passed along a link this afternoon to a story in the NY Times about a school district in North Carolina that is having notable success implementing technology in the classroom. Here’s a representative passage:

“Mooresville’s laptops perform the same tasks as those in hundreds of other districts: they correct worksheets, assemble progress data for teachers, allow for compelling multimedia lessons, and let students work at their own pace or in groups, rather than all listening to one teacher. The difference, teachers and administrators here said, is that they value computers not for the newest content they can deliver, but for how they tap into the oldest of student emotions — curiosity, boredom, embarrassment, angst — and help educators deliver what only people can.

Many classrooms have moved from lecture to lattice, where students collaborate in small groups with the teacher swooping in for consultation. Rather than tell her 11th-grade English students the definition of transcendentalism one recent day, Katheryn Higgins had them crowd-source their own — quite Thoreauly, it turned out — using Google Docs. Back in September, Ms. Higgins had the more outgoing students make presentations on the Declaration of Independence, while shy ones discussed it in an online chat room, which she monitored.”

Yes, you read that correctly. He did write “quite Thoreauly.” As unfortunate as that line may be, it’s not the most disturbing:

“Many students adapted to the overhaul more easily than their teachers, some of whom resented having beloved tools — scripted lectures, printed textbooks and a predictable flow through the curriculum — vanish. The layoffs in 2009 and 2010, of about 10 percent of the district’s teachers, helped weed out the most reluctant, Mr. Edwards said; others he was able to convince that the technology would actually allow for more personal and enjoyable interaction with students.”

Once more, the layoffs “helped weed out the most reluctant.”

I’m far from suggesting that there is never a time to let go of incompetent teachers, but it seems to me that this is a net that is just as likely to snare competent teachers as incompetent ones.

The message in this case seems clear: resistance is futile, as the line goes. It rather reminds me of the motto of the 1933 Century of Progress World’s Fair, which I stumbled upon recently: “Science Finds, Industry Applies, Man Conforms.”

So it would seem, at least as this writer presents the case.