David Foster Wallace on Life, Literature, and Writing

I’ve been reading Although Of Course You End Up Becoming Yourself: A Road Trip with David Foster Wallace, a book that amounts to a running transcript of David Lipsky’s five days with Wallace back in the mid-1990s during the book tour for Wallace’s then-recently-released Infinite Jest.  I’d not read anything by Wallace leading up to this, but I had been drawn to his personality by the numerous mentions of him I’d come across over the last year or so.  I’ve not been disappointed.  It has been an odd thing to feel a deep sadness for the loss of a person you’ve never met, or, in a certain sense, only just met, overheard really.  Reading this record of Lipsky’s time with Wallace, you feel as though you’re eavesdropping, and the conversation is so engaging that you can’t quite walk away.  Wallace thus far comes across as a remarkably sensitive, intelligent, and kind individual with genuine insight into what it feels like to be alive.

I wanted to excerpt a passage or two that I thought spoke to some of the issues I’ve written about here, particularly the sense of being overwhelmed by stimulation and distraction or the fractured, alienated feel of contemporary life.  Remember, as you read the selections, that this is a transcript of a conversation, so it will not have the polish of prose.  But it makes up for that lack of polish with a certain immediacy and affect that I found compelling.

We pick up Wallace discussing traditional narrative with Lipsky.  Lipsky has suggested that literature in the mold of Leo Tolstoy does the best job of capturing the reality of life.  Wallace disagrees:

And I don’t know about you.  I just — stuff that’s like that, I enjoy reading, but it doesn’t feel true at all.  I read it as a relief from what’s true.  I read it as a relief from the fact that, I received five hundred thousand discrete bits of information today, of which maybe twenty-five are important.  And how am I going to sort those out, you know?

Lipsky is not sold; he remarks that he is more taken by the continuity of life than by its discontinuity.  Wallace continues:

Huh.  Well you and I just disagree.  Maybe the world just feels differently to us.  This is all going back to something that isn’t really clear:  that avant-garde stuff is hard to read.  I’m not defending it, I’m saying that stuff — this is gonna get very abstract — but there’s a certain set of magical stuff that fiction can do for us.  There’s maybe thirteen things, of which who even knows which ones we can talk about.  But one of them has to do with the sense of, the sense of capturing, capturing what the world feels like to us, in the sort of way that I think that a reader can tell “Another sensibility like mine exists.”  Something else feels this way to someone else.  So that the reader feels less lonely.

There’s really really shitty avant-garde, that’s coy and hard for its own sake.  That I don’t think it’s a big accident that a lot of what, if you look at the history of fiction — sort of, like, if you look at the history of painting after the development of the photography — that the history of fiction represents this continuing struggle to allow fiction to continue to do that magical stuff.  As the texture, as the cognitive texture, of our lives changes.  And as, um, as the different media by which our lives are represented change.  And it’s the avant-garde or experimental stuff that has the chance to move the stuff along.  And that’s what’s precious about it.

And the reason why I’m angry at how shitty most of it is, and how much it ignores the reader, is that I think it’s very very very very precious.  Because it’s the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.

That struck me as a rather astute analysis of the power of literature and the potential (and pitfalls) of the avant-garde.  I was especially taken by his contrast between traditional narrative in Tolstoy’s mold, which Wallace experiences as a relief from what it feels like to live, and the disjointed, discontinuous contemporary literature that reflects what it feels like to live.

Wallace goes on to describe once again the feel of life that contemporary fiction should attempt to capture.  Keep in mind that although this dialogue feels fresh and contemporary, it is over fifteen years old and was spoken at the dawn of the Internet age.

… I think a lot of people feel — not overwhelmed by the amount of stuff they have to do.  But overwhelmed by the number of choices they have, and by the number of discrete, different things that come at them.  And the number of small … that since they’re part of numerous systems, the number of small insistent tugs on them, from a number of different systems and directions.  Whether that’s qualitatively different than the way life was for let’s say our parents or our grandparents, I’m not sure.  But I sorta think so.  At least in some way — in terms of the way it feels on your nerve endings.

Finally, here is a brief comment on the privilege and responsibility of the writer, one which gives us a sense of Wallace’s striking confidence in and respect for the reader:

What writers have is a license and also the freedom to sit — to sit, clench their fists, and make themselves be excruciatingly aware of the stuff that we’re mostly aware of only on a certain level.  And that if the writer does his job right, what he basically does is remind the reader of how smart the reader is.  Is to wake the reader up to stuff that the reader’s been aware of all the time.

“If they’re going to be rude, I’ll be rude right back”

“Once more unto the breach, dear friends, once more,” or so David Carr seemed to say.  In his latest NY Times op-ed, “Keep Your Thumbs Still When I’m Talking To You,” Carr rallies the troops once more to the cause of civility in a digital age.  The battle has been raging for some time and some might think it already lost, but Carr brings us rumors from deep within the enemy’s citadel suggesting that even there the tide may be turning.

Alright, so the martial metaphor may be a bit overdone, but that is more or less how I experienced Carr’s piece.  On more than a few occasions over the last several months, I’ve written on the same theme and advocated an approach to digital media that preserves the dignity of the human person, particularly the in-the-flesh human persons in front of us.  I’ve done so mainly by seconding the work of Jaron Lanier (here and here), Rochelle Gurstein, and Gary Shteyngart, among others.  But it has been a while since I’ve addressed the matter, and, I have to confess, a certain complacency starts to set in.  One begins to question whether it is even worth the effort.  Perversely, one even begins to feel that it would be rude to point out the rudeness of those who will not give another human being their undivided attention or manage to take their calls out of public earshot.  Carr, however, gives the cause of civility one more public shot in the arm.

He begins his piece straightforwardly: “Add one more achievement to the digital revolution: It has made it fashionable to be rude.”  Nothing new here, of course.  Carr’s anecdotal evidence is drawn from his recent experience at South by Southwest Interactive.  Here he witnesses all sorts of activities that would neatly fall into the category aptly described by the title of Sherry Turkle’s recent book, Alone Together.  It is the usual sort of thing: people distractedly gazing at their smartphones and tablets whether in the audience, waiting in line, or even participating on a panel.  In this particular piece, my favorite moment of bemused recognition came from Entertainment Weekly’s Anthony Breznican, describing what happens after one person excuses themselves to check their phone at the dinner table:

“Instead of continuing with the conversation, we all take out our phones and check them in earnest,” he said. “For a few minutes everybody is typing away. A silence falls over the group and we all engage in a mass thumb-wrestling competition between man and little machine. Then the moment passes, the BlackBerrys and iPhones are reholstered, and we return to being humans again after a brief trance.”

Yet, there were also signs of awakening.  For example, in response to his own presentation, “I’m So Productive, I Never Get Anything Done,” Carr reports:

The biggest reaction in the session by far came when Anthony De Rosa, a product manager and programmer at Reuters and a big presence on Twitter and Tumblr, said that mobile connectedness has eroded fundamental human courtesies.

“When people are out and they’re among other people they need to just put everything down,” he said. “It’s fine when you’re at home or at work when you’re distracted by things, but we need to give that respect to each other back.”

His words brought sudden and tumultuous applause. It was sort of a moment, given that we were sitting amid some of the most digitally devoted people in the hemisphere. Perhaps somewhere on the way to the merger of the online and offline world, we had all stepped across a line without knowing it.

Perhaps.

Lest we get too encouraged, Carr also tells of the earnest young man who came up to him after the talk to affirm the importance of “actual connection” while “casting sidelong glances at his iPhone while we talked.”  Carr is almost certainly right when he suggests that the young man didn’t even realize what he was doing.  The behavior is more or less habitual and thus sits just below the level of conscious awareness.

For some, however, the behavior is, in fact, quite conscious.  Carr mentions MG Siegler’s TechCrunch essay entitled “I Will Check My Phone at Dinner and You Will Deal With It.”  “This is the way the world works now,” Siegler brusquely informs us.  “We’re always connected and always on call. And some of us prefer it that way.”  Those are fighting words.  They are also words that almost evoke a wearied resignation on the part of those who, in fact, don’t prefer it that way.  This is the force of the rhetoric of inevitability: hear it and repeat it often enough and you start believing it.

Moreover, Carr notes that

… there is also a specific kind of narcissism that the social Web engenders.  By grooming and updating your various avatars, you are making sure you remain at the popular kids’ table. One of the more seductive data points in real-time media is what people think of you. The metrics of followers and retweets beget a kind of always-on day trading in the unstable currency of the self.

That is a nicely crafted and incisive metaphor right at the end, and, to borrow a line from David Foster Wallace, it all amounts to getting fed “food pellets of praise.”  I would go so far as to speculate that the issue is neurological, and I’m sure someone out there can refer me to a study that suggests a link between our social media interactions and the release of dopamine in the brain, or something along those lines.  (Here’s a start: “All those tweets, apps, updates may drain brain.”)

In any case, the lines are drawn once more.  The martial metaphors are in a sense already suggested by Carr in his closing lines.  Drawing on the observations of William Powers, he writes:

And therein lies the real problem. When someone you are trying to talk to ends up getting busy on a phone, the most natural response is not to scold, but to emulate. It’s mutually assured distraction.

“Mutually assured distraction,” of course, alludes to the Cold War-era doctrine of mutually assured destruction.  There may be more overlap between the two than even Carr intended; both are forms of madness in their own way.  And perhaps it is time for more aggressive tactics.  De Rosa, cited above, also wrote Carr the following:

I’m fine with people stepping aside to check something, but when I’m standing in front of someone and in the middle of my conversation they whip out their phone, I’ll just stop talking to them and walk away. If they’re going to be rude, I’ll be rude right back.

“Once more unto the breach …”

Pattern Recognition: The Genius of Our Time?

What counts for genius in our times?  Is it the same as what has always counted for genius?  Or are there shifting criteria that reflect the priorities and affordances of a particular age?

Mary Carruthers opens The Book of Memory, her study of memory in medieval culture, with a contrast between Thomas Aquinas and Albert Einstein.  Both were regarded as the outstanding intellects of their era; each elicited enthusiastic, wonder-struck praise from his contemporaries.  Carruthers cites a letter by an associate of each man as typical of the praise each received.  Summing up both, she writes:

Of Einstein: ingenuity, intricate reasoning, originality, imagination, essentially new ideas coupled with the notion that to achieve truth one must err of necessity, deep devotion to and understanding of physics, obstinacy, vital force, single-minded concentration, solitude.  Of Thomas Aquinas: subtlety and brilliance of intellect, original discoveries coupled with deep understanding of Scripture, memory, nothing forgotten and knowledge ever-increasing, special grace, inward recourse, single-minded concentration, intense recollection, solitude.

Carruthers goes on to note how similar the lists of qualities are “in terms of what they needed for their compositional activity (activity of thought), the social isolation required by each individual, and what is perceived to be the remarkable subtlety, originality, and understanding of the product of such reasoning.”  The difference, appropriate to the object of Carruthers’s study, lies in the relationship between memory and the imagination.

Carruthers is eager to remind us that “human beings did not suddenly acquire imagination and intuition with Coleridge, having previously been poor clods.”  But there is a difference in the way these qualities were understood:

The difference is that whereas now geniuses are said to have creative imagination which they express in intricate reasoning and original discovery, in earlier times they were said to have richly retentive memories, which they expressed in intricate reasoning and original discovery.

This latter perspective, the earlier medieval perspective, is not too far removed from the connections between memory and creativity drawn by Jim Holt based on the experiences of the French mathematician Henri Poincaré.  We might also note that the changing status of memory within the ecology of genius is owed at least in part to the evolution of technologies that supplement memory.  Aquinas, working in a culture in which books were still relatively scarce, would have needed a remarkably retentive memory to continue working with the knowledge he acquired through reading.  This becomes less of a priority for post-Gutenberg society.

Mostly, however, Carruthers’s comparison suggested to me the question of what might count for genius in our own time.  We are not nearly so far removed from Einstein as Einstein was from Aquinas, but a good deal has changed nonetheless, which makes the question at least plausible.  I suspect that, as was the case between Aquinas and Einstein, there will be a good deal of continuity, a kind of baseline of genius perhaps.  But that baseline makes the shifts in emphasis all the more telling.

I don’t have a particular model for contemporary genius in mind, so this is entirely speculative, but I wonder if today, or in coming years, we might not transfer some of the wonder previously elicited by memory and imagination to something like rapid pattern recognition.  I realize there is significant overlap among these categories.  Just as memory and imagination are related in important ways, so pattern recognition is implicit in both and has always been an important ability.  So again, it is a matter of emphasis.  But it seems to me that the ability to rapidly recognize, or even generate, meaningful patterns from an undifferentiated flow of information may be the characteristic of intelligence most suited to our times.

In Aquinas’ day, the emphasis was on the memory needed to retain knowledge, which was relatively scarce.  In Einstein’s time, the emphasis was on the ability to jump out of established patterns of thought generated by abundant, but sometimes static, knowledge.  In our day, we are overwhelmed by a torrent of easily available and ever-shifting information (we won’t quite say knowledge).  Under these conditions memory loses its pride of place, as does perhaps imagination.  However, the ability to draw together disparate pieces of information, or to connect seemingly unrelated points of data into a meaningful pattern that we might count as knowledge, now becomes a dimension of human intelligence that may inspire comparable awe and admiration from a culture drowning in noise.

Perhaps an analogy to wrap up: think of the constellations as instances of pattern recognition.  Lots of data points against the night sky drawn into patterns that are meaningful, useful, and beautiful to human beings.  For Aquinas, the stars of knowledge might appear but for a moment, and to recognize the pattern he had to hold their location in memory as he learned and remembered the location of other stars.  For Einstein, many more stars had appeared and they remained steadily in his intellectual field of vision; his challenge was to see new patterns where old ones had been established.

Today we might say that the night sky is not only full to overflowing, but that the configuration is constantly shifting.  Our task is not necessarily to remember the locations of a few fading stars, nor is it to see new patterns in a fuller but steady field.  It is to constantly recognize new and possibly temporary patterns in a full and flickering field of information.  Those who are able to do this most effectively may garner the kind of respect earlier reserved for the memory of Aquinas and the imagination of Einstein.

For a different, but I think related, take on a new form of thinking for our age, one that draws on the imagery of constellations, I encourage you to take a look at this thread at Snark Market.

iSpirituality: Religious Apps and Spiritual Practices

Religious apps for the iPhone and iPad have been in the news lately.  In “Religion on Your iPhone?”, Lisa Fernandez discusses a variety of apps created for Christians, Muslims, Jews, Hindus, and Buddhists.  The Apple app store is, if nothing else, an apparently ecumenical space.  Among the various religious apps, however, “Confession: A Roman Catholic App” has probably received the most attention, and a good deal of it seemingly misguided.  The folks at Get Religion have broken down some of the misleading news stories related to the app, and the Catholic League collected a few of the offending headlines, including:

• “Can’t Make it to Confession? There’s an App for That”
• “Catholic Church Approves Confession by iPhone”
• “Bless Me iPhone for I Have Sinned”
• “Catholic Church Endorses App for Sinning iPhone Users”
• “Forgiveness via iPhone: Church Approves Confession App”
• “New, Church-Approved iPhone Offers Confession On the Go”
• “Confess Your Sins to a Phone in Catholic Church Endorsed App”
• “Catholics Can Now Confess Using iPhone App”

Bottom line: the app is intended to help users prepare for confession, not to substitute for face-to-face confession.  There is no virtual priest, and there is no virtual absolution.  As Terry Mattingly put it at Get Religion:

This app is actually a combination between a personal diary and the “examination of conscience” booklets and tracts that Catholic and Orthodox Christians have carried in their pockets, wallets and purses for generations.

You may also want to take a look at Maureen Dowd’s rather snarky take on the Confession app in her NY Times column, “Forgive Me, Father, For I Have Linked.”

The Wall Street Journal has also recently posted a video report on religious apps:  “From apps that let you tweet Bible verses to those that help you face Mecca or pray the right Hebrew blessings with the right foods, some of the pious are embracing mobile technology.”  The story follows the usual pattern:  new thing > positive reaction to new thing > negative reaction to new thing > conclusion offering moderating position.  Concerns, voiced mainly by a Christian pastor, include the danger of disengaging from the face-to-face community and misdirecting the focus of religious experience onto the device and away from God.

Professor Rachel Wagner, author of the forthcoming “Godwired: Religion, Ritual and Virtual Reality,” also appears in the report and frames the issue as a struggle between relevance to contemporary culture and faithfulness to ancient traditions.  She suggests that what is at issue is the degree of interactivity with the ritual or practice that the apps allow.  As she puts it, “Those religious groups that want to stay true to their traditions are going to allow less wiggle room.”  It’s not entirely clear from the segment what exactly Wagner means by interactivity, but I suspect she has in view the flexibility of the rituals.  In other words, interactivity implies that ancient rituals may be reshaped by their re-presentation in new media.

Putting the issue this way recalls Paul Connerton’s thesis in How Societies Remember.  In Connerton’s analysis,

Both commemorative ceremonies and bodily practices therefore contain a measure of insurance against the process of cumulative questioning entailed in all discursive practices.  This is the source of their importance and persistence as mnemonic systems.  Every group, then, will entrust to bodily automatisms the values and categories which they are most anxious to conserve.  They will know how well the past can be kept in mind by a habitual memory sedimented in the body.

In other words, embodied practices or rituals represent the most durable mode of remembering.  This is in part because they are less likely to be questioned and altered than knowledge encoded in spoken or written texts.  The core of a tradition’s identity, then, is wrapped up in its rituals and embodied practices; changes to the rituals and practices effect changes to collective memory and identity.

Consider, for example, that while the Reformation clearly involved the reformulation of key doctrines, it also restructured the embodied rituals of Catholic practice and re-ordered the material conditions of worship.  Bodily habits such as crossing oneself and material conditions such as the architecture of churches changed as much as doctrinal standards.  I suspect one could argue convincingly that for laymen and women, the changes in embodied practice and material conditions of worship were more significant than abstract doctrinal reformulations.

Anecdotally, I vividly recall some years ago being in a certain Protestant context and witnessing a young boy being pulled up rather brusquely from a kneeling posture during prayer with the very straightforward admonition, “We don’t do that here!”  It apparently smacked of Catholicism.  A particular vision of the faith was thereby inculcated by regulating the body.

With this in mind, then, the most interesting thing about religious apps may not be their content, but the way they insert themselves into the embodied experience of worship and religious practice.  This may occur through the use of a cell phone to access the apps during worship.  (Remember how easy it is to spot someone who is attending to a cell phone simply by observing their posture.)  It may also occur through the way an app repackages a ritual or practice for digital mediation, perhaps abstracting bodily elements while preserving more mental components.  In either case, religious apps are likely to leave their mark by subtly reshaping the way the body engages in worship and spiritual practice.

Technology, Men, and War

We are still learning more about World War II, and much of it is rather depressing.  Recently, German researchers published transcripts of conversations among German soldiers held as POWs.  The soldiers and airmen were secretly recorded by the Allies, and their conversations offer a disturbingly honest glimpse of the war from the soldiers’ perspective.  You can read about the transcripts in a Der Spiegel Online article, “Nazi War Crimes Described by German Soldiers.”  Here is an excerpt that caught my attention:

Men love technology, a subject that enables them to quickly find common ground. Many of the conversations revolve around equipment, weapons, calibers and many variations on how the men “whacked,” “picked off” or “took out” other human beings.

The victim is merely the target, to be shot and destroyed — be it a ship, a building, a train or even a cyclist, a pedestrian or a woman pushing a baby carriage. Only in very few cases do the soldiers show remorse over the fate of innocent civilians, while empathy is almost completely absent from their conversations. “The victim in an empathic sense doesn’t appear in the accounts,” the authors conclude.

Be advised that the content of the article is at points graphic and disturbing.