A Frictionless Life Is Also A Life Without Traction

The idea of a “frictionless” life has been on my mind since March, when I read (and posted) Chetan Sharma’s vision of a pervasively wired and connected world.  Speaking of the benefits he sees accruing from the realization of his vision, Sharma explains:

This is where it needs to go and will go in 10 years, making everyday experiences much better and friction free. If a person has a desire to learn or shop or engage in social interaction, it’s right there. Beyond just doing things on televisions and cell phones, you’ll be able to do these things on a wall anywhere. It’s about reducing friction. You can accomplish any given task today with 50 different steps but this future of connected devices is all about making things much easier.

Regardless of what we might think about the desirability of such a world, it did appear to me that Sharma had given us a compelling metaphor for the world we desire:  friction free.  After all, who would oppose “making things much easier”?  The question, however, may be whether “making things much easier” necessarily makes “everyday experience much better.”  As the metaphor flitted in and out of thought over the last few weeks, it eventually occurred to me that the metaphor invited a certain extension that may begin to answer that question: A frictionless life is also a life without traction.

At the expense of letting metaphors run wild, I want to pursue this formulation and suggest that not all forms of friction are undesirable.  Certain forms of friction are absolutely essential to acting and maneuvering in the world.  A frictionless world would also be one in which we would feel ourselves unanchored and afloat, perpetually propelled by forces we have no power to resist.  That seems already to describe the feel of everyday life for many.  This can be exhilarating for a time, but I suspect that eventually we desire a greater degree of stability and agency — traction.

Removing all resistance removes all traction; and if everything is easy, in the end nothing may be satisfying or meaningful.  This reminded me of philosopher of technology Albert Borgmann’s suggestion that we must make a distinction between “trouble we reject in principle and accept in practice and trouble we accept in practice and in principle.” When you begin talking about accepting trouble it becomes important to make clear distinctions.  In the former category of trouble we accept in practice but reject in principle, Borgmann has in mind troubles on the order of car accidents and cancer.  By “accepting them in practice,” Borgmann only means that at the personal level we must in our own way discover a means of coping with such tragedies.   But these are troubles that we oppose in principle and so we seek cures for cancer and improved highway safety.

Against these, Borgmann opposes troubles that we also accept in practice, but ought to accept in principle as well.  Here the examples are preparing a meal (including growing the ingredients) and hiking a mountain.  These sorts of troubles, not without their real dangers, could be opposed in principle — never prepare meals at home, never hike — but such avoidance would also prevent us from experiencing their attendant joys and satisfactions.  If we seek to remove all trouble or resistance from our lives; if we always opt for convenience, efficiency, and ease; if, in other words, we aim indiscriminately at the frictionless life; then we will simultaneously rob ourselves of the real satisfactions and pleasures that enhance and enrich our lives.

Traction implies resistance and sometimes trouble, but it also presents us with the opportunity to navigate meaningfully.  A frictionless life may promise ease and a certain security, but it also leaves us adrift, chasing one superficial pleasure after another; never satisfied, because we never experience the struggle against resistance that is essential to a sense of accomplishment.  The trajectory of our desire toward a frictionless life, then, may paradoxically leave us unable to find meaningful satisfaction or a sense of fulfillment.  Trading some friction for traction, however, may, in fact, make “everyday experience much better.”

Twitterfication: More Complicated + Less New = No Interest

Seismic Activity or Media Coverage

On the Media’s recent program,  Turning Away, focused on the spike in foreign news coverage following the devastation in Japan and the combat in Libya.  That spike, however, plateaued, and now foreign coverage in American journalism is again on the decline.  At least until the next crisis, of course.

This prompted some incisive, if somewhat disconcerting, observations from host Brooke Gladstone and her guests, Mark Jurkowitz and Steve Coll.  Here is Gladstone introducing the program:

BROOKE GLADSTONE: Mark Jurkowitz at the PEW Research Center’s Project for Excellence in Journalism says that a few weeks back Libya and Japan made up more than 40 percent of the news, an extraordinary number. But now, even as fresh horrors rain down on the people of Libya and Japan, the American media look elsewhere for leads.

Perhaps, says Jurkowitz, that’s because events out there have become both more complicated and less new, a lethal combination for coverage . . .

That last line struck me as being regrettably accurate.  More Complicated + Less New = Less Coverage.  And less coverage either reflects or engenders no interest.  I’m fairly certain that this equation has summed up the way American media works for some time now; Kierkegaard had already diagnosed the symptoms in the 19th century.  But I would also speculate that the dynamics of digital/social media have ratcheted up the logic the equation seeks to convey, exponentially perhaps.  Consider it the Twitterfication of the news cycle.  We can’t quite do complicated and sustained very well within the constraints of social media.

The following exchange also provided a helpful schema that rang true, the 12-day disaster editorial cycle:

BROOKE GLADSTONE: Steve Coll covered his fair share of natural disaster and war in his decades as foreign correspondent at The Washington Post, and he found that there is a template for many stories, no matter how harrowing. In his experience, earthquake and disaster coverage, in general, follow a 12-day editorial cycle. He witnessed it while covering an earthquake that killed tens of thousands of people in Iran.

The first few days are spent reporting breaking news and casualties and destruction. Around day five, the late miracle story in which search teams find an improbable survivor amidst the rubble. Day seven brings the interpretation of meaning story, with religious overtones. By day 12, it’s essentially buh-bye for now.

So in your mind run through the catastrophes and crises that have garnered significant media coverage over the last year or so and see if that does not neatly capture the way they were covered.  Wait, having a hard time remembering the catastrophes and crises of the last year?  Were you caught off guard, as I was, when we heard that it had been a year since the BP oil spill in the Gulf?  Vaguely remember something about floods in Australia? Something happened in Tunisia recently, right?  It seems the logic of our media environment is precisely calibrated to induce forgetfulness.

After Coll expresses some surprise at how quickly we have lost sight of ongoing developments in Japan and Libya, Gladstone asks Coll, “Should we be worried about that?”

Coll is, perhaps justifiably, sardonic in response:

STEVE COLL: Well, we are a global power with military and diplomatic interests and deployments all over the world, and we expend tax dollars and put lives at risk all the time in complicated foreign environments, so yeah, it’s a problem. We ought to be thinking about these places on an empirical basis in greater depth than we sometimes do.

David Foster Wallace on Life, Literature, and Writing

I’ve been reading Although Of Course You End Up Becoming Yourself: A Road Trip with David Foster Wallace, a book that amounts to a running transcript of David Lipsky’s five days with Wallace back in the mid-1990s during a book tour for Wallace’s then recently released Infinite Jest.  I’ve not read anything by Wallace leading up to this, but I had been drawn to his personality by the numerous mentions of him I’d come across over the last year or so.  I’ve not been disappointed.  It has been an odd thing to feel a deep sadness for the loss of a person you’ve never met, or, in a certain sense, only just met, overheard really.  Reading this record of Lipsky’s time with Wallace, you feel as though you are eavesdropping, and the conversation is so engaging that you can’t quite walk away.  Wallace thus far comes across as a remarkably sensitive, intelligent, and kind individual with genuine insight into what it feels like to be alive.

I wanted to excerpt a passage or two that I thought spoke to some of the issues that I’ve written about here, particularly the sense of being overwhelmed by stimulation and distraction or the fractured, alienated feel of contemporary life.  Remember as you read the selections that this is a transcript of a conversation and so it will not have the polish of prose.  But it makes up for the lack of polish with a certain immediacy and affect that I thought was compelling.

We pick up Wallace discussing traditional narrative with Lipsky.  Lipsky has suggested that literature in the mold of Leo Tolstoy does the best job of capturing the reality of life.  Wallace disagrees:

And I don’t know about you.  I just — stuff that’s like that, I enjoy reading, but it doesn’t feel true at all.  I read it as a relief from what’s true.  I read it as a relief from the fact that, I received five hundred thousand discrete bits of information today, of which maybe twenty-five are important.  And how am I going to sort those out, you know?

Lipsky is not sold, he remarks that he is more taken by the continuity of life, rather than the discontinuity.  Wallace continues:

Huh.  Well you and I just disagree.  Maybe the world just feels differently to us.  This is all going back to something that isn’t really clear:  that avant-garde stuff is hard to read.  I’m not defending it, I’m saying that stuff — this is gonna get very abstract — but there’s a certain set of magical stuff that fiction can do for us.  There’s maybe thirteen things, of which who even knows which ones we can talk about.  But one of them has to do with the sense of, the sense of capturing, capturing what the world feels like to us, in the sort of way that I think that a reader can tell “Another sensibility like mine exists.”  Something else feels this way to someone else.  So that the reader feels less lonely.

There’s really really shitty avant-garde, that’s coy and hard for its own sake.  That I don’t think it’s a big accident that a lot of what, if you look at the history of fiction — sort of, like, if you look at the history of painting after the development of the photography — that the history of fiction represents this continuing struggle to allow fiction to continue to do that magical stuff.  As the texture, as the cognitive texture, of our lives changes.  And as, um, as the different media by which our lives are represented change.  And it’s the avant-garde or experimental stuff that has the chance to move the stuff along.  And that’s what’s precious about it.

And the reason why I’m angry at how shitty most of it is, and how much it ignores the reader, is that I think it’s very very very very precious.  Because it’s the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.

That struck me as a rather astute analysis of the power of literature and the potential (and pitfalls) of the avant-garde.  I was especially struck by his contrast between traditional narrative in Tolstoy’s mold which Wallace experiences as a relief from what it feels like to live, and disjointed, discontinuous contemporary literature that reflects what it feels like to live.

Wallace goes on to describe once again the feel of life that contemporary fiction should attempt to capture.  Keep in mind that although this dialogue feels fresh and contemporary, it is over 15 years old and was spoken at the dawn of the Internet age.

… I think a lot of people feel — not overwhelmed by the amount of stuff they have to do.  But overwhelmed by the number of choices they have, and by the number of discrete, different things that come at them.  And the number of small … that since they’re part of numerous systems, the number of small insistent tugs on them, from a number of different systems and directions.  Whether that’s qualitatively different than the way life was for let’s say our parents or our grandparents, I’m not sure.  But I sorta think so.  At least in some way — in terms of the way it feels on your nerve endings.

Finally, here is a brief comment on the privilege and responsibility of the writer which gives us a sense of Wallace’s striking confidence in and respect for the reader:

What writers have is a license and also the freedom to sit — to sit, clench their fists, and make themselves be excruciatingly aware of the stuff that we’re mostly aware of only on a certain level.  And that if the writer does his job right, what he basically does is remind the reader of how smart the reader is.  Is to wake the reader up to stuff that the reader’s been aware of all the time.

“If they’re going to be rude, I’ll be rude right back”

“Once more unto the breach, dear friends, once more,” or so David Carr seemed to say.  In his latest NY Times op-ed, “Keep Your Thumbs Still When I’m Talking To You,” Carr rallies the troops once more to the cause of civility in a digital age.  The battle has been raging for some time and some might think it already lost, but Carr brings us rumors from deep within the enemy’s citadel suggesting that even there the tide may be turning.

Alright, so the martial metaphor may be a bit overdone, but that is more or less how I experienced Carr’s piece.  On more than a few occasions over the last several months, I’ve written on the same theme and advocated an approach to digital media that preserves the dignity of the human person, particularly the in-the-flesh human persons in front of us.  I’ve done so mainly by seconding the work of Jaron Lanier (here and here), Rochelle Gurstein, and Gary Shteyngart among others. But it has been a while since I’ve addressed the matter, and I have to confess, there is a certain complacency that starts to set in.  One begins to question whether it is even worth the effort.  Perversely, one even begins to feel that it would be rude to point out the rudeness of those who will not give another human being their undivided attention or manage to take their calls out of public earshot.  Carr, however, gives the cause of civility one more public shot in the arm.

He begins his piece straightforwardly, “Add one more achievement to the digital revolution: It has made it fashionable to be rude.”  Nothing new here, of course.  Carr’s anecdotal evidence is drawn from his recent experience at South by Southwest Interactive.  Here he witnesses all sorts of activities that would neatly fall into the category aptly described in the title of Sherry Turkle’s recent book, Alone Together.  It is the usual sort of thing, people distractedly gazing at their smart phones and tablets whether in the audience, waiting in line, or even participating on a panel.  In this particular piece, my favorite moment of bemused recognition came from Entertainment Weekly’s Anthony Breznican describing what happens after one person excuses himself to check his phone at the dinner table:

“Instead of continuing with the conversation, we all take out our phones and check them in earnest,” he said. “For a few minutes everybody is typing away. A silence falls over the group and we all engage in a mass thumb-wrestling competition between man and little machine. Then the moment passes, the BlackBerrys and iPhones are reholstered, and we return to being humans again after a brief trance.”

Yet, there were also signs of awakening.  For example, in response to his own presentation, “I’m So Productive, I Never Get Anything Done,” Carr reports that,

The biggest reaction in the session by far came when Anthony De Rosa, a product manager and programmer at Reuters and a big presence on Twitter and Tumblr, said that mobile connectedness has eroded fundamental human courtesies.

“When people are out and they’re among other people they need to just put everything down,” he said. “It’s fine when you’re at home or at work when you’re distracted by things, but we need to give that respect to each other back.”

His words brought sudden and tumultuous applause. It was sort of a moment, given that we were sitting amid some of the most digitally devoted people in the hemisphere. Perhaps somewhere on the way to the merger of the online and offline world, we had all stepped across a line without knowing it.

Perhaps.

Lest we get too encouraged, Carr also tells of the earnest young man who came up to him after the talk to affirm the importance of “actual connection” while “casting sidelong glances at his iPhone while we talked.”  Carr is almost certainly right when he suggests that the young man didn’t even realize what he was doing.  The behavior is more or less habitual and thus just below the level of conscious awareness.

For some, however, the behavior is, in fact, quite conscious.  Carr mentions MG Siegler’s TechCrunch essay entitled “I Will Check My Phone at Dinner and You Will Deal With It.”  “This is the way the world works now,” Siegler brusquely informs us,  “We’re always connected and always on call. And some of us prefer it that way.”   Those are fighting words. They are also words that almost evoke a wearied resignation on the part of those who, in fact, don’t prefer it that way.  This is the force of the rhetoric of inevitability:  hear it and repeat it often enough and you start believing it.

Moreover, Carr notes that

… there is also a specific kind of narcissism that the social Web engenders.  By grooming and updating your various avatars, you are making sure you remain at the popular kids’ table. One of the more seductive data points in real-time media is what people think of you. The metrics of followers and retweets beget a kind of always-on day trading in the unstable currency of the self.

That is a nicely crafted and incisive metaphor right at the end, and, to borrow a line from David Foster Wallace, it all amounts to getting fed “food pellets of praise.”  I would go so far as to speculate that the issue is neurological, and I’m sure someone out there can refer me to a study that suggests a link between our social media interactions and the release of dopamine in the brain, or something along those lines.  (Here’s a start: “All those tweets, apps, updates may drain brain.”)

In any case, the lines are drawn once more.  The martial metaphors are in a sense already suggested by Carr in his closing lines.  Drawing on the observations of William Powers, he writes:

And therein lies the real problem. When someone you are trying to talk to ends up getting busy on a phone, the most natural response is not to scold, but to emulate. It’s mutually assured distraction.

Mutually assured distraction of course alludes to the Cold War-era doctrine of mutually assured destruction.  There may be more overlap between the two than even Carr intended; each is a form of madness in its own way.  And perhaps it is time for more aggressive tactics.  De Rosa, cited above, also wrote Carr the following:

I’m fine with people stepping aside to check something, but when I’m standing in front of someone and in the middle of my conversation they whip out their phone, I’ll just stop talking to them and walk away. If they’re going to be rude, I’ll be rude right back.

“Once more unto the breach …”

Pattern Recognition: The Genius of our Time?

What counts for genius in our times?  Is it the same as what has always counted for genius?  Or, are there shifting criteria that reflect the priorities and affordances of a particular age?

Mary Carruthers opens The Book of Memory, her study of memory in medieval culture, with a contrast between Thomas Aquinas and Albert Einstein.  Both were regarded as the outstanding intellects of their era; each elicited enthusiastic, wonder-struck praise from his contemporaries.  Carruthers cites a letter by an associate of each man as typical of the praise that each received.  Summing up both, she writes:

Of Einstein: ingenuity, intricate reasoning, originality, imagination, essentially new ideas coupled with the notion that to achieve truth one must err of necessity, deep devotion to and understanding of physics, obstinacy, vital force, single-minded concentration, solitude.  Of Thomas Aquinas: subtlety and brilliance of intellect, original discoveries coupled with deep understanding of Scripture, memory, nothing forgotten and knowledge ever-increasing, special grace, inward recourse, single-minded concentration, intense recollection, solitude.

Carruthers goes on to note how similar the lists of qualities are “in terms of what they needed for their compositional activity (activity of thought), the social isolation required by each individual, and what is perceived to be the remarkable subtlety, originality, and understanding of the product of such reasoning.”  The difference, appropriate to the object of Carruthers’s study, lies in the relationship between memory and the imagination.

Carruthers is eager to remind us that “human beings did not suddenly acquire imagination and intuition with Coleridge, having previously been poor clods.”  But there is a difference in the way these qualities were understood:

The difference is that whereas now geniuses are said to have creative imagination which they express in intricate reasoning and original discovery, in earlier times they were said to have richly retentive memories, which they expressed in intricate reasoning and original discovery.

This latter perspective, the earlier medieval perspective, is not too far removed from the connections between memory and creativity drawn by Jim Holt based on the experiences of the French mathematician Henri Poincaré. We might also note that the changing status of memory within the ecology of genius is owed at least in part to the evolution of technologies that supplement the memory.  Aquinas, working in a culture for which books were still relatively scarce, would have needed a remarkably retentive memory to continue working with the knowledge he acquired through reading.  This becomes less of a priority for post-Gutenberg society.

Mostly, however, Carruthers’ comparison suggested to me the question of what might count for genius in our own time.  We are not nearly so far removed from Einstein as Einstein was from Aquinas, but a good deal has changed nonetheless, enough to make the question at least plausible.  I suspect that, as was the case between Aquinas and Einstein, there will be a good deal of continuity, a kind of baseline of genius perhaps.  But that baseline makes the shifts in emphasis all the more telling.

I don’t have a particular model for contemporary genius in mind, so this is entirely speculative, but I wonder if today, or in coming years, we might not transfer some of the wonder previously elicited by memory and imagination to something like rapid pattern recognition.  I realize there is significant overlap among these categories.  Just as memory and imagination are related in important ways, so pattern recognition is implicit in both and has always been an important ability.  So again, it is a matter of emphasis.  But it seems to me that the ability to rapidly recognize, or even generate, meaningful patterns from an undifferentiated flow of information may be the characteristic of intelligence most suited to our times.

In Aquinas’ day the emphasis was on the memory needed in order to retain knowledge, which was relatively scarce. In Einstein’s time the emphasis was on the ability to jump out of established patterns of thought generated by abundant, but sometimes static, knowledge.  In our day we are overwhelmed by a torrent of easily available and ever-shifting information (we won’t quite say knowledge).  Under these conditions memory loses its pride of place, as perhaps does imagination.  However, the ability to draw together disparate pieces of information, or to connect seemingly unrelated points of data into a meaningful pattern that we might count as knowledge, now becomes a dimension of human intelligence that may inspire comparable awe and admiration from a culture drowning in noise.

Perhaps an analogy to wrap up:  Think of the constellations as instances of pattern recognition.  Lots of data points against the night sky drawn into patterns that are meaningful, useful, and beautiful to human beings.  For Aquinas the stars of knowledge might appear but for a moment, and to recognize the pattern he had to hold in memory their location as he learned and remembered the location of other stars.  For Einstein many more stars had appeared and they remained steadily in his intellectual field of vision; his challenge was seeing new patterns where old ones had been established.

Today we might say that the night sky is not only full to overflowing, but the configuration is constantly shifting.  Our task is not necessarily to remember the location of a few but fading stars, nor is it to see new patterns in a fuller but steady field.  It is to constantly recognize new and possibly temporary patterns in a full and flickering field of information. Those who are able to do this most effectively may garner the kind of respect earlier reserved for the memory of Aquinas and the imagination of Einstein.

For a different, but I think related take on a new form of thinking for our age that draws on the imagery of constellations I encourage you to take a look at this thread at Snark Market.