Don’t Worry, Be Idle

Ours is an age of anxiety, or at least it seems that way to many.  Of course, this is far from an original observation.  Among the several works taking this phrase as a title is W. H. Auden’s post-war poem, The Age of Anxiety, first published as the world emerged from the shadows of war into the disconcerting light of the nuclear age.  Since then, anxiety has settled in as a permanent feature of the American cultural and psychic landscape.

In a recent Slate interview, Taylor Clark, author of Nerve: Poise Under Pressure, Serenity Under Stress, and the Brave New Science of Fear and Cool, talked about anxiety in the United States:

Is the United States more prone to higher levels of anxiety than other nations?

Put simply, we are. Perhaps the most puzzling statistics are the ones that reveal that we’re significantly more anxious than countries in the developing world, many of which report only a fraction of the diagnosable cases of anxiety that we do. One of the reasons for this is that the people in many of these third-world nations are more accustomed to dealing with uncertainty and unpredictability. I talk about this a fair amount in the book, but lack of control is really the archenemy of anxiety. It’s its biggest trigger.

That explains the disparity in anxiety levels between the United States and the developing world, but why are we more anxious than, say, your average European nation?

It’s hard to pinpoint an answer, but I think Americans have become extremely vulnerable to the pressures of the 21st century. For the past 50 years, we’ve been getting progressively more anxious in good economic times and bad, so we can’t even blame it on the recession.

Clark goes on to suggest three factors contributing to our “deteriorating” psychic state:

  1. “The first is a simple matter of social disconnection. As we spend more time with our electronic devices than we do with our neighbors, we lose our physical sense of community. Social isolation flies in the face of our evolutionary history.”
  2. “The second major cause is the information overload that we’re experiencing with the Internet and the 24-hour media cycle. We’re all aware of it, but I’m not sure we realize how big an impact it’s having on our brains.”
  3. “The third explanation can be attributed to what one psychologist refers to as a culture of ‘feel goodism’ — the idea that we shouldn’t ever have to be upset and that all our negative emotions can be neutralized with a pill. This to me feels like a distinctly American phenomenon.”

Clark seems to lay a good bit of the blame, if we may call it blame, for our anxiety on technology that paradoxically disconnects us and connects us too much.  This diagnosis will ring true and self-evident to some, but I suspect others will take issue, particularly with the exclusion of other significant social and cultural factors that predate the advent of digital media.  After all, social disconnection has a long history.  We may be talking much less on the phone, but Freud noted long ago that the wonder of being able to hear a loved one’s voice over the telephone was a slight salve for the condition that necessitated it to begin with, that is, the loved one’s absence.

But regardless of the causes, anxious we are, and quite clearly we are interested in doing something about it, which quite often amounts to popping a pill.  While that may sometimes be a necessary and helpful remedy, I’d also like to pass along a prescription written by Sven Birkerts in his recent essay, “The Mother of Possibilities”: Idleness.

While admitting that Idleness, as he is envisioning it, is a difficult concept to pin down, Birkerts suggests that,

Idleness is what supervenes on those too few occasions when we allow our pace to slacken and merge with the rhythms of the natural day, when we manage to thwart the impulse to plan forward to the next thing and instead look—idly, with nascent curiosity—at what is immediately in front of us …

Birkerts leisurely traces the tradition of idleness from the Greeks to the much too harried present via pastoral and lyric poetry, Milton, Montaigne, the Romantics, Baudelaire, Proust, Benjamin, and Camus, to name a few.  It is a pleasant jaunt. Along the way we find Montaigne claiming, “It seemed to me that I could do my mind no greater favor than to allow it, in idleness, to entertain itself.”  And according to Birkerts,

Through the figure of the flâneur—via the writing of critic and philosopher Walter Benjamin—the idle state was given a platform, elevated from a species of indolence to something more like a cognitive stance, an ethos. Benjamin’s idea is basically that the true picture of things—certainly of urban experience—is perhaps best gathered from diverse, often seemingly tangential, perceptions, and that the dutiful, linear-thinking rationalist is less able to fathom the immensely complex reality around him than the untethered flâneur, who may very well take it by ambush.

Yet Idleness has its enemies, not least of which is a bad reputation, to which Birkerts addresses himself early on, but also the pace of modernity and, yes, alas, distraction, which is not to be confused with Idleness proper.

The spaces and the physical movements of work and play are often nearly identical now, and our commerce with the world, our work life, is far more sedentary and cognitive than ever before. Purposeful doing is now shadowed at every step with the possibilities of distraction. How do we conceive of idleness in this new context? Are we indulging it every time we switch from a work-related document to a quick perusal of emails, or to surf through a few favorite shopping sites? Does distraction eked out in the immediate space of duty count—or is it just a sop thrown to the tyrant stealing most of our good hours?

We are a task-oriented people, equipped with lists and planners, goals and objectives, action points and plans.  Productivity is our mantra.  Distraction pours into our work and our plans, but it has not introduced Idleness; it has rather elided work and play, labor and leisure, by their convergence upon the devices that are now instruments of both.

But here again we feel the anxiety rising and with it, perversely, the guilt.  Birkerts’ remedy:

The mind alert but not shunted along a set track, the impulses not pegged to any productivity. The motionless bobber, the hand trailing in the water, the shifting shapes of the clouds overhead. Idleness is the mother of possibility, which is as much as necessity the mother of inventiveness … “In wildness is the preservation of the world,” wrote Thoreau. In idleness, the corollary maxim might run, is the salvaging of the inner life.

The (Un)Naturalness of Privacy

Andrew Keen is not an Internet enthusiast, at least not since the emergence of Web 2.0.  That much has been clear since his 2007 book The Cult of the Amateur: How Today’s Internet is Killing Our Culture and his 2006 essay equating Web 2.0 with Marxism in The Weekly Standard, a publication in which such an equation is less than flattering.  More recently, Keen has taken on all things social in a Wired UK article, “Your Life Torn Open: Sharing is a Trap.”

Now, I’m all for a good critique and lampoon of the excesses of social media, really, but Keen may have allowed his disdain to get the better of him and veered into unwarranted, misanthropic excess.  From the closing paragraph:

Today’s digital social network is a trap. Today’s cult of the social, peddled by an unholy alliance of Silicon Valley entrepreneurs and communitarian idealists, is rooted in a misunderstanding of the human condition. The truth is that we aren’t naturally social beings. Instead, as Vermeer reminds us in The Woman in Blue, human happiness is really about being left alone. … What if the digital revolution, because of its disregard for the right of individual privacy, becomes a new dark ages? And what if all that is left of individual privacy by the end of the 21st century exists in museums alongside Vermeer’s Woman in Blue? Then what?

“Human happiness is really about being left alone” and “The truth is that we aren’t naturally social beings”? Striking statements, and almost certainly wrong. It seems rather that the vast majority of human beings long for, and sometimes despair of never finding, meaningful and enduring relationships with other human beings. That human flourishing is conditioned on the right balance of the private and the social, individuality and relationship, seems closer to the mark. And while I suppose one could be raised by wolves in legends and stories, I’d like to know how infants would survive biologically in isolation from other human beings.  On this count, better stick with Aristotle.  The family, the clan, the tribe, the city — these are closer to the ‘natural’ units of human existence.

The most ironic aspect of these claims is Keen’s use of Vermeer’s “Woman in Blue” or, more precisely, “Woman in Blue Reading a Letter,” to illustrate them.  That she is reading a letter is germane to the point at issue here, which is the naturalness of privacy.  Contrary to Keen’s assertion of the natural primacy of privacy, it is closer to the truth to correlate privacy with literacy, particularly silent reading (which has not always been the norm), and the advent of printing.  Changing socio-economic conditions also factor into the rise of modern notions of privacy and the individual, notions formalized by Locke and Hobbes, who enshrined the atomized individual as the foundation of society, notably with founding myths which are entirely a-historical.

In The Vineyard of the Text, Ivan Illich, citing George Steiner, suggests this mutual complicity of reading and privacy:

According to Steiner, to belong to ‘the age of the book’ meant to own the means of reading.  The book was a domestic object; it was accessible at will for re-reading.  The age presupposed private space and the recognition of the right to periods of silence, as well as the existence of echo-chambers such as journals, academies, or coffee circles.

Likewise, Walter Ong, drawing on Eric Havelock, explains that

By separating the knower from the known, writing makes possible increasingly articulate introspectivity, opening the psyche as never before not only to the external objective world quite distinct from itself but also to the interior self against whom the objective world is set.

Privacy emerges from the dynamics of literacy.  The more widespread literacy becomes, as for example with the printing press, the more widespread and normalized the modern sense of privacy becomes.  What Keen is bemoaning is the collapse of the experience of privacy wrought by print culture.  I do think there is something to mourn there, but to speak of its “naturalness” misconstrues the situation and seems to beget a rather sociopathic view of human nature.

Finally, it is also telling that Vermeer’s woman is reading a letter.  Letters, after all, are a social genre; letter writing is a form of social life.  To be sure, it is a very different form of social life than what the social web offers, but it is social.  And were we not social beings we would not, as Auden puts it, “count some days and long for certain letters.” The Woman in Blue Reading a Letter reminds us that privacy is bound to literate culture and human beings are bound to one another.

_____________________________________________________________

Updates:  To reinforce that it is a balance we are after, also consider this article in yesterday’s Boston Globe, “The Power of Lonely.” Excerpt:

But an emerging body of research is suggesting that spending time alone, if done right, can be good for us — that certain tasks and thought processes are best carried out without anyone else around, and that even the most socially motivated among us should regularly be taking time to ourselves if we want to have fully developed personalities, and be capable of focus and creative thinking.

My attention has also been drawn to an upcoming documentary for which Andrew Keen was interviewed.  PressPausePlay will be premiering at South by Southwest and addresses the issues of creativity and art in the digital world.  Here’s a preview featuring Andrew Keen:

Clever Blog Post Title

Reading about Jon Stewart’s recent D.C. rally, I noticed a picture of a participant wearing a “V for Vendetta” mask and carrying a sign.  It wasn’t the mask, but the words on the sign that caught my eye.  The sign read, “ANGRY SIGN.”

I wouldn’t have paid much attention to it had I not recently crossed paths with a guy wearing a T-shirt that read:  “Words on a T-shirt.”  Both of these then reminded me of the very entertaining “Academy Award Winning Trailer” — a self-referential spoof of movie trailers embedded at the end of this post.  The “trailer” opens with a character saying, “A toast — establishing me as the wealthy, successful protagonist.”  And on it goes in that vein.  The comments section on the clip’s YouTube page likewise yielded a self-referential parody of typical YouTube comments.  Example:  “Quoting what I think is the funniest part of the video and adding ‘LOL’ at the end of the comment.”

There is a long tradition of self-reference in the art world, Magritte’s pipe being a classic example, but now it seems the self-referentiality that was a mark of the avant-garde is an established feature of popular culture.  Call it vulgar self-referentiality if you like.  Our sophistication, in fact, seems to be measured by the degree of our reflexivity and self-awareness — “I know, that I know, that I know, etc.” — which elides with a mood of ironic detachment.  Earnestness in this environment becomes a vice.  We are in a sense “mediating” or re-presenting our own selves through this layer of reflexivity, and we’re pretty sure everyone else is too.

On the surface, perhaps, there is a certain affinity with the Socratic injunction, “Know thyself.”  But this is meta-Socrates, “Knowingly know thyself.”  At issue for some is whether there is a subject there to know that would localize and contain the knowing, or whether in the absence of a subject all there is to know is a process of knowing, knowing itself.

Leaning on Thomas de Zengotita’s Mediated: How the Media Shapes Your World and the Way You Live in It and Hannah Arendt’s The Human Condition, here’s a grossly oversimplified thumbnail genealogy of our heightened self-consciousness.  On the heels of the Copernican revolution, we lost confidence in the ability of our senses to apprehend the world truthfully (my eyes tell me the world is stationary and the sun moves, turns out my eyes lied), but following Descartes we sought certainty in our inner subjective processes — “I think, therefore I am” and all that.  Philosophically then our attention turned to our own thinking — from the object out there, to the subject in here.

Sociologically, modernity has been marked by an erosion of cultural givens; very little attains a taken-for-granted status relative to the pre-modern and early modern world.  In contrast to the medieval peasant, consider how much of your life is not pre-determined by cultural necessity.  Modernity then is marked by the expansion of choice, and choice ultimately foregrounds the act of choosing, which yields an attendant lack of certainty in the choice; I am always aware that I could have chosen otherwise.  In other words, identity grounded in the objective (reified) structures of traditional society is superseded by an identity which is the aggregate of the choices we make — choice stands in for fate, nature, providence, and all the rest.  Eventually an awareness of this process throws even the notion of the self into question; I could have chosen otherwise, thus I could be otherwise.  The self, as the story goes, is decentered.  And whether or not that is really the case, it certainly seems to be what we feel to be the case.

So the self-referentialism that marked the avant-garde and the theories of social constructivism that were controversial a generation ago are by now old news.  They are widely accepted by most people under 35, who picked it all up not by visiting the art houses or reading Derrida and company, but by living with and through the material conditions of their society.

First the camera, then the sound recorder, the video recorder, and finally the digitization and consequent democratization of the means of production and distribution (cheap cameras/YouTube, etc.) created a society in which we all know that we know that we are enacting multiple roles and that no action yields a window to the self, if by self we mean some essential, unchanging core of identity.  Foucault’s surveillance society has produced a generation of method actors.  In de Zengotita’s words,

Whatever the particulars, to the extent that you are mediated, your personality becomes an extensive and adaptable tool kit of postures . . . You become an elaborate apparatus of evolving shtick that you deploy improvisationally as circumstances warrant.  Because it is all so habitual, and because you are so busy, you can almost forget the underlying reflexivity.

Almost.

There seems to be a tragically hip quality to this kind of hyper-reflexivity, but it is also like looking into a mirror image of a mirror — we get mired in an infinite regress of our own consciousness.  We risk self-referential paralysis, individually and culturally, and we experience a perpetual crisis of identity.  My sense is that a good deal of our cynicism and apathy is also tied into these dynamics.  Not sure where we go from here, but this seems to be where we are.

Or maybe this is all just me.

Life Amid Ruins

In Status Anxiety — his part philosophically-minded self-help book, part social history — Alain de Botton describes two fashions that were popular in the art world during the 17th and 18th centuries, respectively.  The first, vanitas art, took its name from the biblical book of Ecclesiastes, in which it is written, “Vanity of vanities, all is vanity.”  Vanitas art, which flourished especially in the Netherlands and also in Paris, was, as the biblical citation implies, concerned with life’s fleeting nature.  As de Botton describes them,

Each still-life featured a table or sideboard on which was arranged a contrasting muddle of objects.  There might be flowers, coins, a guitar or mandolin, chess pieces, a book of verse, a laurel wreath or wine bottle: symbols of frivolity and temporal glory.  And somewhere among these would be set the two great symbols of death and the brevity of life:  a skull and an hourglass.

A bit morbid we might think, but as de Botton explains,

The purpose of such works was not to send their viewers into a depression over the vanity of all things; rather, it was to embolden them to find fault with particular aspects of their own experience, while at the same time attending more closely to the virtues of love, goodness, sincerity, humility and kindness.

Okay, still a bit morbid you might be thinking, but fascinating nonetheless.  Here is the first of two examples provided in Status Anxiety:

Philippe de Champaigne, circa 1671

Here is the second example:

Simon Renard de Saint-Andre, circa 1662

And here are a few others from among the numerous examples one can find online:

Edwart Collier, 1640
Pieter Boel, 1663
Adam Bernaert, circa 1665
Edwart Collier, 1690

Less morbid and more nostalgic, the second art fashion de Botton examines is the 18th- and 19th-century fascination with ruins.  This fascination was no doubt inspired in part by the unearthing of Pompeii’s sister city, Herculaneum, in 1738.  The most intriguing subset of these paintings of ancient ruins, however, comprised those that imagined in ruins not the past, but the future.  “A number of artists,” according to de Botton, “have similarly delighted in depicting their own civilization in a tattered future form, as a warning to, and reprisal against, the pompous guardians of the age.”  Consider these the antecedents of the classic Hollywood trope in which some famous city and its monuments lie in ruins — think Planet of the Apes and the Statue of Liberty.

Status Anxiety provides three examples of these future ruins.  The first depicts the Louvre in ruins:

Hubert Robert, Imaginary View of the Grande Galerie of the Louvre in Ruins, 1796

The second depicts the ruins of the Bank of England:

Joseph Gandy, View of the Rotunda of the Bank of England in Ruins, 1798

And the third, from a later period, depicts the city of London in ruins being sketched by a man from New Zealand, “the country that in Doré’s day symbolized the future,” in much the same way that Englishmen on their Grand Tours would sketch the ruins of Athens or Rome.

Gustave Doré, The New Zealander, 1871

Finally, both of these art fashions suggested to my mind Nicolas Poussin’s Et in Arcadia ego from 1637-1638:

Nicolas Poussin, Et in Arcadia ego, 1637-38

Here, shepherds stumble upon an ancient tomb on which they read the inscription, Et in Arcadia ego.  There has been some debate about the precise way the phrase should be taken.  It may be read as the voice of death personified saying “even in Arcadia I exist,” or it may mean “the person buried in this tomb lived in Arcadia.”  In either case the moral is clear.  Death comes for the living.  It is a memento mori, a reminder of death (note the appearance of that phrase in the last piece of vanitas art above).

Admittedly, these are not the most uplifting of reflections.  However, de Botton’s point, and the point of the artists who painted these works, strikes me as sound:  we make a better go of the present if we live with the kind of perspective engendered by these works of art.  Our tendency to ignore our mortality and our refusal to acknowledge the limitations of a single human life may be animating much of our discontent and alienation.  Perhaps.  Certainly there is some wisdom here we may tap into.  This is pure conjecture, of course, but I wonder how many, having contemplated Gandy’s painting, would have found the phrase “too big to fail” plausible.  Might we not, with a renewed sense of our mortality, reorder some of our priorities, bringing into sharper focus the more meaningful elements of life?

It is also interesting to consider that not only do we have few contemporary equivalents of the kind of art work we’ve considered, but neither do we have any actual ruins in our midst.  America seems uniquely suited to have been a country without a sense of the past.  Not only are we an infant society by historical standards, but even the ancient inhabitants of our lands, unlike those further south, left no monumental architecture, no tokens of antiquity.  Those of us who live in suburbia may go days without casting our eyes on anything older than twenty years.  We have been a forward-looking society whose symbolic currency has not been — could not have been — ruins of the past, but rather the Frontier with its accent on the future and what may be.

I would not call into question the whole of this cultural sensibility, but perhaps we could have used just a few ruins.