Internet Pleasures

Brain science is an endlessly fascinating field.  Each day, it seems,  a new neurological study is published revealing a link between this or that activity and this or that region of the brain, or that a certain neurotransmitter is related to the regulation of a certain behavior, and so on.  Yesterday’s encounter with the wonders of neurology came while listening to David Linden, professor of neuroscience at the Johns Hopkins University School of Medicine, being interviewed on NPR.  Linden’s new book, The Compass of Pleasure: How Our Brains Make Fatty Foods, Orgasm, Exercise, Marijuana, Generosity, Vodka, Learning, and Gambling Feel So Good, as the inelegant subtitle more than suggests, explores the role of the brain in the experience of pleasure.

Much of the interview focuses on the neurology of addiction, leading Linden to warn:

“Any one of us could be an addict at any time,” Linden says. “Addiction is not fundamentally a moral failing — it’s not a disease of weak-willed losers. When you look at the biology, the only model of addiction that makes sense is a disease-based model, and the only attitude towards addicts that makes sense is one of compassion.”

After listening to the interview, I was struck by two considerations, both relating to the practical consequences of the science Linden discussed.  First, how oddly Aristotelian all the practical considerations come out sounding: virtues and vices, habits, and moderation.  Second, how little difference this knowledge made for Linden in his own lived experience.  Here is the very last exchange from the interview:

NPR: Since you have studied pleasure and the pleasure circuitry of the brain, has that affected your own relationship with pleasure and the things that you seek or try not to get pleasure from?

Linden: Well, I try deeply not to let it do that.  I certainly — when I’m enjoying a glass of wine I don’t want to be thinking about dopamine levels and, for the most part, fortunately I have been able to avoid doing that. I’m blessed with not having a particularly addictive personality — although I’m a bit of a hedonist — so it hasn’t actually made too much of an impact on my own life.

This is a rather jarring note on which to wrap up the interview.  I’ve ordinarily been one to subscribe to A.E. Housman’s line, “All knowledge is precious whether or not it serves the slightest human use.”  And mostly, I would still want to defend something like that claim.  Yet, there is something peculiar about our coming to know more about the biological and neurological base of human life, purportedly the real stuff on which all human life and action rests, only to find that for an ordinary, healthy adult steeped in this knowledge, it makes not much of a difference at all, and, in fact, that he consciously tries to disassociate his knowledge from his experience.  This bears more reflection, but there was one final thought, more directly related to the usual themes of this blog, that I wanted to note.

Understanding the Internet’s personal and social consequences involves venturing into the territory mapped out by Linden and others in his field.  Pleasure of some sort — whether benign, problematic, or illicit — is involved in our daily interactions with the Internet.  If there is a certain compulsiveness to our online experience, then it is because our internet experience shares in an economy of desire, pleasure, and cycles of stimulation and diminishing return that potentially lead to addictive behavior.

We know that society tolerates certain addictive behaviors more than others, sometimes in seemingly arbitrary fashion.  Internet addiction may carry only a slight social stigma if any at all;  one is tempted rather to conclude that it carries a certain social cachet.  Whether socially acceptable or not, compulsive (or addictive, take your pick) Internet use does appear to have certifiably negative physical consequences in the brain.  A study just published in PLoS ONE suggests that heavy Internet use, particularly online gaming, leads to significant alterations in brain structure with detrimental consequences for cognitive function.  You can read more about the study here, here, and here.

Perhaps not surprisingly, the first of those three articles concludes its report with an appeal to the ancient Roman writer Petronius: “Moderation in all things, including moderation.”  I’m not sure if the writer meant to endorse Petronius’ playful, perhaps satirical tone; more likely it was intended as a straightforward prescription of moderation.

Marx, Freud, and … McLuhan

Just wanted to pass along Jeet Heer’s piece, “Divine Inspiration,” in The Walrus, on Marshall McLuhan, his legacy, and his Catholicism.  Excerpts below.  Click through for the whole piece, which is not long at all.

  • It’s a measure of McLuhan’s ability to recalibrate the intellectual universe that in this debate, [Norman] Mailer — a Charlie Sheen–style roughneck with a history of substance abuse, domestic violence, and public mental breakdowns — comes across as the voice of sobriety and sweet reason. Mailer once observed that McLuhan “had the fastest brain of anyone I have ever met, and I never knew whether what he was saying was profound or garbage.”
  • Indeed, his faith made him a more ambitious and far-reaching thinker. Belonging to a Church that gloried in cathedrals and stained glass windows made him responsive to the visual environment, and liberated him from the textual prison inhabited by most intellectuals of his era. The global reach and ancient lineage of the Church encouraged him to frame his theories as broadly as possible, to encompass the whole of human history and the fate of the planet. The Church had suffered a grievous blow in the Gutenberg era, with the rise of printed Bibles leading to the Protestant Reformation. This perhaps explains McLuhan’s interest in technology as a shaper of history. More deeply, the security he felt in the promise of redemption allowed him to look unflinchingly at trends others were too timid to notice.
  • Like Marx and Freud, he was an intellectual agitator, a conceptual mind expander, the yeast in the dough. After Marx, we can no longer ignore the reality of class difference; after Freud, we can’t pretend that our mental life isn’t saturated with sexual impulses; after McLuhan, we can’t imagine that technology is just a neutral tool. Moreover, like Darwin and Marx, McLuhan is no longer just one man but rather a living and evolving body of thought.

A few months ago I posted a link to a YouTube clip of the Mailer/McLuhan debate here, and here is a piece on Chesterton’s influence on McLuhan.

Incidentally, while pairing McLuhan with the likes of Marx, Darwin, and Freud is in some respects incongruous, what they do have in common is an awareness, sometimes overplayed, of the external forces shaping and influencing human thought  and personality.  What may set McLuhan apart on this score is his unwillingness to slide into determinism:

“There is absolutely no inevitability as long as there is a willingness to contemplate what is happening.”  — Marshall McLuhan, The Medium is the Massage

The Unsettledness at the Heart of our Experience

Unsettled — I’m beginning to think that is a helpful word to capture what it feels like to be alive at present.

[Okay, fair warning, what follows is more speculative and exploratory than what I usually feel comfortable writing on here.  Thoughts and criticism welcome.]

Unsettled is usually used in conversation to mean something like troubled or worried or disconcerted.  More literally it suggests being unanchored, untethered, without grounding, deracinated, adrift, without center.  To view it another way, it is to speak of alienation.

Is it legitimate to speak of alienation in the context of ubiquitous social networks and communication?  Might it be that our connectedness veils a deeper alienation that bubbles up to the surface of consciousness as a pervasive unsettledness?  This is my hypothesis for the moment.

We have known for a long time that as moderns we are no longer connected to place in any significant sense.  Mobility and the autonomy that it purchases come at a cost.  We hardly expect to die in the place we were born.  Most of us will move many times, from city to city, or state to state, or even country to country, before we finally move to Florida or Arizona.  Each move uproots us.  With each move we start over again to some degree.  Many of us are hard pressed to name our home in any traditional sense, so home is simply where we happen to be.  We are, then, spatially or geographically unsettled.

Is there a sense in which we are also temporally unsettled?  Is there an alienation at the heart of our experience of time as well as place?  Here I am thinking again of our mediated experience of the present.  Consider what we might call simply lived experience as a kind of baseline.  Life carried on with a certain immediacy, life lived as a subject interacting with the world beyond our skin.  Now consider what I’m going to call, perhaps problematically*, mediated experience.  This is life lived with a view to its own (re)presentation, life as conscious performance — for the camera, for Facebook, for our blog, etc.  At such times it seems we have inserted a layer of mediation between the present and our experience of it.  If so, might we then speak of a temporal alienation, a temporal unsettledness? Are we not only untethered from place, but also from time?

When we experience life with a view to its future presentation, with what Nathan Jurgenson has aptly called “documentary vision”, we are no longer in the moment as subject.  We are, so to speak, no longer acting in our own lives; we are directing; we have become spectators of our own lives.  In a sense we have objectified ourselves; we are looking at our selves. In my memories of events, I often see only the image of pictures I am in.  The memory is not my own first-person memory; it is an image that stands in for my own lived experience of the event, one in which I am an object and not the subject — perhaps because I was not, properly speaking, experiencing the event as a lived experience.

If there is, in fact, a vague unsettled quality to our experience, perhaps it is because we have managed to uproot ourselves not only from place and the stability it brings, but also from the flow of time, from the lived present, in such a way that there is something like an oddly disjointed quality to our sense of self — as if we were watching a film with a time lag between the image and the sound.

While not exactly what T. S. Eliot had in mind, we might say that this begins to answer his poetic query, “Where is the Life we have lost in living?”

_______________________________________________________________________

* I say “problematically” because at some level, in some sense all experience is mediated even if only by our own use of language in our minds.

“Its beauty puts to shame all our doubts”

Stanisław Masłowski, Moonrise, 1884

“The whole world stops as this stunning dancer rises,” Alessandro said, “and its beauty puts to shame all our doubts.”

As Alessandro, the protagonist in Mark Helprin’s A Soldier of the Great War, prepares to leave for university, his father tells him, “You’ll learn more in your journeys to and from Bologna, if you make them on horseback, than from all your professors combined.”  Alessandro’s narratorial voice adds, “he had almost been right.”

A Soldier of the Great War is the tale of an Italian veteran of the First World War who recounts his life story years later during a long walk with a young man he meets by chance.  It is, among other things, a book about beauty and the kind of attention to the world necessary to recognize it.  Alessandro believes in the redemptive power of beauty and throughout the story he shows himself to be remarkably attuned to the instances of beauty that permeate our experience.  Not only the beauty of a majestic moonrise, but also the beauty in more prosaic scenes.

In her absence, and in the absence of anyone like her, he was drawn to many things that, in being beautiful, were her allies — the blue of the stage-set in the floodlights, the grace of a cat as it turned its small lion-like face to question a human movement, a fire that blazed from within the dark of a blacksmith’s shop or a baker’s and caught his eye as he passed, a single tone arising from a cathedral choir to shock a jaded congregation with its unworldly beauty, the mountaintops as snow was lashed from them by blue winds, the perfect and uncontrived smile of a child.

In his Kenyon College commencement address from 2005, David Foster Wallace, with the kind of earnestness that he was uniquely capable of pulling off, similarly insisted that

The really important kind of freedom involves attention, and awareness, and discipline, and effort, and being able truly to care about other people and to sacrifice for them, over and over, in myriad petty little unsexy ways, every day. That is real freedom. The alternative is unconsciousness, the default-setting, the “rat race” — the constant gnawing sense of having had and lost some infinite thing.

Attention again. Attention to beauty, attention in order to love well.  My worry is that the habits we form in a wired, connected, networked, always online, linked in world combat the sort of attention that Alessandro practices as well as the kind of attention that Wallace advocated. Nothing captures this more than the posture we are all so adept at striking now: head down, focused on a small screen, with the world going by all around us — unnoticed, unattended.

The devices themselves don’t demand this, and there are ways of using them so that they do not become the enemies of attention. Nonetheless, there does seem to be a propensity toward uses and practices that form habits of misdirected and fractured attention.

Helprin and Wallace, each in their own way, push us to look up and take notice; to come up out of the digital waters for breath and for beauty and for love.  To see, to really see the world around us and to get out of our heads long enough to be attentive to others — that is our challenge.

Place and Image, Death and Memory

The sociability of social networking sites such as Facebook is built upon an archive of memories.  Facebook trades in memory in at least two ways.  On the one hand, and perhaps especially for older users, Facebook is a platform that facilitates the search for memories.  Old friends and old flames can be found on Facebook.  Reconnecting with a high school buddy activates a surge of interconnected memories that lead to other long forgotten memories and so on.  On the other hand, and this perhaps especially for younger users, Facebook also renders present experience already a depository of potential memories.  The future past impinges upon the present.  Our experience is conducted as a search for memories yet to be formed, which will be archived on Facebook.  In a sense, then, we hunt for memories past and, paradoxically, for memories future.

This post is the second in a series situating Facebook within the memory theater/arts of memory tradition.  In the first post I set the stage by describing how social networking sites and internet-enabled smartphones have constituted experience as a field of potential memories.  I also suggested that how we store and access our memories makes a difference.  The cultural and personal significance of memory is not a static category of human nature. Memory and its significance evolve over time, often in response to changing technologies.  So the question, then, is something like this:  What difference does it make, personally and culturally, that Facebook has become such a prominent mode of memory?

In order to explore that question, I’ll delve briefly into the history of the art of memory, a set of memory practices with which, I believe, Facebook shares interesting similarities.  But as promised at the end of the last post, we’ll start with a story.

Spatiality, images, and death have long been woven together in the complex history of remembering.  Each appears prominently in the founding myth of what Frances Yates has called the “art of memory” as recounted by Cicero in his De oratore. According to the story, the poet Simonides of Ceos was contracted by Scopas, a Thessalian nobleman, to compose a poem in his honor.  To the nobleman’s chagrin, Simonides devoted half of his oration to the praise of the gods Castor and Pollux.  Feeling himself cheated out of half of the honor, Scopas brusquely paid Simonides only half the agreed upon fee and told him to seek the rest from the twin gods.  Not long afterward that same evening, Simonides was summoned from the banqueting table by news that two young men were calling for him at the door.  Simonides sought the two callers, but found no one.  While he was out of the house, however, the roof caved in killing all of those gathered around the table including Scopas. As Yates puts it, “The invisible callers, Castor and Pollux, had handsomely paid for their share in the panegyric by drawing Simonides away from the banquet just before the crash.”

Cicero

The bodies of the victims were so disfigured by the manner of death that they were rendered unidentifiable even by family and friends.  Simonides, however, found that he was able to recall where each person was seated around the table and in this way he identified each body.  This led Simonides to the realization that place and image were the keys to memory, and in this case, also a means of preserving identity through the calamity of death.  In Cicero’s words,

[Simonides] inferred that persons desiring to train [their memory] must select places and form mental images of the things they wish to remember and store those images in the places, so that the order of the places will preserve the order of the things, and the images of the things will denote the things themselves, and we shall employ the places and images respectively as a wax writing-tablet and the letters written on it.

Cicero is one of three classical sources on the principles of artificial memory that evolved in the ancient world as a component of rhetorical training.  The other two sources are Quintilian’s Institutio oratoria and the anonymous Ad Herennium.  It is through the Ad Herennium, mistakenly attributed to Cicero, that the art of memory migrates into medieval culture, where it is eventually assimilated into the field of ethics.  Cicero’s allusion to the wax writing-tablet, however, reminds us that discussion of memory in the ancient world was not limited to the rhetorical schools.  Memory as a block of wax upon which we make impressions is a metaphor attributed to Socrates in Plato’s Theaetetus, where it appears as a gift of Mnemosyne, the mother of the muses:

Imagine, then, for the sake of argument, that our minds contain a block of wax, which in this or that individual may be larger or smaller, and composed of wax that is comparatively pure or muddy, and harder in some, softer in others, and sometimes of just the right consistency.

Let us call it the gift of the Muses’ mother, Memory, and say that whenever we wish to remember something we see or hear or conceive in our own minds, we hold this wax under the perceptions or ideas and imprint them on it as we might stamp the impressions of a seal ring.  Whatever is so imprinted we remember and know so long as the image remains; whatever is rubbed out or has not succeeded in leaving an impression we have forgotten and do not know.

Plato and Aristotle in Raphael's "School of Athens"

The Platonic understanding of memory was grounded in a metaphysic and epistemology which located the ability to apprehend truth in an act of recollection.  Plato believed that the highest forms of knowledge were not derived from sense experience, but were first apprehended by the soul in a pre-existent state and remain imprinted deep in a person’s memory.  Truth consists in matching the sensible experience of physical reality to the imprint of eternal Forms or Ideas whose images or imprints reside in memory.  Consequently the chief aim of education is the remembering of these Ideas and this aim is principally attained through “dialectical enquiry,” a process, modeled by Plato’s dialogs, by which a student may arrive at a remembering of the Ideas.

At this point, we should notice that the anteriority, or “pastness,” of the knowledge in question is, strictly speaking, incidental.  What is important is the presence of the absent Idea or Form.  It is to evoke the presence of this absence that remembering is deployed.  It is the presence of eternal Ideas that secures the apprehension of truth, goodness, or beauty in the present.  Locating the memory within the span of time past does not bear upon its value which rests in its being possessed as a model against which to measure experience.

Paul Ricoeur, in Memory, History, Forgetting, begins his consideration of the heritage of Greek reflections on memory with the following observation:

Socratic philosophy bequeathed to us two rival and complementary topoi on this subject, one Platonic, the other Aristotelian.  The first, centered on the theme of the eikōn [image], speaks of the present representation of an absent thing; it argues implicitly for enclosing the problematic of memory within that of imagination.  The second, centered on the theme of the representation of a thing formerly perceived, acquired, or learned, argues for including the problematic of the image within that of remembering.

As he goes on to note, from these two framings of the problematic of memory “we can never completely extricate ourselves.”

Reflecting for just a moment on the nature of our own memories it is not difficult to see why this might be the case.  If we remember our mother, for example, we may do so either by contemplating some idealized image of her in our mind’s eye or else by recollecting a moment from our shared past.  In both cases we may be said to be remembering our mother, but the memories differ along the Platonic/Aristotelian divide suggested by Ricoeur.  In the former case I remember her in a way that seeks her presence without reference to time past; in the latter, I remember her in a way that situates her chronologically in the past.

At this point, I’m sure it seems that we’ve wandered a bit from the art of memory and farther still from social networking sites.  There is a method to this madness, but demonstrating that will have to wait for the next post.  Already, I am pushing the limits of acceptable blog post length.

Looking forward to the next post in this series, then, here are the tasks that remain:

  • Exploring memory as an index of desire.
  • Setting the art of memory tradition, and Facebook, within Ricoeur’s schema.
  • Asking what difference all of this makes.