Arts of Memory 2.0: The Sherlock Update

I’m a little behind the times, I know, but I just finished watching the second episode in the second season of Sherlock, “The Hounds of Baskerville.” If you’ve watched the episode, you’ll remember a scene in which Sherlock asks to be left alone so that he may enter his “Mind Palace.” (If you’ve not been watching Sherlock, you really ought to.) The “Mind Palace” in question turns out to be what has traditionally been called a memory palace or memory theater. It is the mental construct at the heart of the ancient ars memoriae, or arts of memory.

Longtime (and long-suffering) readers will remember a handful of posts discussing this ancient art and also likening Facebook to something like a materialized memory palace (here, here, and here). To sum up:

“… the basic idea is that one constructs an imagined space in the mind (similar to the work of the Architect in the film Inception, only you’re awake) and then populates the space with images that stand in for certain ideas, people, words, or whatever else you want to remember.  The theory is that we remember images and places better than we do abstract ideas or concepts.”

And here is the story of origins:

“… the founding myth of what Frances Yates has called the “art of memory” as recounted by Cicero in his De oratore. According to the story, the poet Simonides of Ceos was contracted by Scopas, a Thessalian nobleman, to compose a poem in his honor. To the nobleman’s chagrin, Simonides devoted half of his oration to the praise of the gods Castor and Pollux. Feeling himself cheated out of half of the honor, Scopas brusquely paid Simonides only half the agreed-upon fee and told him to seek the rest from the twin gods. Not long afterward that same evening, Simonides was summoned from the banqueting table by news that two young men were calling for him at the door. Simonides sought the two callers, but found no one. While he was out of the house, however, the roof caved in, killing all of those gathered around the table, including Scopas. As Yates puts it, “The invisible callers, Castor and Pollux, had handsomely paid for their share in the panegyric by drawing Simonides away from the banquet just before the crash.”

The bodies of the victims were so disfigured by the manner of death that they were rendered unidentifiable even by family and friends.  Simonides, however, found that he was able to recall where each person was seated around the table and in this way he identified each body.  This led Simonides to the realization that place and image were the keys to memory, and in this case, also a means of preserving identity through the calamity of death.”

The most interesting thing about the manner in which Sherlock presents the memory palace is that it has been conceived on the model of something like a touchscreen interface. You can watch the clip below to see what I mean. In explaining to a third party what Holmes is doing, Watson describes something like a traditional memory palace (not in the video clip). But what we see Holmes doing is quite different. Rather than mentally walking through an architectural space, he swipes at images (visualized for the audience) organized into something like an alphabetic multi-media database.

Surprisingly, though, this stripped-down structure does have a precedent in the medieval practice of the arts of memory. Ivan Illich describes what the 12th-century scholar Hugh of St. Victor required of his pupils:

“… Hugh asks his pupils to acquire an imaginary inner space … and tells them how to proceed in its construction. He asks the pupil to imagine a  sequence of whole numbers, to step on the originating point of their run and let the row reach the horizon. Once these highways are well impressed upon the fantasy of the child, the exercise consists in mentally ‘visiting’ these numbers at random. In his imagination the student is to dart back and forth to each of the spots he has marked by a roman numeral.”

This flat and bare schematic was the foundation for more elaborate, three-dimensional memory palaces to be built later.

The update to the memory theater is certainly not out of keeping with the spirit of the tradition, which always looked to familiar spaces as a model. What more familiar space can we conceive of these days than the architecture of our databases? Thought experiment: Visualize your Facebook page. Can you do it? Can you scroll through it? Can you mentally click and visualize new pages? Can you scroll through your friends? Might you even be able to mentally scroll through your pictures? Well, there you have it; you have a memory palace and you didn’t even know it.
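
For readers who like the database analogy made literal, here is a toy sketch of the method of loci as a keyed store. It is purely my own illustration, not anything drawn from Yates, Illich, or the show, and every name in it is invented for the example: loci serve as keys, vivid images serve as values, and recall is either an ordered walk through the palace or, as in Hugh of St. Victor’s exercise, a dart to a spot chosen at random.

```python
# Purely illustrative sketch of the method of loci as a keyed store.
# All names here (MemoryPalace, place, walk, visit) are invented for this example.
import random


class MemoryPalace:
    def __init__(self):
        # locus -> (image, idea): each place holds an image standing in for an idea
        self.loci = {}

    def place(self, locus, image, idea):
        """Populate a locus with a vivid image that encodes the idea to remember."""
        self.loci[locus] = (image, idea)

    def walk(self):
        """Traverse the loci in order, as in a classical architectural palace."""
        for locus in sorted(self.loci):
            yield locus, self.loci[locus]

    def visit(self):
        """Dart to a random locus, as Hugh of St. Victor asked of his pupils."""
        locus = random.choice(list(self.loci))
        return locus, self.loci[locus]


palace = MemoryPalace()
palace.place(1, "a collapsed banquet roof", "Simonides and the origin of the art")
palace.place(2, "a swiped touchscreen", "Sherlock's updated Mind Palace")
print(list(palace.walk()))   # ordered recall
print(palace.visit())        # random recall
```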

The Simple Life in the Digital Age

America has always been a land of contradictions. At the very least we could say the nation’s history has featured the sometimes creative, sometimes destructive interplay of certain tensions. At least one of these tensions can be traced right back to the earliest European settlers. In New England, Puritans established a “city on a hill,” a community ordered around the realization of a spiritual ideal.  Further south came adventurers, hustlers, and entrepreneurs looking to make their fortune. God and gold, to borrow the title of Walter R. Mead’s account of the Anglo-American contribution to the formation of the modern world, sums it up nicely.  Of course, this is also a rather ancient opposition. But perhaps we could say that never before had these two strands come together in quite the same way to form the double helix of a nation’s DNA.

This tension between spirituality and materialism also overlaps with at least two other tensions that have characterized American culture from its earliest days: The first of these, the tension between communitarianism and individualism, is easy to name. The other, though readily discernible, is a little harder to capture. For now I’m going to label this pair hustle and contemplation and hope that it conveys the dynamic well enough. Think Babbitt and Thoreau.

These pairs simplify a great deal of complexity, and of course they are merely abstractions. In reality, the oppositions are interwoven and mutually dependent. But thus qualified, they nonetheless point to recurring and influential types within American culture. These types, however, have not been balanced and equal. There has always seemed to be a dominant partner in each pairing: materialism, individualism, and hustle. But it would be a mistake to underestimate the influence of spirituality, communitarianism, and contemplation. Perhaps it is best to view them as the counterpoint to the main theme of American culture, together creating the harmony of the whole.

One way of nicely summing up all that is entailed by the counterpoints is to call it the pursuit of the simple life. The phrase sounds quaint, but it worked remarkably well in the hands of historian David E. Shi. In 1985, right in the middle of the decade that was to become synonymous with crass materialism – the same year Madonna released “Material Girl” – Shi published The Simple Life: Plain Living and High Thinking in American Culture. The audacity!

Shi weaves a variegated tapestry of individuals and groups that have advocated the simple life in one form or another throughout American history. Even though he purposely leaves out the Amish, Mennonites, and similar communities, he is still left with a long and diverse list of practitioners. Altogether they represent a wide array of motives animating the quest for the simple life. These include: “a hostility toward luxury and a suspicion of riches, a reverence for nature and a preference for rural over urban ways of life and work, a desire for personal self-reliance through frugality and diligence, a nostalgia for the past and a scepticism toward the claims of modernity, conscientious rather than conspicuous consumption, and an aesthetic taste for the plain and functional.”

This net gathers together Puritans and Quakers, Jeffersonians and Transcendentalists, Agrarians and Hippies, and many more. Perhaps if Shi were to update his work he might include hipsters in the mix. In any case, he would have no shortage of contemporary trends and movements to choose from. None of them dominant, of course, but recognizable and significant counterpoints still.

If I were tasked with updating Shi’s book, for example, I would certainly include a chapter on the critics of the digital age. Not all such critics would fit neatly into the simple life tradition, but I do think a good many would – particularly those who are concerned that the pace and rhythm of digitally augmented life crowd out solitude, silence, and reflection. Think, for example, of the many “slow” movements and advocates (myself included) of digital sabbaths. They would comfortably take their place alongside many of the individuals and movements in Shi’s account who have taken the personal and social consequences of technological advance as their foil. Thoreau is only the most famous example.

Setting present day critics of digital life in the tradition identified by Shi has a few advantages. For one thing, it reminds us that the challenges posed by digital technologies, while having their particularities, are not entirely novel in character. Long before the dawn of the digital age, individuals struggled to find the right balance between their ideals for the good life and the possibilities and demands created by the emergence of new technologies.

Moreover, we may readily and fruitfully apply some of Shi’s conclusions about the simple life tradition to the contemporary criticisms of life in the digital age.

First, the simple life has always been a minority ethic. “Many Americans have not wanted to lead simple lives,” Shi observes, “and not wanting to is the best reason for not doing so.” But, in his view, this does not diminish the salutary leavening effect of the few on the culture at large.

Yet, Shi concedes, “Proponents of the simple life have frequently been overly nostalgic about the quality of life in olden times, narrowly anti-urban in outlook, and too disdainful of the benefits of prosperity and technology.” Better to embrace the wisdom of Lewis Mumford, “one of the sanest of all the simplifiers” in Shi’s estimation. According to Mumford,

“It is not enough to say, as Rousseau once did, that one has only to reverse all current practice to be right … If our new philosophy is well-grounded we shall not merely react against the ‘air-conditioned nightmare’ of our present culture; we shall also carry into the future many elements of quality that this culture actually embraces.”

Sound advice indeed.

If we are tempted to dismiss the critics for their inconsistencies, however, Shi would have us think again: “When sceptics have had their say, the fact remains that there have been many who have demonstrated that enlightened self-restraint can provide a sensible approach to living that can be fruitfully applied in any era.”

But it is important to remember that the simple life at its best, now as ever, requires a person “willing it for themselves.” Impositions of the simple life will not do. In fact, they are often counterproductive and even destructive. That said, I would add, though Shi does not make this point in his conclusion, that the simple life is perhaps best sustained within a community of practice.

Wisely, Shi also observes, “Simplicity is more aesthetic than ascetic in its approach to good living.” Consequently, it is difficult to lay down precise guidelines for the simple life, digital or otherwise. Moderation takes many forms. And so individuals must deliberately order their priorities “so as to distinguish between the necessary and superfluous, useful and wasteful, beautiful and vulgar,” but no one such ordering will be universally applicable.

Finally, Shi’s hopeful reading of the possibilities offered by the pursuit of the simple life remains resonant:

“And for those with the will to believe in the possibility of the simple life and act accordingly, the rewards can be great. Practitioners can gradually wrest control of their own lives from the manipulative demands of the marketplace and the workplace … Properly interpreted, such a modern simple life informed by its historical tradition can be both socially constructive and personally gratifying.”

Nathan Jurgenson has recently noted that criticisms of digital technologies are often built upon false dichotomies and a lack of historical perspective. In this respect they are no different from criticisms advanced by advocates of the simple life, who were tempted by similar errors. Ultimately, this will not do. Our thinking needs to be well-informed and clear-sighted, and the historical context Shi provides certainly moves us toward that end. At the very least, it reminds us that the quest for simplicity in the digital age had its analog precursors, from which we stand to learn a few things.

Robotic Zeitgeist

Robotics and AI are in the air. A sampling:

“Bot with boyish personality wins biggest Turing test”: “Eugene Goostman, a chatbot with the personality of a 13-year-old boy, won the biggest Turing test ever staged, on 23 June, the 100th anniversary of the birth of Alan Turing.”

“Time To Apply The First Law Of Robotics To Our Smartphones”: “We imagined that robots would be designed so that they could never hurt a human being. These robots have no such commitments. These robots hurt us every day.”

“Robot Hand Beats You at Rock, Paper, Scissors 100% Of The Time”: “This robot hand will play a game of rock, paper, scissors with you. Sounds like fun, right? Not so much, because this particular robot wins every. Single. Time.”

Next, two on the same story coming out of Google’s research division:

“I See Cats”: “Google researchers connected 16,000 computer cores together into a huge neural net (like the network of neurons in your brain) and then used a software program to ask what it (the neural net) “saw” in a pool of 1 million pictures downloaded randomly from the internet.”

“The Triumph of Artificial Intelligence! 16,000 Processors Can Identify a Cat in a YouTube Video Sometimes”: “Perhaps this is not precisely what Turing had in mind.”

Much of this talk about AI has coincided with what would have been Turing’s 100th birthday. Most of it has celebrated the brilliant mathematician and lamented the tragic nature of his life and death. This next piece, however, takes a critical look at the course of AI (or better, the ideology of AI) since Turing:

“The Trouble with the Turing Test”: “But these are not our only alternatives; there is a third way, the way of agnosticism, which means accepting the fact that we have not yet achieved artificial intelligence, and have no idea if we ever will.”

And on a slightly different, post-humanist note (via Evan Selinger):

The International Journal of Machine Consciousness has devoted an entire issue to “Mind Uploading.”

There you go; enough to keep you thinking today.

Technology and Perception: That By Which We See Remains Unseen

“Looking along the beam, and looking at the beam are very different experiences.”
— C. S. Lewis

I wrote recently about the manner in which ubiquitous realities tend to fade from view. They are, paradoxically, too pervasive to be noticed. And I suggested (although, of course, this was nothing like an original observation) that it is these very realities, hiding in front of our noses as the cliché has it, which most profoundly shape our experience. I made note of this phenomenon in order to say that very often these ever-present, unnoticed realities are technological realities.

I want to return to these thoughts and, with a little help from Maurice Merleau-Ponty, unpack at least one of the ways in which certain technologies fade from view while simultaneously shaping our perception. In doing so I’ll also draw on a helpful article by Philip Brey, “Technology and Embodiment in Ihde and Merleau-Ponty.”

The purpose of Brey’s article is to supplement and shore up certain categories developed by the philosopher of technology, Don Ihde. To do so, Brey traces certain illustrations used by Ihde back to their source in Merleau-Ponty’s Phenomenology of Perception.

Ihde sought to create a taxonomy categorizing a limited set of ways humans interact with technology, and among his categories was one he termed “embodiment relations.” Ihde defined embodiment relations as those in which a technology mediates an individual’s perception of the world, and he gave a series of examples including glasses, telescopes, hearing aids, and a blind man’s cane. An interesting feature of each of these technologies is that they “withdraw” from view when their use becomes habitual. Ihde listed other examples, however, which Brey finds problematic as exemplars of the category. These include the hammer and a feathered hat.

(The example of the feathered hat is drawn from Merleau-Ponty. As a lady wearing a feathered hat makes her way about, she interacts with her surroundings in light of this feature, which amounts to an extension of her body.)

In both cases, Brey believes the example is less about perception (although it can be involved) and more about action. Consequently, Brey offers some further distinctions to better get at the kinds of relations Ihde was attempting to classify. He begins by dividing embodiment relations into relations that mediate perception and those that mediate motor skills.

Brey goes on to make further distinctions among the kinds of embodiment relations that mediate motor skills. Some of these involve navigational skills and tend to be of the sort that “enlarge” one’s body. The feathered hat fits into this category, as do other items, such as a worn backpack, that require users to incorporate the object into their body schema in such a way that they pre-consciously navigate as if the object were part of their body. Then there are embodiment relations that mediate motor skills in interaction with the environment. The hammer fits into this category. These objects become part of our body schema in order to extend our action in the world.

These clarifications and distinctions are helpful, and Brey is right to distinguish between embodiment relations geared toward perception and those geared toward action. But he is also right to point out that even those tools that are geared toward action involve perception to some degree. While a hammer is used primarily to mediate action, it also mediates perception. For example, a hammer strike reveals something about the surface struck.

Yet Brey believes that in this class of embodiment relations the perceptual function is “subordinate” to the motor function. This is probably a sound conclusion, but it does not seem to take into account a more subtle way in which perception comes into play. Elsewhere, I’ve written about the manner in which technology in-hand affects our perception of the world not only by offering sensory feedback, but also by shaping our interpretive acts of perception, our seeing-as. I won’t rehash that argument here; instead I want to focus on the withdrawing character of technologies through which we perceive.

The sorts of tools that mediate perception ordinarily do so while they themselves recede from view. Summarizing Ihde’s discussion of embodiment relations, Brey offers the following description of the phenomenon:

“In embodiment relations, the embodied technology does not, or hardly, become itself an object of perception. Rather, it ‘withdraws’ and serves as a (partially) transparent means through which one perceives one’s environment, thus engendering a partial symbiosis of oneself and it.”

Consider the eye as a paradigmatic example. We see all things through it, but we never see it (unless, of course, in a mirror). This is a function of what Michael Polanyi has called the “from-to” character of perception. Our intentionality is directed from our body outward to the world. “The bodily processes hide,” Mark Johnson explains, “in order to make possible our fluid, automatic experiencing of the world.”

The technologies that we take into an embodied relation do not ordinarily achieve quite so complete a withdrawal, but they do ordinarily fade from our awareness as objects in themselves. Contact lenses, for example, or the blind man’s cane. In fact, almost any tool of which we become expert users tends to withdraw as an object in its own right in order to facilitate our perception or our action.

In a short essay titled “Meditation in a Toolshed,” C. S. Lewis offers an excellent illustration of this dynamic. Granted, he was offering an illustration of a different phenomenon, but I think it fits here as well. Lewis described entering a dark toolshed and seeing before him a shaft of light coming in through a crack above the door. At that moment Lewis “was seeing the beam, not seeing things by it.” But then he stepped into the beam:

“Instantly the whole previous picture vanished. I saw no toolshed, and (above all) no beam. Instead I saw, framed in the irregular cranny at the top of the door, green leaves moving on the branches of a tree outside and beyond that, 90 odd million miles away, the sun. Looking along the beam, and looking at the beam are very different experiences.”

Notice his emphasis on the manner in which the beam itself disappears from view when one sees through it. That through which we perceive ceases to be an object that we perceive. Returning to where we began, then, we might say that one manner in which a technology becomes too pervasive to be noticed is by becoming that by which we perceive the world or some aspect of it.

It is easiest to recognize the dynamic at work in objects that are specifically designed to enhance our physical senses — eyeglasses, for example. But the principle may be expanded further (even if the mechanics shift) to include other less obvious ways we perceive through technology. The whole of Marshall McLuhan’s work, in fact, could be seen as an attempt to understand how all technology is media technology that alters perception. In other words, all technology mediates reality in some fashion, but the mediating function withdraws from view because it is that through which we perceive the content. It is the beam of light into which we step to perceive some other thing and, as with the beam, it remains unseen even while it enables and shapes our seeing.


Incommensurable Losses

Every so often I pop in my old audio tapes of Jacques Barzun’s From Dawn to Decadence and, at the risk of sounding hopelessly nostalgic, I’m left thinking that scholarship is not what it used to be. It reminds me of a few lines from Edward Said that I first came across via Alan Jacobs some time ago, although I no longer remember where exactly. Said, thinking of the humanistic scholars of the mid-twentieth century, wrote:

“This is not to say that we should return to traditional philological and scholarly approaches to literature. No one is really educated to do that honestly anymore, for if you use Erich Auerbach and Leo Spitzer as your models you had better be familiar with eight or nine languages and most of the literatures written in them, as well as archival, editorial, semantic, and stylistic skills that disappeared in Europe at least two generations ago.”

In any case, this meandering all leads to a passage from Barzun with which I will leave you. It is a moving estimation of the losses occasioned by the First World War, a war the consequences of which I tend to think we underestimate. Here is Barzun:

“Varying estimates have been made of the losses that must be credited to the great illusion. Some say 10 million lives were snuffed out in the 52 months and double that number wounded. Others propose higher or lower figures. The exercise is pointless, because loss is a far wider category than death alone. The maimed, the tubercular, the incurables, the shell-shocked, the sorrowing, the driven mad, the suicides, the broken spirits, the destroyed careers, the budding geniuses plowed under, the missing births were losses, and they are incommensurable … One cannot pour all human and material resources into a fiery cauldron year after year and expect to resume normal life at the end of the prodigal enterprise.”

And so, to varying degrees, it must be with any war.