For Your Consideration

In the recent past, I might have been tempted to write a blog post about these. As things stand, I’ll merely point you to them.

“Valley of God” in the Financial Times — Faith in Silicon Valley, with “in” to be taken in both senses. File under religion of technology.

“He stopped going to church. Instead, he went to the computer – “there was this thing called Google” – and started researching theories of evolution to recast his understanding of the world. After the terrorist attacks of September 11 2001, he discovered the potential to organise political activists on the internet. And when he got sick again, he credited the internet with saving his life. He replaced his faith in the Christian God of his childhood with faith in technology.”

“Making the Land Our Own” in American Scientist — A review of American Georgics: Writings on Farming, Culture, and the Land. Opening:

“Forty-eight years ago in his groundbreaking book, The Machine in the Garden: Technology and the Pastoral Ideal in America, historian Leo Marx cited Thomas Jefferson to illuminate the tension between farming and industry that has characterized land use in the United States for more than two centuries.”

“Computer Literacy and the Cybernetic Dream” — Short lecture by Ivan Illich delivered in 1987. Interesting throughout.

“With great pains she has trained her inner Descartes and her inner Pascal to watch each other: to balance mind and body, spirit and flesh, logic and feeling.”

“When I think of the glazing which the screen brings out in the eyes of its user, my entrails rebel when somebody says that screen and eye are ‘facing’ each other.”

Pattern Recognition: The Genius of our Time?

What counts for genius in our times?  Is it the same as what has always counted for genius?  Or, are there shifting criteria that reflect the priorities and affordances of a particular age?

Mary Carruthers opens The Book of Memory, her study of memory in medieval culture, with a contrast between Thomas Aquinas and Albert Einstein.  Both were regarded as the outstanding intellects of their era; each elicited enthusiastic, wonder-struck praise from his contemporaries.  Carruthers cites a letter by an associate of each man as typical of the praise that each received.  Summing up both she writes:

Of Einstein: ingenuity, intricate reasoning, originality, imagination, essentially new ideas coupled with the notion that to achieve truth one must err of necessity, deep devotion to and understanding of physics, obstinacy, vital force, single-minded concentration, solitude.  Of Thomas Aquinas: subtlety and brilliance of intellect, original discoveries coupled with deep understanding of Scripture, memory, nothing forgotten and knowledge ever-increasing, special grace, inward recourse, single-minded concentration, intense recollection, solitude.

Carruthers goes on to note how similar the lists of qualities are “in terms of what they needed for their compositional activity (activity of thought), the social isolation required by each individual, and what is perceived to be the remarkable subtlety, originality, and understanding of the product of such reasoning.”  The difference, appropriate to the object of Carruthers’s study, lies in the relationship between memory and the imagination.

Carruthers is eager to remind us that “human beings did not suddenly acquire imagination and intuition with Coleridge, having previously been poor clods.”  But there is a difference in the way these qualities were understood:

The difference is that whereas now geniuses are said to have creative imagination which they express in intricate reasoning and original discovery, in earlier times they were said to have richly retentive memories, which they expressed in intricate reasoning and original discovery.

This latter perspective, the earlier medieval perspective, is not too far removed from the connections between memory and creativity drawn by Jim Holt based on the experiences of the French mathematician Henri Poincaré. We might also note that the changing status of memory within the ecology of genius is owed at least in part to the evolution of technologies that supplement memory.  Aquinas, working in a culture for which books were still relatively scarce, would have needed a remarkably retentive memory to continue working with the knowledge he acquired through reading.  This becomes less of a priority for post-Gutenberg society.

Mostly, however, Carruthers’ comparison suggested to me the question of what might count for genius in our own time.  We are not nearly so far removed from Einstein as Einstein was from Aquinas, but a good deal has changed nonetheless which makes the question at least plausible.  I suspect that, as was the case between Aquinas and Einstein, there will be a good deal of continuity, a kind of base-line of genius perhaps.  But that baseline makes the shifts in emphasis all the more telling.

I don’t have a particular model for contemporary genius in mind, so this is entirely speculative, but I wonder if today, or in coming years, we might not transfer some of the wonder previously elicited by memory and imagination to something like rapid pattern recognition.  I realize there is significant overlap within these categories.  Just as memory and imagination are related in important ways, so pattern recognition is implicit in both and has always been an important ability.  So again, it is a matter of emphasis.  But it seems to me that the ability to rapidly recognize, or even generate, meaningful patterns from an undifferentiated flow of information may be the characteristic of intelligence most suited to our times.

In Aquinas’ day the emphasis was on the memory needed to retain knowledge that was relatively scarce. In Einstein’s time the emphasis was on the ability to jump out of established patterns of thought generated by abundant, but sometimes static, knowledge.  In our day, we are overwhelmed by a torrent of easily available and ever-shifting information — we won’t quite say knowledge.  Under these conditions memory loses its pride of place, as perhaps does imagination.  However, the ability to draw together disparate pieces of information, or to connect seemingly unrelated points of data into a meaningful pattern that we might count as knowledge, now becomes a dimension of human intelligence that may inspire comparable awe and admiration from a culture drowning in noise.

Perhaps an analogy to wrap up:  Think of the constellations as instances of pattern recognition.  Lots of data points against the night sky drawn into patterns that are meaningful, useful, and beautiful to human beings.  For Aquinas, the stars of knowledge might appear but for a moment, and to recognize the pattern he had to hold their location in memory as he learned and remembered the location of other stars.  For Einstein, many more stars had appeared and they remained steadily in his intellectual field of vision; seeing new patterns where old ones had been established was his challenge.

Today we might say that the night sky is not only full to overflowing, but the configuration is constantly shifting.  Our task is not necessarily to remember the location of a few fading stars, nor is it to see new patterns in a fuller but steady field.  It is to constantly recognize new and possibly temporary patterns in a full and flickering field of information. Those who are able to do this most effectively may garner the kind of respect earlier reserved for the memory of Aquinas and the imagination of Einstein.

For a different, but I think related take on a new form of thinking for our age that draws on the imagery of constellations I encourage you to take a look at this thread at Snark Market.

Technology, Men, and War

We are still learning more about World War II, and much of it is rather depressing.  Recently German researchers published the transcripts of conversations among German soldiers held as POWs.  The soldiers and airmen were secretly recorded by the Allies, and their conversations offer a disturbingly honest glimpse of the war from the soldiers’ perspective.  You can read about the transcripts in a Der Spiegel Online article, “Nazi War Crimes Described by German Soldiers.” Here is an excerpt that caught my attention:

Men love technology, a subject that enables them to quickly find common ground. Many of the conversations revolve around equipment, weapons, calibers and many variations on how the men “whacked,” “picked off” or “took out” other human beings.

The victim is merely the target, to be shot and destroyed — be it a ship, a building, a train or even a cyclist, a pedestrian or a woman pushing a baby carriage. Only in very few cases do the soldiers show remorse over the fate of innocent civilians, while empathy is almost completely absent from their conversations. “The victim in an empathic sense doesn’t appear in the accounts,” the authors conclude.

Be advised the content of the article is at points graphic and disturbing.

Information, A History

This is, as they say, the Information Age.  A few days ago I posted a link to a graphic illustrating the exponential growth in data storage capacity, and before that I referenced the figures frequently cited to demonstrate the explosion of data and its trajectory.  But it may be fair to ask what we mean by “information” or “data” anyway.  James Gleick’s new book, The Information: A History, a Theory, a Flood, offers a historical frame of reference that helps us situate our present ideas about what exactly information is and what can be done with it.

Two recent reviews provide a helpful summary of the main themes.  Nicholas Carr’s review, “Drowning in Beeps,” is at The Daily Beast, and Freeman Dyson’s lengthier review, “How We Know,” is in The New York Review of Books.

At the NYRB blog Gleick also wrote a short, engaging post, “The Information Palace,” examining the history of information via the entry for “information” in the Oxford English Dictionary.  Here’s an excerpt:

It’s in the nineteenth century that we start to glimpse the modern sense of the word as a big category, a general thing encompassing news and facts, messages and announcements. The real turning point comes in 1948 when the Bell Labs mathematician and engineer Claude Shannon, in his landmark paper (later a book) “A Mathematical Theory of Communication,” made information, as the OED explains, “a mathematically defined quantity divorced from any concept of news or meaning …” We measure it in bits. We recognize it in words, sounds, images; we store it on the printed page and on polycarbonate discs engraved by lasers and in our genes. We are aware, more or less, that it defines our world.

Memory, Knowledge, Identity, Technology

Memory, knowledge, identity, technology — these are intimately bound together, and it would be difficult to disentangle one from the others.  What is it to know something if not to remember it?  Beyond the biological facts of my existence, what constitutes my identity more significantly than my memory?  What could I remember without technologies including writing, books, pictures, videos, and more?  Or to put it in a more practical way, what degree of panic might ensue if your Facebook profile were suddenly and irrevocably deleted?  Or if your smart phone were to pass into the hands of another?  Or if you lost your flash drive?  Pushing the clock back just a little, we might have similarly asked about the loss of a diary or photo albums.

The connection among these four, particularly memory and technology, is established as early as the Platonic dialogs, most famously the Phaedrus, in which Socrates criticizes writing for its harmful effects on internal memory and knowledge.  What we store in written texts (or hard drives, or “the cloud”) we do not remember ourselves and thus do not truly know.  The form of this debate recurs throughout the subsequent history of technology all the way to the present debates over the relative merits of computers and the Internet for learning and education.  And in these debates it is almost de rigueur to begin by citing Plato’s Phaedrus, either to reinstate or to dismiss the Socratic critique.  Neil Postman began his book, Technopoly: The Surrender of Culture to Technology, with reference to the Phaedrus, and the Phaedrus appears as well in Nicholas Carr’s now (in)famous Atlantic essay, “Is Google Making Us Stupid?”.

The rejoinder comes quickly though:  Surely Socrates failed to appreciate the gains in knowledge that writing would make possible.  And if I offload information to external memory, doesn’t this simply free my mind for more significant tasks?  There is, of course, an implicit denigration of mere memory in this rebuttal to Socrates.

Yet some tension, some uneasiness, remains.  Otherwise the critique would not keep resurfacing, and it wouldn’t elicit such strong pushback when it did.  In other words, the critique seems to strike a nerve, a sensitive one at that, and when we again consider the intimate interrelationship of memory with our ideas about knowledge and education, and with the formation and maintenance of our identities, this is not surprising at all.  A few posts down I cited Illich’s claim that

What anthropologists distinguish as ‘cultures’ the historian of mental spaces might distinguish as different ‘memories.’  The way to recall, to remember, has a history which is, to some degree, distinct from the history of the substance that is remembered.

I’m wondering now whether it might also be true that a history of personal identity or of individuality could be told through a history of memory and its external supports.  Might we be able to argue that individualism is a function of technologies of memory that allow a person to create and sustain his own history apart from that of the larger society?

In any case, memory has captured my attention and fascinating questions are following hard.  What is memory anyway, what is it to remember a name, a look, a person, a fact, a feeling, where something is, how to do something, or simply to do something?  What do we remember when we remember?  How do we remember?  Why do we remember?  And, of course, how have the answers to all of these questions evolved along with the development of technology from the written word to the external hard drive?

On that last note, I wonder if our choice to call a computer’s capacity to store data “memory” has not in turn affected how we think of our own memory.  I’m especially thinking of a flash drive that we hold in hand and equate with stored memory.  In this device I keep my pictures, my documents, my videos, my memories — memory, or a certain conception of it, is objectified, reified.  Is memory merely mental storage?  Or has this metaphor atrophied our understanding of memory?

Of course, metaphors for memory are nothing new.  I’m beginning to explore some of these ideas with Paul Ricoeur’s Memory, History, Forgetting, and Ricoeur reminds us that in another Platonic dialog, the Theaetetus, Socrates offers the block of wax in our souls as a metaphor for our memory.  And Socrates suggests, “We may look at it, then, as a gift of Mnemosyne [Memory], the mother of the Muses.” I’ll keep you posted as the Muses urge.