Writing Cannot Be Taught, But It Can Be Learned

Serendipitously, I encountered two book reviews yesterday containing very sound advice on the art of writing.  I expected as much out of the first review in which Joseph Epstein, whose prose I’ve long admired, politely begs to differ with Stanley Fish on how one ought to go about the process of writing well.  The second piece, a review of two books on higher education by Louis Menand, himself a more than competent stylist, offered its comments on writing incidentally to its main point.  I took both to be very near the truth of the matter.

Epstein opens his review of Stanley Fish’s How to Write a Sentence with the following observations:

After thirty years of teaching a university course in something called advanced prose style, my accumulated wisdom on the subject, inspissated into a single thought, is that writing cannot be taught, though it can be learned—and that, friends, is the sound of one hand clapping. A. J. Liebling offers a complementary view, more concise and stripped of paradox, which runs: “The only way to write is well, and how you do it is your own damn business.”

Learning to write sound, interesting, sometimes elegant prose is the work of a lifetime. The only way I know to do it is to read a vast deal of the best writing available, prose and poetry, with keen attention, and find a way to make use of this reading in one’s own writing. The first step is to become a slow reader. No good writer is a fast reader, at least not of work with the standing of literature. Writers perforce read differently from everyone else. Most people ask three questions of what they read: (1) What is being said? (2) Does it interest me? (3) Is it well constructed? Writers also ask these questions, but two others along with them: (4) How did the author achieve the effects he has? And (5) What can I steal, properly camouflaged of course, from the best of what I am reading for my own writing? This can slow things down a good bit.

A bit further into the review he adds:

First day of class I used to tell students that I could not teach them to be observant, to love language, to acquire a sense of drama, to be critical of their own work, or almost anything else of significance that comprises the dear little demanding art of putting proper words in their proper places. I didn’t bring it up, lest I discourage them completely, but I certainly could not help them to gain either character or an interesting point of view. All I could do, really, was point out their mistakes, and, as someone who had read much more than they, show them several possibilities about deploying words into sentences, and sentences into paragraphs, of which they might not have been aware. Hence the Zenish koan with which I began: writing cannot be taught, but it can be learned.

In “Live and Learn: Why We Have College”, Louis Menand reviews two recent books on the state of higher education and along the way offers his own well-considered thoughts on the subject.  One of the two books, In the Basement of the Ivory Tower: Confessions of an Accidental Academic (which began its life as an article in The Atlantic), contained, in Menand’s estimation, sound thoughts on writing:

When he is not taking on trends in modern thought, Professor X is shrewd about the reasons it’s hard to teach underprepared students how to write. “I have come to think,” he says, “that the two most crucial ingredients in the mysterious mix that makes a good writer may be (1) having read enough throughout a lifetime to have internalized the rhythms of the written word, and (2) refining the ability to mimic those rhythms.” This makes sense. If you read a lot of sentences, then you start to think in sentences, and if you think in sentences, then you can write sentences, because you know what a sentence sounds like. Someone who has reached the age of eighteen or twenty and has never been a reader is not going to become a writer in fifteen weeks.

For Menand and Epstein, the secret, if there is one, of good writing appears to be attentive reading, and a lot of it.

Teaching What It Feels Like To Be Alive

… it’s the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.

That is how David Foster Wallace, in Although of Course You End Up Becoming Yourself, contrasted traditional literature, with its coherent narrative and satisfying sense of closure, with experimental or avant-garde literature, which typically exhibits neither.  I’ve been thinking about that contrast since I posted the passage a few weeks ago.  Writing that is experienced as a relief from what it feels like to be alive and writing that reflects what it feels like to be alive — I’m wondering if that same distinction could also be usefully applied to teaching.  Can teaching, in the same way, reflect what it feels like to be alive, rather than be a relief from it?

Literature and teaching are both components of the ongoing, ramshackle project we call our education.  When I am most hopeful about what a teacher can do, I see it as not unlike what a very good book might also accomplish.  We might describe it as the opening up of new and multiple vistas into both the world and ourselves.  A good book offers a challenging engagement with reality, rather than the mere escapism that some literature proffers.  To borrow a line from Bridge to Terabithia, good teaching, likewise, pushes students to see beyond their own secret countries, to see and to feel what lies beyond and within.  Of course, on my less hopeful (read, more curmudgeonly) days, I feel that convincing students that a book can work in that way is itself the necessary task.

What, then, might it mean to teach so as to reflect what it feels like to be alive?

For one thing, it involves feeling; it is affective.  It reaches beyond the transfer of information to the mind, and seeks to move the heart as well.  This matters principally because while we go about the work and play of living we tend to lead with our hearts and not with our minds (for better and/or for worse).

But in order to move the heart, the heart must be susceptible to being moved.  The numbness that threatens always to settle on us, as wave upon wave of stimulation washes over us, gently massaging us into a state of mildly amused indifference to reality, must be overcome.  This numbness itself might be self-protective, but, while self-knowledge has a distinguished place in the history of education, self-preservation seems a less noble aspiration.  Teaching that leads to feeling must find a way to break through this self-protective numbness.  Of course, that numbness is itself part of what it feels like to be alive, but it is the part that must first be encountered, acknowledged, and transcended in order to feel all the rest.

Like the artist in Wallace’s view, the teacher has the license and the responsibility

to sit, clench their fists, and make themselves be excruciatingly aware of the stuff that we’re mostly aware of only on a certain level.  And that if the writer [or teacher] does his job right, what he basically does is remind the reader [or student] of how smart the reader [or student] is.

The teacher, like the writer, must themselves be sensitive to what it feels like to be alive so as to teach to that feeling and help students understand it, understand themselves.  Perhaps it is precisely here that teaching has failed students, in the inability to enter into the student’s world so as to speak meaningfully into it.

The trick, of course, is also to do so without falling into the equivalent of what Wallace calls “shitty avant garde,” literature that tries too hard and ignores the reader in its effort to be profound.  Trying too hard to achieve this effect without authenticity is fatal.  Likewise with teaching.  Watching Lean on Me or Dead Poets Society one too many times will likely do more harm than good.

Good writing and good teaching are both grounded in a deep respect for the reader and the student, not in an inordinate desire to be inspiring.  This is what finally struck me most forcefully in Wallace’s comments.  His work, his estimation of what literature could do, flowed from a remarkable confidence in the reader.  Perhaps then this is also where good teaching must begin, with an equal respect for and confidence in the student.

Pattern Recognition: The Genius of our Time?

What counts for genius in our times?  Is it the same as what has always counted for genius?  Or, are there shifting criteria that reflect the priorities and affordances of a particular age?

Mary Carruthers opens The Book of Memory, her study of memory in medieval culture, with a contrast between Thomas Aquinas and Albert Einstein.  Both were regarded as the outstanding intellects of their era; each elicited enthusiastic, wonder-struck praise from his contemporaries.  Carruthers cites a letter by an associate of each man as typical of the praise that each received.  Summing up both she writes:

Of Einstein: ingenuity, intricate reasoning, originality, imagination, essentially new ideas coupled with the notion that to achieve truth one must err of necessity, deep devotion to and understanding of physics, obstinacy, vital force, single-minded concentration, solitude.  Of Thomas Aquinas: subtlety and brilliance of intellect, original discoveries coupled with deep understanding of Scripture, memory, nothing forgotten and knowledge ever-increasing, special grace, inward recourse, single-minded concentration, intense recollection, solitude.

Carruthers goes on to note how similar the lists of qualities are “in terms of what they needed for their compositional activity (activity of thought), the social isolation required by each individual, and what is perceived to be the remarkable subtlety, originality, and understanding of the product of such reasoning.”  The difference, appropriate to the object of Carruthers’ study, lies in the relationship between memory and the imagination.

Carruthers is eager to remind us that “human beings did not suddenly acquire imagination and intuition with Coleridge, having previously been poor clods.”  But there is a difference in the way these qualities were understood:

The difference is that whereas now geniuses are said to have creative imagination which they express in intricate reasoning and original discovery, in earlier times they were said to have richly retentive memories, which they expressed in intricate reasoning and original discovery.

This latter perspective, the earlier medieval perspective, is not too far removed from the connections between memory and creativity drawn by Jim Holt based on the experiences of the French mathematician Henri Poincaré.  We might also note that the changing status of memory within the ecology of genius is owed at least in part to the evolution of technologies that supplement memory.  Aquinas, working in a culture in which books were still relatively scarce, would have needed a remarkably retentive memory to continue working with the knowledge he acquired through reading.  This becomes less of a priority for post-Gutenberg society.

Mostly, however, Carruthers’ comparison suggested to me the question of what might count for genius in our own time.  We are not nearly so far removed from Einstein as Einstein was from Aquinas, but a good deal has changed nonetheless which makes the question at least plausible.  I suspect that, as was the case between Aquinas and Einstein, there will be a good deal of continuity, a kind of base-line of genius perhaps.  But that baseline makes the shifts in emphasis all the more telling.

I don’t have a particular model for contemporary genius in mind, so this is entirely speculative, but I wonder if today, or in coming years, we might not transfer some of the wonder previously elicited by memory and imagination to something like rapid pattern recognition.  I realize there is significant overlap within these categories.  Just as memory and imagination are related in important ways, so pattern recognition is also implicit in both and has always been an important ability.  So again, it is a matter of emphasis.  But it seems to me that the ability to rapidly recognize, or even generate meaningful patterns from an undifferentiated flow of information may be the characteristic of intelligence most suited to our times.

In Aquinas’ day the emphasis was on the memory needed to retain knowledge, which was relatively scarce.  In Einstein’s time the emphasis was on the ability to jump out of established patterns of thought generated by abundant, but sometimes static, knowledge.  In our day, we are overwhelmed by a torrent of easily available and ever-shifting information (we won’t quite say knowledge).  Under these conditions memory loses its pride of place, as does perhaps imagination.  However, the ability to draw together disparate pieces of information, or to connect seemingly unrelated points of data into a meaningful pattern that we might count as knowledge, now becomes a dimension of human intelligence that may inspire comparable awe and admiration from a culture drowning in noise.

Perhaps an analogy to wrap up:  Think of the constellations as instances of pattern recognition.  Lots of data points against the night sky drawn into patterns that are meaningful, useful, and beautiful to human beings.  For Aquinas the stars of knowledge might appear but for a moment, and to recognize the pattern he had to hold their location in memory as he learned and remembered the location of other stars.  For Einstein many more stars had appeared, and they remained steadily in his intellectual field of vision; seeing new patterns where old ones had been established was his challenge.

Today we might say that the night sky is not only full to overflowing, but the configuration is constantly shifting.  Our task is not necessarily to remember the location of a few but fading stars, nor is it to see new patterns in a fuller but steady field.  It is to constantly recognize new and possibly temporary patterns in a full and flickering field of information.  Those who are able to do this most effectively may garner the kind of respect earlier reserved for the memory of Aquinas and the imagination of Einstein.

For a different, but I think related take on a new form of thinking for our age that draws on the imagery of constellations I encourage you to take a look at this thread at Snark Market.

‘The Connecting Is the Thinking’: Memory and Creativity

Last summer Nicholas Carr published The Shallows: What the Internet Is Doing to Our Brains, a book-length extension of his 2008 Atlantic essay, “Is Google Making Us Stupid?”  The book received a good bit of attention and was in the ensuing weeks reviewed seemingly everywhere.  We noted a few of those reviews here and here.  Coming in fashionably late to the show, Jim Holt has written a lengthy review in the London Review of Books titled “Smarter, Happier, More Productive.”  Perhaps a little bit of distance is helpful.

Holt’s review ends up being one of the better summaries of Carr’s book that I have read, if only because Holt details more of the argument than most reviews.  In the end, he tends to think that Carr is stretching the evidence and overstating his case on two fronts, intelligence and happiness. However, he is less sanguine on one last point, creativity, and that in relation to memory.

Holt cites two well known writers who are optimistic about off-loading their memories to the Internet:

This raises a prospect that has exhilarated many of the digerati. Perhaps the internet can serve not merely as a supplement to memory, but as a replacement for it. ‘I’ve almost given up making an effort to remember anything,’ says Clive Thompson, a writer for Wired, ‘because I can instantly retrieve the information online.’ David Brooks, a New York Times columnist, writes: ‘I had thought that the magic of the information age was that it allowed us to know more, but then I realised the magic of the information age is that it allows us to know less. It provides us with external cognitive servants – silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.’

But as Holt notes, “The idea that machine might supplant Mnemosyne is abhorrent to Carr, and he devotes the most interesting portions of his book to combatting it.”  Why not outsource our memory?

Carr responds with a bit of rhetorical bluster. ‘The web’s connections are not our connections,’ he writes. ‘When we outsource our memory to a machine, we also outsource a very important part of our intellect and even our identity.’ Then he quotes William James, who in 1892 in a lecture on memory declared: ‘The connecting is the thinking.’ And James was onto something: the role of memory in thinking, and in creativity.

Holt goes on to supplement Carr’s discussion with an anecdote about the polymathic French mathematician Henri Poincaré.  What makes Poincaré’s case instructive is that “his breakthroughs tended to come in moments of sudden illumination.”

Poincaré had been struggling for some weeks with a deep issue in pure mathematics when he was obliged, in his capacity as mine inspector, to make a geological excursion. ‘The changes of travel made me forget my mathematical work,’ he recounted.

“Having reached Coutances, we entered an omnibus to go some place or other. At the moment I put my foot on the step the idea came to me, without anything in my former thoughts seeming to have paved the way for it, that the transformations I had used to define the Fuchsian functions were identical with those of non-Euclidean geometry. I did not verify the idea; I should not have had time, as, upon taking my seat in the omnibus, I went on with a conversation already commenced, but I felt a perfect certainty. On my return to Caen, for conscience’s sake, I verified the result at my leisure.”

How to account for the full-blown epiphany that struck Poincaré in the instant that his foot touched the step of the bus? His own conjecture was that it had arisen from unconscious activity in his memory.

This leads Holt to suggest, following Poincaré, that bursts of creativity and insight arise from the unconscious work of memory, and that this is the difference between internalized and externalized memory.  We may be able to retrieve at will whatever random piece of information we are looking for with a quick Google search, but that seems not to approach the power of the human mind to creatively and imaginatively work with its stores of memory.  Holt concludes:

It is the connection between memory and creativity, perhaps, which should make us most wary of the web. ‘As our use of the web makes it harder for us to lock information into our biological memory, we’re forced to rely more and more on the net’s capacious and easily searchable artificial memory,’ Carr observes. But conscious manipulation of externally stored information is not enough to yield the deepest of creative breakthroughs: this is what the example of Poincaré suggests. Human memory, unlike machine memory, is dynamic. Through some process we only crudely understand – Poincaré himself saw it as the collision and locking together of ideas into stable combinations – novel patterns are unconsciously detected, novel analogies discovered. And this is the process that Google, by seducing us into using it as a memory prosthesis, threatens to subvert.

And this leads me to make one additional observation.  As I’ve mentioned before, it is customary in these discussions to refer back to Plato’s Phaedrus, in which Socrates warns that writing, as an externalization of memory, will actually lead to the diminishing of human memory.  Holt mentions the passage in his review, and Carr mentions it as well.  When the dialog is trotted out, it is usually as a “straw man” to prove that concerns about new technologies are silly and misguided.  But it seems to me that a silent equivocation slips into these discussions: the notion of memory we tend to assume is our current one, increasingly defined by comparison to computer memory, which is essentially storage.

It seems to me that having first identified a computer’s storage  capacity as “memory,” a metaphor dependent upon the human capacity we call “memory,” we have now come to reverse the direction of the metaphor by understanding human “memory” in light of a computer’s storage capacity.  In other words we’ve reduced our understanding of memory to mere storage of information.  And now we read all discussions of memory in light of this reductive understanding.

Given this reductive view of memory, it seems silly for Socrates (and by extension, Plato) to worry about the externalization of memory: whether it is stored inside or outside, what difference does it make as long as we can access it?  And, in fact, access becomes the problem that attends all externalized memories from the book to the Internet.  But what if memory is not mere storage?  Few seem to extend their analysis to account for the metaphysical role that memory of the world of forms played within Plato’s account of the human person and true knowledge.  We may not take Plato’s metaphysics at face value, but we can’t really understand his concerns about memory without understanding their larger intellectual context.

Holt helps us to see the impoverishment of our understanding of memory from another, less metaphysically freighted, perspective.  The Poincaré anecdote in its own way also challenges the reduction of memory to mere storage, linking it with the complex workings of creativity and insight.  Others have similarly linked memory to identity, wisdom, and even, in St. Augustine’s account, our understanding of the divine.  Whether one veers into the theological or not, the reduction of memory to the mere storage of data should strike us as an inadequate account of memory and its significance, and cause us to rethink our readiness to offload it.

Update:  Post from Carr on the issue of memory including a relevant excerpt from The Shallows, “Killing Mnemosyne.”

Memory, Knowledge, Identity, Technology

Memory, knowledge, identity, technology — these are intimately bound together and it would be difficult to disentangle one from the others.  What is it to know something if not to remember it?  Beyond the biological facts of my existence what constitutes my identity more significantly than my memory?  What could I remember without technologies including writing, books, pictures, videos, and more?  Or to put it in a more practical way, what degree of panic might ensue if your Facebook profile were suddenly and irrevocably deleted?  Or if your smart phone were to pass into the hands of another?  Or if you lost your flash drive?  Pushing the clock back just a little, we might have similarly asked about the loss of a diary or photo albums.

The connection among these four, particularly memory and technology, is established as early as the Platonic dialogs, most famously the Phaedrus, in which Socrates criticizes writing for its harmful effects on internal memory and knowledge.  What we store in written texts (or hard drives, or “the cloud”) we do not remember ourselves and thus do not truly know.  The form of this debate recurs throughout the subsequent history of technology all the way to the present debates over the relative merits of computers and the Internet for learning and education.  And in these debates it is almost de rigueur to begin by citing Plato’s Phaedrus, either to reinstate or to dismiss the Socratic critique.  Neil Postman began his book, Technopoly: The Surrender of Culture to Technology, with reference to Phaedrus, and Phaedrus appears as well in Nicholas Carr’s now (in)famous Atlantic essay, “Is Google Making Us Stupid?”.

The rejoinder comes quickly though:  Surely Socrates failed to appreciate the gains in knowledge that writing would make possible.  And if I offload information to external memory, that simply frees my mind for more significant tasks.  There is, of course, an implicit denigration of mere memory in this rebuttal to Socrates.

Yet some tension, some uneasiness remains.  Otherwise the critique would not continue resurfacing, and it would not elicit such strong pushback when it does.  In other words, the critique seems to strike a nerve, a sensitive one at that, and when we again consider the intimate interrelationship of memory with our ideas about knowledge and education, and with the formation and maintenance of our identities, this is not surprising at all.  A few posts down I cited Illich’s claim that

What anthropologists distinguish as ‘cultures’ the historian of mental spaces might distinguish as different ‘memories.’  The way to recall, to remember, has a history which is, to some degree, distinct from the history of the substance that is remembered.

I’m wondering now whether it might also be true that a history of personal identity or of individuality could be told through a history of memory and its external supports.  Might we be able to argue that individualism is a function of technologies of memory that allow a person to create and sustain his own history apart from that of the larger society?

In any case, memory has captured my attention and fascinating questions are following hard.  What is memory anyway, what is it to remember a name, a look, a person, a fact, a feeling, where something is, how to do something, or simply to do something?  What do we remember when we remember?  How do we remember?  Why do we remember?  And, of course, how have the answers to all of these questions evolved along with the development of technology from the written word to the external hard drive?

On that last note, I wonder if our choice to call a computer’s capacity to store data “memory” has not in turn affected how we think of our own memory.  I’m especially thinking of a flash drive that we hold in hand and equate with stored memory.  In this device I keep my pictures, my documents, my videos, my memories — memory, or a certain conception of it, is objectified, reified.  Is memory merely mental storage?  Or has this metaphor atrophied our understanding of memory?

Of course, metaphors for memory are nothing new.  I’m beginning to explore some of these ideas with Paul Ricoeur’s Memory, History, Forgetting, and Ricoeur reminds us that in another Platonic dialog, the Theaetetus, Socrates offers the block of wax in our souls as a metaphor for our memory.  And Socrates suggests, “We may look at it, then, as a gift of Mnemosyne [Memory], the mother of the Muses.” I’ll keep you posted as the Muses urge.