My Year, More or Less, in Blogging

It’s not precisely one year on two counts.  To begin with, The Frailest Thing existed in its first iteration as early as September 2009.  I was then a true rookie blogger and working on another platform which shall remain nameless (rhymes with Frogger).  That effort never quite got off the ground.  By June 2010, however, I was ready to try again, and that is when the present version was launched.  Secondly, the first post on this site was published on June 2, so we are actually at a year and a few days.  Nonetheless, being a glutton for nostalgia, I’ve decided to take a retrospective glance at the past year on the blog.  I realize this will likely be of little interest to anyone but myself, but here it is anyway.

First, some highlights:

The Most Viewed Post:   The Cost of Distraction:  What Kurt Vonnegut Knew

A look at the downside of digital distraction through the lens of Harrison Bergeron, this post was featured on Freshly Pressed over a weekend last August and garnered  not only the most hits on record, but also the most comments.

Runner Up:  Is Sport a Religion?  My first post to be featured on Freshly Pressed was inspired by the World Cup.  At the time, still relatively new to WordPress, I was unaware of the Freshly Pressed feature.  It was a fun surprise.

The Most Viewed Post (Without the Help of Freshly Pressed):  Gods of Love and War

A reflection on technology through the myth of Hephaestus, the lame Greek god of metallurgy.  You’d be surprised how many people search for Hephaestus.

Runner Up: Life Amid the Ruins.  A lot of people search for Vanitas Art as well.

The Most Thoughtful Comments: PerpetuallyFrank

Not that the rest of you who comment are not thoughtful (clearing throat), but I must express my appreciation for the frequent and engaging comments provided by PerpetuallyFrank.  Cheers!

Thanks as well, of course, to all who comment, including those friends who, I know, read at least out of some sense of fraternal obligation, but who have also generously plugged this blog (Messrs. Ridenhour, Fridsma, Greenwald, and Garcia, among others).

The Most Intriguing Comment Thread: Agitate For Beauty

The aforementioned PerpetuallyFrank and my colleague Chris Friend engaged in a very intriguing exchange on the subject of telepathy.  Go read it for yourself.

The Best Compliment:  Tom Fox

“I have to tell you, Michael, you are one of the best writers I’ve never heard of before. Please take it as a compliment.”  I did. On When Words and Action Part Company.

The Links I’ve Appreciated:  Tie

Thanks to Adam Thierer at The Technology Liberation Front for mentioning me in the same breath as Peggy Noonan and to McLuhan Galaxy for re-posting McLuhan, Chesterton, and the Pursuit of Joy.

In fact, many thanks to all of you who have seen fit to link back here and list The Frailest Thing on your blog rolls.

The Most Underrated Post (By Which I Mean the Post I Rather Liked That Got Relatively Little Traffic):  Tie

Reinvigorating Friendship

Shared Sensibilities

That Was Teaching

It’s not too late, they’re out there, just waiting to be read.

The Most Frequent Search Term Leading Here: “Don Draper” and some variation on Martha Nussbaum

The former presumably leads to Don Draper on Prozac and the latter to The Ends of Learning.

The Oddest Search Term Leading Here:  Unmentionable (at least on a classy blog such as this!)

I guess that’s what happens when you have a post titled Gods of Love and War in which you refer to the sordid sex lives of the Greek gods.

Moving on, it is always a bit of a surprise when the author of some piece I’ve blogged about drops a comment.  This has happened on a few occasions, and the exchanges have usually been positive.  So my thanks to the following for dropping in.

Linda Stone and Adam Thierer on Technology Sabbaths and Other Strategies for the Digitized World

Mark D. Bowles on Warning:  A Liberal Education Leads to Independent Thinking

Steve Myers on Finding Digital White Space in a World with 50 Billion Connected Devices

Arikia Millikan on “The Storm is What We Call Progress”

Tom Scocca on Obama Talks with a Computer

Elizabeth Drescher  on Multitasking Monks

And finally, some thoughts.

Someone must have come up with a law of writing whereby the ease of composition varies inversely with the obscurity of the audience.  If not, there it is.  Writing a letter (I know, who am I kidding, just fill in whatever — email, text, etc.) to someone you know:  generally easy.  Writing a blog post to whoever happens to read it:  less so.  It probably doesn’t help matters that I tend to be introspective, perhaps to a fault (case in point).

Writing in a more public venue, however, has forced me to be a little more rigorous with the writing and thinking.  I realize that this is still a rather informal space, but someone may read what I am writing and that generates a sense of responsibility to the reader.  If someone is going to invest a few minutes to read a post (as you are presumably doing right now) I owe it to them to avoid careless or confusing writing.  And besides, on a more self-interested note, no one wants to come off as an idiot when they write something others will read.

As far as the content goes, the first two or three months featured a wider variety of topics than what I end up posting these days.  Not surprisingly, my own context guides a good deal of the writing process.  I am a graduate student, so there is a certain compulsion to write about what I am reading, which tends to revolve around technology, writing and reading, and, lately, memory.  Perhaps I’ll try to expand the scope a bit moving forward.  I’m torn between finding a niche and falling into a rut.  Hopefully, there’s a nice middle ground between the two.

I have also been a teacher for over ten years, so on here I hope to make much of what I read in an academic context a little more accessible, which is not to say that I aim to dumb it down.  Ideally, I imagine that there is a broad and generous space between the arcane and the simplistic.  That’s the target I’m aiming for.

Feel free, of course, to let me know how well I’m managing that!

Cheers, and thanks for reading.  I think I’ll give this a go for another year.

Memory, Writing, Alienation

Some more reflections in interaction with Walter Ong’s work, this time an essay originally published in The Written Word: Literacy in Transition (Oxford, 1986) titled “Writing Is a Technology that Restructures Thought.”

Literacy does its work of transformation by restructuring the cultural and personal economy of memory and installing a self-alienation at the heart of literate identity.

The world of orality is fundamentally evanescent.  Spoken words themselves have begun to pass out of existence before they are fully formed by the speaker’s mouth.  The spoken word is in this way a telling image of oral society; each generation is always already fading into the unremembered past as it inhabits the present.  The accumulated knowledge and wisdom of an oral society exists only as it is remembered by individuals so that each member of the group shares in the cognitive burden of sustaining and transmitting the group’s cultural inheritance.  This work of memory preoccupies the cultural life of oral societies and configures the individual as a node within a network of cultural remembering.  Oral society is thus fundamentally conservative and collective.

Writing disrupts and rearranges this situation by offloading, to a significant degree, the cognitive burden of remembering from the living memory of each individual to the written word.  This work of cognitive offloading generates recurring debates, as we first encounter in Plato, about the proper modes of memory.  These debates reflect the (often unrecognized) force with which new mnemotechnologies impact a society.  As Ong notes, the frozen, lifeless written word is in another, paradoxical sense alive.  It achieves permanence and is “resurrected into limitless living contexts by a limitless number of living readers.”  Furthermore, the “lifeless” written word, by both resourcing and reconfiguring the economy of memory, also injects a new dynamism into literate cultures.  It does so by relieving the conservative pressure of cultural remembrance, thus encouraging what we might call intellectual entrepreneurship.

This new dynamism is, however, accompanied by various forms of alienation.  Crucially, writing dislodges a portion of one’s memory, a critical aspect of identity, from oneself.  To the extent that identity is constituted by memory, identity must be, to some extent, divided in literate societies.  Ong details the alienating work of writing when he lists fourteen instances of separation effected by writing:

1. Writing separates the known from the knower

2. Writing separates interpretation from data

3. Writing distances the word from sound

4. Writing distances the source of communication from the recipient

5. Writing distances the word from the context of lived experience

6. Due to 5., writing enforces verbal precision unavailable in oral cultures.  (In other words, without the context provided by face-to-face communication, words have to work harder in writing to make meaning clear.  This is why we sometimes feel compelled to use smiley faces in electronic communication — to communicate tone.)

7. Writing separates past from present.

8. Writing separates administration — civil, religious, commercial — from other types of social activities.

9. Writing makes it possible to separate logic from rhetoric.

10. Writing separates academic learning from wisdom.

11. Writing can divide society by splitting verbal communication between a “high” spoken language controlled by writing and a “low” controlled by speech.  (For example, “proper” English is really “written” English, while devalued vulgar and colloquial speech patterns are “spoken” English.)

12. Writing differentiates grapholects (dialects taken over by writing and made into national languages) from other local dialects.

13. Writing divides more evidently and effectively as its form becomes more abstract, that is more removed from the world of sound to the world of sight.

14. Writing separates being from time.

By making thought (and so also the self) present to itself, literacy introduces an irreparable fissure into identity and consciousness, but one that is, in Ong’s account, ultimately “humanizing.”  Last word from Ong:

To say writing is artificial is not to condemn it but to praise it . . . By distancing thought, alienating it from its original habitat in sounded words, writing raises consciousness.  Alienation from a natural milieu can be good for us and indeed is in many ways essential for fuller human life.  To live and to understand fully, we need not only proximity but also distance.  This writing provides for, thereby accelerating the evolution of consciousness as nothing else before it does.

Memory, Knowledge, Identity, Technology

Memory, knowledge, identity, technology — these are intimately bound together and it would be difficult to disentangle one from the others.  What is it to know something if not to remember it?  Beyond the biological facts of my existence, what constitutes my identity more significantly than my memory?  What could I remember without technologies including writing, books, pictures, videos, and more?  Or to put it in a more practical way, what degree of panic might ensue if your Facebook profile were suddenly and irrevocably deleted?  Or if your smart phone were to pass into the hands of another?  Or if you lost your flash drive?  Pushing the clock back just a little, we might have similarly asked about the loss of a diary or photo albums.

The connection among these four, particularly memory and technology, is established as early as the Platonic dialogs, most famously the Phaedrus, in which Socrates criticizes writing for its harmful effects on internal memory and knowledge.  What we store in written texts (or hard drives, or “the cloud”) we do not remember ourselves and thus do not truly know.  The form of this debate recurs throughout the subsequent history of technology, all the way to the present debates over the relative merits of computers and the Internet for learning and education.  And in these debates it is almost de rigueur to begin by citing Plato’s Phaedrus, either to reinstate or to dismiss the Socratic critique.  Neil Postman began his book Technopoly: The Surrender of Culture to Technology with reference to the Phaedrus, and the Phaedrus appears as well in Nicholas Carr’s now (in)famous Atlantic essay, “Is Google Making Us Stupid?”

The rejoinder comes quickly though:  Surely Socrates failed to appreciate the gains in knowledge that writing would make possible.  And if I offload information to external memory, this simply frees my mind for more significant tasks.  There is, of course, an implicit denigration of mere memory in this rebuttal to Socrates.

Yet some tension, some uneasiness remains.  Otherwise the critique would not keep resurfacing, and it would not elicit such strong pushback when it does.  In other words, the critique strikes a nerve, and a sensitive one at that; when we again consider the intimate interrelationship of memory with our ideas about knowledge and education, and with the formation and maintenance of our identities, this is not surprising at all.  A few posts down I cited Illich’s claim that

What anthropologists distinguish as ‘cultures’ the historian of mental spaces might distinguish as different ‘memories.’  The way to recall, to remember, has a history which is, to some degree, distinct from the history of the substance that is remembered.

I’m wondering now whether it might also be true that a history of personal identity or of individuality could be told through a history of memory and its external supports.  Might we be able to argue that individualism is a function of technologies of memory that allow a person to create and sustain his own history apart from that of the larger society?

In any case, memory has captured my attention, and fascinating questions are following hard.  What is memory anyway?  What is it to remember a name, a look, a person, a fact, a feeling, where something is, how to do something, or simply to do something?  What do we remember when we remember?  How do we remember?  Why do we remember?  And, of course, how have the answers to all of these questions evolved along with the development of technology, from the written word to the external hard drive?

On that last note, I wonder if our choice to call a computer’s capacity to store data “memory” has not in turn affected how we think of our own memory.  I’m especially thinking of a flash drive that we hold in hand and equate with stored memory.  In this device I keep my pictures, my documents, my videos, my memories — memory, or a certain conception of it, is objectified, reified.  Is memory merely mental storage?  Or has this metaphor atrophied our understanding of memory?

Of course, metaphors for memory are nothing new.  I’m beginning to explore some of these ideas with Paul Ricoeur’s Memory, History, Forgetting, and Ricoeur reminds us that in another Platonic dialog, the Theaetetus, Socrates offers the block of wax in our souls as a metaphor for our memory.  And Socrates suggests, “We may look at it, then, as a gift of Mnemosyne [Memory], the mother of the Muses.” I’ll keep you posted as the Muses urge.


“We like lists because we don’t want to die”

It may not look like much, but that grocery list sitting on the kitchen counter is a faint visual echo of the beginnings of civilization.  At least from a certain angle of vision explicated and illustrated in Umberto Eco’s The Infinity of Lists: An Illustrated Essay (2009).  In a Der Spiegel interview from November 2009, Eco explains,

The list is the origin of culture. It’s part of the history of art and literature. What does culture want? To make infinity comprehensible. It also wants to create order — not always, but often. And how, as a human being, does one face infinity? How does one attempt to grasp the incomprehensible? Through lists, through catalogs, through collections in museums and through encyclopedias and dictionaries.

There is a point at which scholars, philosophers, intellectuals (public or otherwise), critics, and so on (one is tempted to list on) reach either a certain age or a certain stature (it is sometimes hard to tell which) where they are able to make simple, direct, and yet curiously ambiguous claims and assertions which, had they been made by a lesser figure, would certainly be dismissed out of hand, but which, coming from the sage, achieve a certain matter-of-fact status and attain the aura of profundity.  So, a list of such from Eco:

  1. “We like lists because we don’t want to die.”
  2. “The essential definition is primitive compared with the list.”
  3. “Lists can be anarchistic.”

In fact, read in context, these make a good deal of sense, or at least one sees how they may make sense.  Then one also attains a certain permission to be blunt:

If you interact with things in your life, everything is constantly changing.  And if nothing changes, you’re an idiot.

In passing Eco also manages to make some interesting claims about the Internet:

With context:

SPIEGEL: But you also said that lists can establish order. So, do both order and anarchy apply? That would make the Internet, and the lists that the search engine Google creates, perfect for you.

Eco: Yes, in the case of Google, both things do converge. Google makes a list, but the minute I look at my Google-generated list, it has already changed. These lists can be dangerous — not for old people like me, who have acquired their knowledge in another way, but for young people, for whom Google is a tragedy. Schools ought to teach the high art of how to be discriminating.

I appreciated Eco’s distinction between modes of knowledge acquisition, which can make all the difference.  Sometimes those who were trained on the older model and subsequently enter the digital world fail to appreciate how their cognitive position and sensibility are different from those who are, as they say, born digital.

One last observation from Eco,

My interests change constantly, and so does my library. By the way, if you constantly change your interests, your library will constantly be saying something different about you.

This, along with Eco’s ruminations about lists as a means of holding off the specter of death and creating order from chaos, echoed (pun hesitantly intended, although technically, foreshadowed) Nathan Schneider’s excellent “In Defense of The Memory Theater” from some months ago.  I’ve recommended it before, and I’ll do so again as my own thinking and interests, along with the books around me, take a turn toward memory.  From Schneider:

Ever since the habit of writing first took hold of me as a teenager, I knew precisely why I did it, and why I did it so compulsively: to hedge against the terror of having a terrible memory. Though still young enough to expect no sympathy, I constantly feel the burden of this handicap. Confirmation of it, and that writing is its cure, I discover every time I pick up something I wrote years, or even months ago. Reading those things puts me in an uncanny state, like a past-life regression. Meanwhile, unrecorded impressions, sayings, old friends, and good books vanish without warning or trace. Some read and write to win eternal life; I would be happy enough just to keep a hold of this one.

Writing birthed lists and lists yielded annals and annals, history — personal and cultural.

“You have to be somebody before you can share yourself”

Will the Internet make it impossible to make clean starts in life?  Will every word and every picture we have posted, however ill advised, find a way to haunt us?  This is the fear legal scholar Jeffrey Rosen articulated in his NY Times piece, “The Web Means the End of Forgetting.” That is the same piece that led me to wonder several days ago if we might need a new economic statistic to track social media induced unemployment.

According to David Dylan Thomas at In Medias Res, the real problem might actually be the opposite of what Rosen feared.  In a post titled “The Myth of Online Inertia,” Thomas argues that “things disappear from the web” all the time.  The evolution of hardware and file formatting renders much of what is produced potentially inaccessible with the passage of time.  “As a web manager,” Thomas explains,

I’ve overseen the overhaul of many a content management system, and there’s always a compatibility issue which forces editors and technology teams to ask the same question. How much? How much will it cost (in time and money) to convert how much information? Do we really want to bother reformatting 400 news stories that were published in 2000 to a whole new format on the off chance that someone will search for them? The answer is almost always no. And that’s just 10 years.

My sense is that both Rosen and Thomas are on to something and that if they were to sit down together to discuss their positions, a synthesis preserving elements of both arguments would emerge.  Regardless of how powerful the Internet’s long term memory proves to be, however, its short term memory is quite good and potentially quite damaging.  Consequently, we are becoming increasingly self-conscious and cautious about what we post and where.

Some are concerned enough to implement tools such as Gmail’s Mail Goggles, designed to keep us from sending the stupid drunken email that ends up costing us a job or a relationship.  A great deal of time and money is also being spent to keep individuals not only from ruining their own reputations with a misguided tweet, but also from tarnishing the image of the institutions with which they are associated.  In a recent story about the effort colleges are putting into managing the social media activities of their student athletes, a consultant gave the very basic rule he tries to instill:  if you would have a problem with your mother reading or seeing it, don’t post it.

This is good advice as far as it goes, I suppose.  Although, it would depend a great deal, wouldn’t it, on the sensibilities of each particular mom.  In any case, this all brought to mind a recent article in The New Republic by Jed Perl.  In “Alone, With Words,” Perl laments the loss of writing that  begins as and remains a private act.

Writing, before it is anything else, is a way of clarifying one’s thoughts. This is obviously true of forms such as the diary, which are inherently solitary. But even those of us who write for publication can conclude, once we have clarified certain thoughts, that these thoughts are not especially valuable, or are not entirely convincing, or perhaps are simply not thoughts we want to share with others, at least not now … I believe that most writing worth reading is the product, at least to some degree, of this extraordinarily intimate confrontation between the disorderly impressions in the writer’s mind and the more or less orderly procession of words that the writer manages to produce on the page.

Most of what is made public in the arena of social media was never private in Perl’s sense, at least not for very long at all.  We are becoming used to the idea of providing a more or less real time feed of our thoughts and actions to the world.  The process of clarification and crafting that Perl describes has been replaced by the urge to publicize immediately.  Little wonder then that some of what we make public is damning and much of it is quite inane.

Citing Jaron Lanier, Alan Jacobs makes a point that we seem to have forgotten:

“You have to be somebody before you can share yourself.” And the process of becoming somebody takes time, effort, discipline, and study.

That process also tends to happen when we have preserved a certain private space for our selves.  Social media and the Internet have given us an unparalleled ability to make our thoughts, our writing, our pictures, our very selves public.  Our task now may be to carve out and preserve a private space that will help render what we make public meaningful and worthwhile.  Or, at least not potentially disastrous.