McLuhan: 100

The medium is the message … five words, plump and alliterative though they may be, are wildly inadequate … he was born in Edmonton, Alberta on July 21, 1911 … He speaks in canned riddles … Speech as organized stutter is based on time. What does speech do to space? … “Clear prose indicates the absence of thought” … Watching McLuhan, you can’t quite decide whether he was a genius or just had a screw loose … he gave us language that made “media” into a thing …

It feels wistful to imagine a time when people didn’t go about their daily routine with the assumption that at any moment another massive media technology will be dumped on us by some geek in California … “I’m going to be a computer when I grow up” …

“What if he is right?” … “Instead of the book as a fixed package of repeatable and uniform character suited to the market with pricing, the book is increasingly taking on the character of a service … and the book as an information service is tailor-made and custom-built” … First of all – and I’m sorry to have to repeat this disclaimer – I’m not advocating anything … “The next medium, whatever it is – it may be the extension of consciousness – will include television as its content, not as its environment” …

an alchemical mix of his vast historical and literary knowledge, his bombastic personality and a range of behaviors we might now place on the very mild end of the autistic spectrum … McLuhan’s mind was probably situated at the mild end of the autism spectrum. He also suffered from a couple of major cerebral traumas …

First, that McLuhan never made arguments, only assertions … a fixture of culture both nerd and pop, which are increasingly the same thing. He is the patron saint of Wired … what mattered was merely the fact that you were watching TV. The act of analysing the content of TV – or of other mediums – is either sentimental or it’s beside the point … Annie Hall … the fastest brain of anyone I have ever met, and I never knew whether what he was saying was profound or garbage … He wanted his words to knock readers out of their intellectual comfort zones, to get them to entertain the possibility that their accepted patterns of perception might need reordering … McLuhan was an information canary …

“He writes by paradox — that makes him hard to read (or hard on the reader),” wrote McLuhan … he loved Chesterton’s rhetorical flourishes, imbibed his playfulness, turned his impulse to try out new combinations of ideas into the hallmark of the McLuhan method … He became a daily Mass-goer …

There is absolutely no inevitability … what will be the psychic fallout of these technologies on our inner lives? … Like Marx and Freud, he was an intellectual agitator, a conceptual mind expander, the yeast in the dough …  James Joyce and Ezra Pound especially … The web. The web, with its feeds and flows and rivers and streams … That kaleidoscopic, almost psychedelic style … In that Playboy interview … a celebrity-seeking charlatan …

lost all hope “that the world might become a better place with new technology” …  people who classify McLuhan as a techno-utopian aren’t simply making stuff up … Resenting a new technology will not halt its progress … Many people seem to think that if you talk about something recent, you’re in favor of it … And so eschatological hope appears as nothing more than an early manifestation of cyber-utopianism … Look at what these media are doing to our souls … “Once we have surrendered our senses and nervous systems to the private manipulation of those who would try to benefit by taking a lease on our eyes and ears and nerves, we don’t really have any rights left” …

Your question reflects the usual panic of people confronted with unexplored technologies. I’m not saying such panic isn’t justified … merely that such reactions are useless and distracting … “Man the food-gatherer reappears incongruously as information-gatherer” … But an understanding of media’s effects constitutes a civil defense against media fallout … someone who didn’t just have strong ideas but who invented a whole new way of talking … all a teacher can ever do is get people to think …

outlived his fame … he died in a state of wordlessness …

That’s what McLuhan did.

_______________________________________________________________________

In case it is not apparent, only a very few of these words are mine.  Sources:

Webs and whirligigs:  Marshall McLuhan in his time and ours by Megan Garber
Why McLuhan’s chilling vision still matters today by Douglas Coupland
McLuhan at 100 and McLuhan on the Cloud by Nicholas Carr
Why Bother With Marshall McLuhan by Alan Jacobs
Divine Inspiration by Jeet Heer
Marshall McLuhan:  Escape into Understanding by W. Terrence Gordon
McLuhan, Chesterton, and the Pursuit of Joy
McLuhan as Teacher by Walter Ong

Kevin Kelly, God, and Technology

As I have read and thought about technology and its cultural consequences, I have especially appreciated the work of Marshall McLuhan, Walter Ong, Jacques Ellul, Ivan Illich, and Albert Borgmann.  My appreciation stems not only from the quality and originality of their work, but also from a curiosity about the manner in which their religion informed their thinking; all were deeply committed to some expression of the Christian faith.  We would do well to add the name of Kevin Kelly to the list of theorists and students of technology who bring a theological perspective to their work.

Of course, the Christian tradition is an ocean with many currents, and so it is not surprising that despite their common core commitments, the work of each of the scholars mentioned takes on a distinct hue.  Of those mentioned, Kelly is in my estimation the most optimistic about the future of technology, and that comes across quite clearly in his recent interview with Christianity Today.

There Kelly connects technology with God’s own creative capacity and the freedom with which He endows humanity:

We are here to surprise God. God could make everything, but instead he says, “I bestow upon you the gift of free will so that you can participate in making this world. I could make everything, but I am going to give you some spark of my genius. Surprise me with something truly good and beautiful.”

He also provides the following explanation of the term technium which he coined:

I use technium to emphasize that human creation is more than the sum of all its parts. An ecosystem behaves differently from its individual plant and animal components. We have thoughts in our minds that are more than the sum of all neuron activity. Society itself has certain properties that are more than the sum of the individuals; there is an agency that’s bigger than us. In the same way, the technium will have a behavior that you’re not going to find in your iPhone or your light bulb alone. The technium has far more agency than is suggested by the word culture.

I find this emergent model to be an interesting way to get at the influence of technology.  I try to navigate a path between approaches to technology that take the tools to be determinative of human action on the one hand, and others which take the tools to be merely neutral objects of human action on the other.  I’m not sure if I’m prepared to unreservedly endorse Kelly’s formulation, but I am generally sympathetic.

I’m less inclined to sign onto the remarkably positive outlook Kelly articulates for the technium, although I must admit that it is both refreshing and invigorating. Kelly is sure that “… the world is a better place now than it was 1,000 years ago. Whatever quantifiable metric you want to give to me about what’s good in life, I would say there’s more of it now than there was 1,000 years ago.” And, indeed, by many if not most measures, it most certainly is.  Yet, I would hesitate to claim that wherever life has improved it has done so because of the technium, and I would be inclined to argue that in certain important respects elements of the technium have worked against human happiness and fulfillment.

Kelly acknowledges, but underemphasizes, the fallibility and folly of humanity. He believes that God’s grace, seemingly operating through the technium, more than cancels out the folly.  I share the hope in principle, but would not so closely connect the operations of God’s grace to the sphere of technological advance.

Perhaps the point of tension that I experience with Kelly’s position stems from his definition of goodness:  “… overall the technium has a positive force, a positive charge of good. And that good is primarily measured in terms of the possibilities and choices it presents us with.”  Kelly illustrates his point by asking us to imagine Mozart being born into a world in which the piano has not been invented – what a tragedy.  This resonates, but then we might ask, what of all those would-be Mozarts who did in fact live, as surely they did?  Is their happiness and fulfillment so tied to an invention still to come that their lives are otherwise rendered unfulfilled?  Would this not suggest that the grass is perpetually greener in the future, and so happiness and fulfillment are never finally attainable?  Fulfillment would taunt us from just around the corner that is the future.

Perhaps the problem arises from too quickly eliding the infinite creative possibilities of the Creator with the limited, derivative creativity of the creature.  To be human is to flourish within the limitations of material and embodied existence. Expanding choice is not necessarily a bad thing, of course, but hitching the possibility of human fulfillment to the relentless expansion of choice seems to overlook the manner in which the voluntary curtailment of choice might also serve as the path to a well-lived life.

Curiously, Kelly practices a way of life that would seem on the surface to be at odds with the gospel of choice maximization.  He has written engagingly about the Amish and recommended aspects of their approach to technology.  In his personal life, Kelly has implemented a good bit of Amish minimalism.  When asked whether this constituted an inconsistency between his words and his actions, Kelly responded:

Technology can maximize our special combination of gifts, but there are so many technological choices that I could spend all my time just trying out technologies. So I minimize my technological choices in order to maximize my output. The Amish (and the hippies) are really good at minimizing technologies. That’s what I am trying to do as well. I seek to find those technologies that assist me in my mission to express love and reflect God in the world, and then disregard the rest.

But at the same time, I want to maximize the pool of technologies that people can choose from, so that they can find those tools that maximize their options and minimize the rest.

I can see his angle and would stop short of calling this an inconsistency on Kelly’s part, but I will say that, for my part, I find more wisdom in Kelly’s practice than in his unbounded hope for the technium.

______________________________________________________

See also Nicholas Carr’s comments on Kelly’s interview (as well as Kelly’s response in the comment thread) and Kevin Kelly’s TED Talk.

Anonymity is Authenticity?

Chris Poole is among the 21 New Media Innovators recently profiled by New York Magazine.  They bestowed upon him the title of “Meme Generator” and provided this short bio:

Chris Poole (handle: “moot”) founded the anonymous message board 4Chan when he was just 15. It’s grown into the breeding ground for some of the web’s most pervasive memes, as well as some of its more ominous movements. In the last two years, Poole has raised more than $3 million in venture funding for a new image-centric site called Canvas, which is similar to but separate from 4chan, and he’s become an advocate for web privacy. At this year’s South by Southwest conference, for instance, he had this to say: “Zuckerberg’s totally wrong on anonymity being total cowardice. Anonymity is authenticity.”

It was that last line that caught my attention.  I’ve lately been wrestling with the relative virtues and vices of anonymity and personalization.  A while ago I argued that Zuckerberg was indeed wrong about identity, and self-servingly so.  I wasn’t interested in defending anonymity in that case, but rather the more natural, gradual, and contingent self-disclosure that characterizes ordinary human relationships.

In the past several days I’ve returned to the theme, arguing first that Google+ through its Circles attempts to address Facebook’s “all friends are equal” model.  Then I suggested that the trend toward personalization and away from the anonymity of the early web better (if imperfectly) fitted our social impulse and operated to some degree as what sociologists call a mediating structure.  I followed that up with a post commenting on Morozov’s lament for the loss of the early Internet’s communal character which, I suggested, sat uncomfortably with Morozov’s privileging of privacy over personalization.  Human communities don’t ordinarily function on those terms.  Oddly, I argued in the comment thread of that same post for the desirability of anonymity in some instances.

So now I come across Poole’s claim — “anonymity is authenticity” — and feel primed to comment.  My initial response is this:  If anonymity is authenticity, then it is a Pyrrhic authenticity.

There is a certain plausibility to Poole’s claim: it suggests that we are most ourselves when we know that we will not be made to answer for what we are doing or saying; that the public self is a restrained and inhibited, and thus not authentic, version of the true self.  It tracks with the point of the Ring of Gyges story in Plato’s Republic.  The ring made its owner invisible and so, the argument went, revealed the owner’s true character (or better, the superficiality of virtue).

But if it is a plausible account of the human condition, it is also an incomplete one.  It is a Pyrrhic authenticity because it eliminates the possibility of appearing authentically before others who acknowledge our presence.  It thus suggests that we are most ourselves at the point at which it does not really matter what we are.  Now one may adopt a Simon & Garfunkel, “I am a rock, I am an island” attitude at this juncture and insist that they don’t need the acknowledgement and recognition, to say nothing of the love and care, that comes from human relationships; if so, then this post is probably not the place to argue otherwise.  I’m going to count on the fact that most of us will rather resonate with Hannah Arendt when she writes, “To live an entirely private life means above all to be deprived of things essential to a truly human life: to be deprived of the reality that comes from being seen and heard by others.”

What good is it to finally be myself, if I am myself alone? Granted, there may be some extraordinary circumstances when remaining authentically oneself requires a great solitude. Ordinarily, however, we answer to both a need to cultivate the inner self with a measure of independence, and a need to find fulfillment in meaningful relationships with and among others.

Even I feel that over the course of these posts I have been trying to hit a moving target.  This reflects the complexity of lived human experience.  It is a complexity that is hard to account for online when platforms and interfaces seek to reduce that complexity to either the indiscriminately social or the misanthropically private. We are complex creatures and the great danger is that we will end up reducing our complexity to fit the constraints of life mediated through one platform or another.

The Fog of Life: “Google,” Memory, and Thought

Last week a study suggesting that the ability to Google information is making it less likely that we commit information to memory garnered a decent amount of attention and discussion, including a few of my own thoughts in my last post.  In addition to writing a post on the topic I did something I almost never do for the sake of sanity: I followed the comment threads on a few websites that had posted articles on the story.  That was an instructive experience and has led to a few observations, comments, and questions which I’ll list briefly.

  • Google functions as a synecdoche for the Internet in a way that no other company does. So when questions like “Is Google Making Us Stupid?” or “Is Google Ruining Our Memory?” are posed, what is really meant is more like “Is the Internet Making Us Stupid?”, etc.
  • Of course, “Google” is not an autonomous agent, but it has generated and made plausible a certain rhetoric that rather imprudently dismisses the need to remember.
  • People still get agitated by claims that the Internet is either bad or good for you.  Stories are framed in this way in the media, and discussion assumes this binary form.  Not much by way of nuance.
  • On the specific question of memory in relation to this study and the subsequent discussion, it is never quite clear what sort of memory is in view, although it appears that memory for facts or some variation of that is what most people are assuming in their comments.
  • The computer model of the brain is alive and well in people’s imagination.  How else could we explain the recurring claim that by offloading our memory to “Google” we are “freeing up space” in our memory so that our “processing” runs more efficiently?
  • Does anyone really believe that we, members of present society, are generally in danger of reaching the limits of our capacity for memory?
  • There is a concern that tying up memory on the retention of “trivial” facts will hamper our ability to perform higher order tasks such as critical and creative thinking.
  • “Trivial” is relative.  Phone numbers are often given as an example, but while knowing some obscure detail about the human cardiovascular system might be “trivial” to me, it wouldn’t be so to a cardiovascular surgeon in the midst of an operation.
  • Why are we opposing two forms of knowledge or “intelligence” anyway?  Aren’t most of the people who are able to think critically and creatively about a topic or discipline the same people who have attained a mastery of the details of that same topic or discipline? Isn’t remembering the foundation of knowing, or are not the two at least intimately related?
  • Realizing that total recall of all pertinent facts in most cases is too high a bar, wouldn’t it at least be helpful not to rhetorically oppose facts to thinking?
  • The denigration of memory for facts seems — be warned, this is impressionistic — aligned with a slide toward an overarching cloud of vagueness settling over our experience.  Not simply the vagueness by comparison with print-disciplined speech that accompanies a return to orality, but a vagueness, distractedness, or inattentiveness about immediate experience in general.
  • Will we know nothing in particular because we know where to find everything in general?

On that last note, consider Elizabeth Spires’ poem, “A Memory of the Future,” published in The Atlantic, and make of it what you will:

I will say tree, not pine tree.
I will say flower, not forsythia.
I will see birds, many birds,
flying in four directions.

Then rock and cloud will be
lost. Spring will be lost.
And, most terribly,
your name will be lost.

I will revel in a world
no longer particular.
A world made vague,
as if by fog. But not fog.

Vaguely aware,
I will wander at will.
I will wade deeper
into wide water.

You’ll see me, there,
out by the horizon,
an old gray thing,
who finally knows

gray is the most beautiful color.

Offloaded Memory and Its Discontents (or, Why Life Isn’t a Game of Jeopardy)

It is not surprising to learn that we are taking the time to remember less and less given the ubiquitous presence of the Internet and the consequent ability to “Google it” when you need it, whatever “it” happens to be.  A new study in the journal Science affirms what most of us have already witnessed or experienced.  According to one report on the study:

Columbia University psychologist Betsy Sparrow and her colleagues conducted a series of experiments, in which they found that people are less likely to remember information when they are aware of its availability on online search engines …

“Our brains rely on the Internet for memory in much the same way they rely on the memory of a friend, family member or co-worker. We remember less through knowing information itself than by knowing where the information can be found,” said Sparrow.

If you sense that something of significance is lost in the transition from internal memory to prosthetic memory, you may also find that it takes a little work to pinpoint and articulate that loss.  Nicholas Carr, commenting on the same study, also has reservations, and he puts them this way:

If a fact stored externally were the same as a memory of that fact stored in our mind, then the loss of internal memory wouldn’t much matter. But external storage and biological memory are not the same thing. When we form, or “consolidate,” a personal memory, we also form associations between that memory and other memories that are unique to ourselves and also indispensable to the development of deep, conceptual knowledge. The associations, moreover, continue to change with time, as we learn more and experience more. As Emerson understood, the essence of personal memory is not the discrete facts or experiences we store in our mind but “the cohesion” which ties all those facts and experiences together. What is the self but the unique pattern of that cohesion?

I’ve articulated some questions about off-loaded memory and the formation of identity in the past few months as well, along with thoughts on the relationship between internal memory and creativity.  Here I want to draw on an observation by Owen Barfield in Saving the Appearances.  Barfield is writing about perception when he notes:

I do not perceive any thing with my sense-organs alone, but with a great part of my whole human being. Thus, I may say, loosely, that I ‘hear a thrush singing’. But in strict truth all that I ever merely ‘hear’ — all that I ever hear simply by virtue of having ears — is sound. When I ‘hear a thrush singing’, I am hearing, not with my ears alone, but with all sorts of other things like mental habits, memory, imagination, feeling and (to the extent at least that the act of attention involves it) will.  Of a man who merely heard in the first sense, it could meaningfully be said that ‘having ears’ (i.e. not being deaf) ‘he heard not’.

Barfield reminds us that our perception of reality is never merely a function of our senses.  Our perception, which in some respects is to say our interpretation of reality into meaningful experience, is grounded in, among other things, our memory, and certainly not our offloaded memory.  Offloaded memory is not ready-to-hand for the mind to use in its remarkable work of meaning-making.  Perception in this sense is impoverished by our willingness to offload what we might otherwise have internalized.

All of this leads me to ask, What assumptions are at play that make it immediately plausible for so many to believe that we can move from internalized memory to externalized memory without remainder?  It would seem, at least, that the ground was prepared by an earlier reduction of knowledge to information or data.  Only when we view knowledge as the mere aggregation of discrete bits of data can we then believe that it makes little difference whether that data is stored in the mind or in a database.

We seem to be approaching knowledge as if life were a game of Jeopardy, which is played well merely by being able to access trivial knowledge at random.  What is lost is the associational dimension of knowledge, which constructs meaning and understanding by relating one thing to another and not merely by aggregating data.  This form of knowledge, which we might call metaphorical or analogical, allows us to “understand in light of”, to perceive through a rich store of knowledge and experience that allows us to see and make connections which richly texture and layer our experience of reality.

Augustine famously described memory as a vast storehouse in which there are treasures innumerable. It is our experience of life that is enriched by drawing on the treasures deposited into the storehouse of memory.

____________________________________________________

Update:  Also see “The Extended Mind — How Google Affects Our Memories” and Jonah Lehrer’s post on the study.