The Lifestream Stops

David Gelernter, 2013:

“And today, the most important function of the internet is to deliver the latest information, to tell us what’s happening right now. That’s why so many time-based structures have emerged in the cybersphere: to satisfy the need for the newest data. Whether tweet or timeline, all are time-ordered streams designed to tell you what’s new … But what happens if we merge all those blogs, feeds, chatstreams, and so forth? By adding together every timestream on the net — including the private lifestreams that are just beginning to emerge — into a single flood of data, we get the worldstream: a way to picture the cybersphere as a whole … What people really want is to tune in to information. Since many millions of separate lifestreams will exist in the cybersphere soon, our basic software will be the stream-browser: like today’s browsers, but designed to add, subtract, and navigate streams.”

E. M. Forster, 1909:

“Who is it?” she called. Her voice was irritable, for she had been interrupted often since the music began. She knew several thousand people; in certain directions human intercourse had advanced enormously.

But when she listened into the receiver, her white face wrinkled into smiles, and she said:

“Very well. Let us talk, I will isolate myself. I do not expect anything important will happen for the next five minutes – for I can give you fully five minutes …”

She touched the isolation knob, so that no one else could speak to her. Then she touched the lighting apparatus, and the little room was plunged into darkness.

“Be quick!” she called, her irritation returning. “Be quick, Kuno; here I am in the dark wasting my time.”

[Conversation ensues and comes to an abrupt close.]

Vashti’s next move was to turn off the isolation switch, and all the accumulations of the last three minutes burst upon her. The room was filled with the noise of bells, and speaking-tubes. What was the new food like? Could she recommend it? Had she had any ideas lately? Might one tell her one’s own ideas? Would she make an engagement to visit the public nurseries at an early date? – say this day month.

To most of these questions she replied with irritation – a growing quality in that accelerated age. She said that the new food was horrible. That she could not visit the public nurseries through press of engagements. That she had no ideas of her own but had just been told one – that four stars and three in the middle were like a man: she doubted there was much in it.

When I read Gelernter’s piece and his worldstream metaphor (illustration below), I was reminded of Forster’s story and the image of Vashti, sitting in her chair, immersed in a cacophonous real-time stream of information. Of course, the one obvious difference between Gelernter’s and Forster’s conceptions of the relentless stream of information into which one plunges is the nature of the interface. In Forster’s story, “The Machine Stops,” the interface is anchored to a particular place. It is an armchair in a bare, dark room from which characters in his story rarely move. Gelernter assumes the mobile interfaces we’ve grown accustomed to over the last several years.

In Forster’s story, the great threat the Machine poses to its users is that of radical disembodiment. Bodies have atrophied, physicality is a burden, and all the ways in which the body comes to know the world have been overwhelmed by a perpetual feeding of the mind with ever more derivative “ideas.” This is a fascinating aspect of the story. Forster anticipates the insights of later philosophers such as Merleau-Ponty and Hubert Dreyfus as well as the many researchers helping us understand embodied cognition. Take this passage for example:

You know that we have lost the sense of space. We say “space is annihilated”, but we have annihilated not space, but the sense thereof. We have lost a part of ourselves. I determined to recover it, and I began by walking up and down the platform of the railway outside my room. Up and down, until I was tired, and so did recapture the meaning of “Near” and “Far”. “Near” is a place to which I can get quickly on my feet, not a place to which the train or the air-ship will take me quickly. “Far” is a place to which I cannot get quickly on my feet; the vomitory is “far”, though I could be there in thirty-eight seconds by summoning the train. Man is the measure. That was my first lesson. Man’s feet are the measure for distance, his hands are the measure for ownership, his body is the measure for all that is lovable and desirable and strong.

But how might Forster have conceived of his story if his interface had been mobile? Would his story still be a Cartesian nightmare? Or would he understand the danger to be posed to our sense of time rather than our sense of place? He might have worried not about the consequences of being anchored to one place, but rather being anchored to one time — a relentless, enduring present.

Were I Forster, however, I wouldn’t change his focus on the body. For aren’t our bodies, and the physicality of lived experience that they perceive, also our most meaningful measure of time? Do not our memories etch themselves in our bodies? Does not a feel for the passing years emerge from the transformation of our bodies? The philosopher Merleau-Ponty spoke of the “time of the body.” Consider Shaun Gallagher’s exploration of Merleau-Ponty’s perspective:

“Temporality is in some way a ‘dimension of our being’ … More specifically, it is a dimension of our situated existence. Merleau-Ponty explains this along the lines of the Heideggerian analysis of being-in-the-world. It is in my everyday dealings with things that the horizon of the day gets defined: it is in ‘this moment I spend working, with, behind it, the horizon of the day that has elapsed, and in front of it, the evening and night – that I make contact with time, and learn to know its course’ …”

Gallagher goes on to cite the following passage from Merleau-Ponty:

“I do not form a mental picture of my day, it weighs upon me with all its weight, it is still there, and though I may not recall any detail of it, I have the impending power to do so, I still ‘have it in hand.’ . . . Our future is not made up exclusively of guesswork and daydreams. Ahead of what I see and perceive . . . my world is carried forward by lines of intentionality which trace out in advance at least the style of what is to come.”

Then Gallagher adds, “Thus, Merleau-Ponty suggests, I feel time on my shoulders and in my fatigued muscles; I get physically tired from my work; I see how much more I have to do. Time is measured out first of all in my embodied actions as I ‘reckon with an environment’ in which ‘I seek support in my tools, and am at my task rather than confronting it.'”

That last distinction between being at my task rather than confronting it seems particularly significant, especially as it involves the support of tools. Our sense of time, like our sense of place, is not an unchangeable given. It shifts and alters through technological mediation. Melvin Kranzberg, in the first of his six laws of technology, reminds us, “Technology is neither good nor bad; nor is it neutral.” Our technological mediation of space and time is never neutral; and while it may not be “bad” or “good” in some abstract sense, it can be more or less humane, more or less conducive to our well-being. If the future of the Internet is the worldstream, we should perhaps think twice before plunging.

[Illustration: Gelernter’s worldstream]

Borg Complex Case Files 2

UPDATE: See the Borg Complex primer here.

______________________________________

Alright, let’s review. A Borg Complex is exhibited by writers and pundits whenever you can sum up their message with the phrase: “Resistance is futile.”

The six previously identified symptoms of a Borg Complex are as follows:

1. Makes grandiose, but unsupported claims for technology

2. Uses the term Luddite a-historically and as a casual slur

3. Pays lip service to, but ultimately dismisses genuine concerns

4. Equates resistance or caution to reactionary nostalgia

5. Starkly and matter-of-factly frames the case for assimilation

6. Announces the bleak future for those who refuse to assimilate

In an effort to further refine our diagnostic instruments, I am now adding two more related symptoms to the list. The first of these arises from a couple of cases submitted by Nick Carr and is summarized as follows:

7. Expresses contemptuous disregard for the achievements of the past

Consider this claim by Tim O’Reilly highlighted by Carr:

“I don’t really give a shit if literary novels go away. … the novel as we know it today is only a 200-year-old construct. And now we’re getting new forms of entertainment, new forms of popular culture.”

Well, there you have it. This same statement serves to illustrate the second symptom I’m introducing:

8. Refers to historical parallels only to dismiss present concerns.

This symptom identifies historical precedents as a way of saying, “Look, people worried about stuff like this before and they survived, so there’s nothing to be concerned about now.” It sees all concerns through the lens of Chicken Little. The sky never falls. I would suggest that the Boy Who Cried Wolf is the better parable. The wolf sometimes shows up.

To this list of now eight symptoms, we might also add two Causes.

1. Self-interest, usually commercial (ranging from more tempered to crass)

2. Fatalistic commitment to technological determinism (in pessimistic and optimistic varieties)

Now let’s consider some more cases related to education, a field that is particularly susceptible to Borg Complex claims. In a recent column, Thomas Friedman gushed over MOOCs:

“Nothing has more potential to lift more people out of poverty — by providing them an affordable education to get a job or improve in the job they have. Nothing has more potential to unlock a billion more brains to solve the world’s biggest problems. And nothing has more potential to enable us to reimagine higher education than the massive open online course, or MOOC, platforms that are being developed by the likes of Stanford and the Massachusetts Institute of Technology and companies like Coursera and Udacity.”

These are high, indeed utopian, hopes, and in their support Friedman offered two heartwarming anecdotes of MOOC success. In doing so, he committed the error of tech-utopians (modeled on what Pascal took to be the error of Stoicism): believing that the wonderful use to which a technology can be put will be the use to which it is always put.

Friedman wrapped up his post with the words of M.I.T. president, L. Rafael Reif, “There is a new world unfolding and everyone will have to adapt.”

And there it is — Borg Complex.

In a long piece in The Awl (via Nathan Jurgenson), Maria Bustillos discusses an online exchange on the subject of MOOCs between Aaron Bady and Clay Shirky. Take the time to read the whole thing if you’re curious/worried about MOOCs. But here is Shirky on the difference between himself and Bady: “Aaron and I agree about most of the diagnosis; we disagree about the prognosis. He thinks these trends are reversible, and I don’t; Udacity could go away next year and the damage is already done.”

Once again — Borg Complex.

At this juncture it’s worth asking, as Jurgenson did in sending me this last case, “Is he wrong?”

Is it not the case that folks who exhibit a Borg Complex often turn out to be right? Isn’t it inevitable that their vision will play out regardless of what we say or do?

My initial response is … maybe. As I’ve said before, a Borg Complex diagnosis is neutral as to the eventual veracity of the claims. It is more about an attitude toward technology in the present that renders the claims, if they are realized, instances of self-fulfilling prophecy.

The appearance of inevitability is a trick played by our tendency to make a neat story out of the past. Historians know this better than most. What looks like inevitability to us is only a function of zooming so far out that all contingencies fade from view. But in the fine-grain texture of lived experience, we know that there are countless contingencies that impinge upon the unfolding of history. It is also a function of forgetfulness. If only we had a catalog of everything we were told was inevitable that never finally transpired. We only remember the successes, and then we tell their story as if they had to succeed all along, which lends claims of inevitability an undeserved plausibility.

One other consideration: It is always worth asking, Who stands to profit from tales of technological inevitability?

Consider the conclusion of a keynote address delivered at the Digital-Life-Design Conference in Munich by Max Levchin:

“So to conclude: I believe that in the next decades we will see huge number of inherently analog processes captured digitally. Opportunities to build businesses that process this data and improve lives will abound.”

The “analog processes” Levchin is talking about are basically the ordinary aspects of our human lives that have yet to be successfully turned into digital data points. Here’s Carr on the vision Levchin lays out:

“This is the nightmare world of Big Data, where the moment-by-moment behavior of human beings — analog resources — is tracked by sensors and engineered by central authorities to create optimal statistical outcomes. We might dismiss it as a warped science fiction fantasy if it weren’t also the utopian dream of the Max Levchins of the world. They have lots of money and they smell even more: ‘I believe that in the next decades we will see huge number of inherently analog processes captured digitally. Opportunities to build businesses that process this data and improve lives will abound.’ It’s the ultimate win-win: you get filthy rich by purifying the tribe.”

Commenting on the same address and on Carr’s critique, Alan Jacobs captures the Borg-ish nature of Levchin’s rhetoric:

“And Levchin makes his case for these exciting new developments in a very common way: ‘This is going to add a huge amount of new kinds of risks. But as a species, we simply must take these risks, to continue advancing, to use all available resources to their maximum.’ Levchin doesn’t really spell out what the risks are — though Nick Carr does — because he doesn’t want us to think about them. He just wants us to think about the absolutely unquestionable ‘must’ of using ‘all available resources to their maximum.’ That ‘advance’ in that direction might amount to ‘retreat’ in actual human flourishing does not cross his mind, because it cannot. Efficiency in the exploitation of ‘resources’ is his only canon.”

We “must” do this. It is inevitable that this will happen. Resistance is futile.

Except that very often it isn’t. There are choices to be made. A figure no less identified with technological determinism than Marshall McLuhan reminded us: “There is absolutely no inevitability as long as there is a willingness to contemplate what is happening.” The handwaving rhetoric that I’ve called a Borg Complex is resolutely opposed to just such contemplation, and very often for the worst of motives. By naming it my hope is that it can be more readily identified, weighed, and rejected.

Elsewhere McLuhan said, “the only alternative is to understand everything that is going on, and then neutralize it as much as possible, turn off as many buttons as you can, and frustrate them as much as you can … Anything I talk about is almost certainly to be something I’m resolutely against, and it seems to me the best way of opposing it is to understand it, and then you know where to turn off the button.”

Understanding is a form of resistance. So remember, carry on with the work of intelligent, loving resistance where discernment and wisdom deem it necessary.

_______________________________________________

This is the third in a series of Borg Complex posts; you can read previous installments here.

I’ve set up a tumblr to catalog instances of Borg Complex rhetoric here. Submissions welcome.


For Your Consideration – 10

These “For Your Consideration” posts have been few and far between of late. The pace may pick up to something close to weekly, which was my original intent. Or, it may not. We’ll see. If only writing this blog were my full-time occupation. Alas, it is not, barring the unexpected appearance of some foolishly generous patron. That said, I have found it fairly easy to post about a link a day at the Facebook page I set up for The Frailest Thing. So if you’d like a more regular stream of suggested readings related to technology and society feel free to “Like” the page (that phrase persists in sounding rather ridiculous) by clicking the icon just to the right of this text.  I’ll only note that the links there will be more narrowly focused on matters technological while those I include in these posts tend to be a bit more eclectic. So without further ado …

“Synthetic double-helix faithfully stores Shakespeare’s sonnets”:

“DNA packs information into much less space than other media. For example, CERN, the European particle-physics lab near Geneva, currently stores around 90 petabytes of data on some 100 tape drives. Goldman’s method could fit all of those data into 41 grams of DNA.”

“Noted”:

“The remedy for the problems created by information technology is more information technology.”

“In his 1689 De arte Excerpendi, the Hamburg rhetorician Vincent Placcius described a scrinium literatum, or literary cabinet, whose multiple doors held 3,000 hooks on which loose slips could be organized under various headings and transposed as necessary. Two of the cabinets were eventually built, one for Placcius’s own use and one acquired by Leibniz.”

“Google and the future of search: Amit Singhal and the Knowledge Graph”:

“‘We are maniacally focusing on the user to reduce every possible friction point between them, their thoughts and the information they want to find.’ Getting ever closer to Page’s brain implants, in effect.”

“The Pope’s Social Media Guru On @Pontifex’s First Tweet”:

“As Secretary of the Pontifical Council for Social Communications, Tighe is the Pope’s social media guy.”

“It’s almost like the equivalent of the old marketplaces where Jesus went to engage people. That’s where we have to be, with all of its ambiguities and difficulties, because that’s where the people are.”

“Why We Should Memorize Poetry”:

“My late colleague Joseph Brodsky, who died in 1996, used to appall his students by requiring them to memorize something like a thousand lines each semester. He felt he was preparing them for the future; they might need such verses later in life.”

‘I am what I am attached to’: On Bruno Latour’s ‘Inquiry into the Modes of Existence’:

“The Economy — the pride and joy of the Moderns and of the “hard” social sciences — illustrates this well. What a mad construction Latour shows it to be! It is Providence itself, a second Nature, a religion that presides over the distribution of all that is good and evil.”

“Speak, Memory”:

“It is startling to realize that some of our most cherished memories may never have happened—or may have happened to someone else. I suspect that many of my enthusiasms and impulses, which seem entirely my own, have arisen from others’ suggestions, which have powerfully influenced me, consciously or unconsciously, and then been forgotten.”

“SIRI RISING: The Inside Story Of Siri’s Origins — And Why She Could Overshadow The iPhone”:

“This Siri — the Siri of the past — offers a glimpse at what the Siri of the future may provide, and a blueprint for how a growing wave of artificially intelligent assistants will slot into our lives. The goal is a human-enhancing and potentially indispensable assistant that could supplement the limitations of our minds and free us from mundane and tedious tasks.”

The Internet as seen from 1969:

Illuminating Contrasts

Patrick Leigh Fermor was widely regarded as the best travel writer of the last century. He lived what was by all accounts a remarkably full life that included, to mention just two of the more striking episodes, a dalliance with a Romanian princess and the successful kidnapping of a German general in occupied Crete. Leigh Fermor struck up a lasting friendship with that same general when, as if his life were an Errol Flynn film, he completed a line from Horace in Latin begun by the German in the midst of his capture.

Among Leigh Fermor’s storied travels, there was a period of several months in 1957 spent as a guest in a series of monasteries while he worked on his writing. He tells of his time in the monasteries in a small book, considered by many to be his best, A Time to Keep Silence.

Leigh Fermor began with a visit to the Abbey of St. Wandrille de Fontanelle in the northwest of France. While there, he experienced a profound recalibration of the rhythm and pace of life. Of his first days he writes:

“My first feelings in the monastery changed: I lost the sensation of circumambient and impending death, of being by mistake locked up in a catacomb. I think the alteration must have taken about four days. The mood of dereliction persisted some time, a feeling of loneliness and flatness that always accompanies the transition from urban excess to a life of rustic solitude. Here in the Abbey, in absolutely unfamiliar surroundings, this miserable bridge-passage was immensely widened.”

We have vague ideas about the monastic life, when we think of it at all, and we know that it must be quite different from our own, but, Leigh Fermor insists, “only by living for a while in a monastery can one quite grasp its staggering difference from the ordinary life that we lead.”

In his view, “The two ways of life do not share a single attribute; and the thoughts, ambitions, sounds, light, time and mood that surround the inhabitants of a cloister are not only unlike anything to which one is accustomed, but in some curious way, seem its exact reverse. The period during which normal standards recede and the strange new world becomes reality is slow, and, at first, acutely painful.”

“To begin with,” he continues, “I slept badly at night and fell asleep during the day, felt restless alone in my cell and depressed by the lack of alcohol, the disappearance of which had caused a sudden halt in the customary monsoon.”

He then goes on to explain a remarkable alteration in his circadian rhythm. It’s worth quoting him at length at this point. Consider this carefully:

“The most remarkable preliminary symptoms were the variations of my need of sleep. After initial spells of insomnia, nightmare and falling asleep by day, I found that my capacity for sleep was becoming more and more remarkable: till the hours I spent in or on my bed vastly outnumbered the hours I spent awake; and my sleep was so profound that I might have been under the influence of some hypnotic drug. For two days, meals and the offices in the church — Mass, Vespers and Compline — were almost my only lucid moments. Then began an extraordinary transformation: this extreme lassitude dwindled to nothing; night shrank to five hours of light, dreamless and perfect sleep, followed by awakenings full of energy and limpid freshness. The explanation is simple enough: the desire for talk, movements and nervous expression that I had transported from Paris found, in this silent place, no response or foil, evoked no single echo; after miserably gesticulating for a while in a vacuum, it languished and finally died for lack of any stimulus or nourishment. Then the tremendous accumulation of tiredness, which must be the common property of all our contemporaries, broke loose and swamped everything. No demands, once I had emerged from that flood of sleep, were made upon my nervous energy: there were no automatic drains, such as conversation at meals, small talk, catching trains, or the hundred anxious trivialities that poison everyday life. Even the major causes of guilt and anxiety had slid away into some distant limbo and not only failed to emerge in the small hours as tormentors but appeared to have lost their dragonish validity.”

Reading Leigh Fermor’s account of his transition from the rhythms of ordinary life to the rhythms of monastic life reminded me of the testimonials one hears every so often from those who have undertaken a digital fast, a technology Sabbath, or a digital detox. There is, after all, something rather ascetic about most of that terminology. I was especially reminded of a 2010 story about five neuroscientists who made an experiment of a week-long rafting trip in Utah. David Strayer, one of the scientists in the party, coined the term “third-day syndrome” to describe the shifts in attitude and behavior that begin to manifest themselves after three days of being “unplugged” — quite near the time Leigh Fermor suggests it took him to acclimate to life in the monastery, about four days.

One of the neuroscientists added, “Time is slowing down.” Another wondered out loud: “If we can find out that people are walking around fatigued and not realizing their cognitive potential … What can we do to get us back to our full potential?”

Leigh Fermor put this point more eloquently when he described “the tremendous accumulation of tiredness, which must be the common property of all our contemporaries.” Remember, he was writing in the late 1950s.

Whatever we think of the peculiar forms of fatigue that might be associated with the character of “digital life,” Leigh Fermor’s account of his journey into the silence of the monastic houses more than half a century ago reminds us that, if our bodies and minds are to be trusted, there has been something amiss with the way we order our lives since long before the advent of the Internet and smartphones.

I write that with some trepidation. Too often the identification of some historical precedent or antecedent to a modern malaise leads some to conclude something like, “Well, you see, people in the past have felt this too and they survived, so this must not really be a problem.” There ought to be a name for that sort of fallacy. Perhaps there is one and I’m unaware of it. I should, in fact, list the deployment of this line of reasoning as a symptom of a Borg Complex. In any case, it is shallow solace.

I rather think that such comparisons point us not to non-problems, but to perennial problems that take historically specific form and that each generation must address for itself. Leigh Fermor’s monastic retreat provided a ground against which the figure of mid-twentieth century urban life came into sharper relief.

Spans of time spent materially disconnected from the Internet may serve a similar function today. They may heighten our understanding of what is now the character of our ordinary life. It is its ordinariness, after all, that keeps us from seeing it clearly. Without a contrasting ground the figure does not appear at all, and the contrast may show us where we have drifted into patterns and habits that work against our well-being.

Writing, Academic and Otherwise

Passing through the process of academic professionalization is, in part, not unlike the process of learning a new language. It is, for example, the sort of process that might lead me to write discourse in place of language to conclude that first sentence. Learning this new language can be both an infuriating and exhilarating experience. At first, the new language mystifies, baffles, and frustrates; later, if one sticks to it and if this new language is not utter nonsense (as it may sometimes be), there is a certain thrill in being able to see and name previously unseen (because unnamed) and poorly understood dimensions of experience.

I suspect the younger one happens to be when this initiating process takes place, the more zealously one may take to this new language, allowing it to become the grid through which all experience is later comprehended. This is, on the whole, an unfortunate tendency. Another unfortunate tendency is that by which, over time, academics forget that theirs is a learned and often obscure language which they acquired only after months and possibly years of training. This is easily forgotten, perhaps because it is only metaphorically a new language. It is, if you are American, still English, but a peculiarly augmented (or deformed, depending on your perspective) form of the language.

This means, usually, that when academics (or academics in training) write, they write in a way that might not be easily assimilated by non-academics. This is, of course, entirely unrelated to intellect or ability (a point that is sometimes missed). A brilliant Spaniard, for instance, is no less brilliant for having never taken the time to learn Swahili. This is also a function of the tribal quality of academic life. One gets used to operating in the language of the tribe and sometimes forgets to adjust accordingly when operating in other contexts.

Again, I think this is very often simply a matter of habit and forgetfulness, although it is sometimes a matter of arrogance, self-importance, and other such traits of character.

I mention all of this because, if I were asked to verbalize why I write this blog, I would say that it was in part to translate the work of academics, critics, and theorists into a more accessible form so that their insights regarding the meaning and consequences of media and technology, so far as those insights were useful, might be more widely known. After all, the technologies I usually write about affect so many of us, academics and non-academics alike. Anyone who cares to think about how to navigate these technologies as wisely as possible should be able to encounter the best thinking on such matters in a reasonably accessible form. I don’t know, maybe there is a certain naïveté in that aspiration, but it seems worth pursuing.

I’m fairly certain, though, that I don’t always achieve this goal that I half-consciously maintain for what I do here. I’m writing this post mostly to remind myself of this aspiration and renew my commitment to it.

I should be clear, I’m talking neither about dumbing down what there is to know nor am I suggesting anything like condescension ought to be involved. The challenge is to maintain the depth of insight and to resist the over-simplification of complexity while at the same time avoiding the characteristics of academic language that tend to make it inaccessible. It’s a matter of not ignoring the non-academic reader while also taking them seriously.

I’m reminded of some comments that David Foster Wallace made regarding the purposes of literature. I’ve cited this passage before, quite some time ago, and it has stuck with me. It’s a bit long, but worth reading. Wallace is discussing literature with the interviewer, David Lipsky, and they are debating the relative merits of traditional literature and less traditional, more avant-garde writing:

Huh.  Well you and I just disagree.  Maybe the world just feels differently to us.  This is all going back to something that isn’t really clear:  that avant-garde stuff is hard to read.  I’m not defending it, I’m saying that stuff — this is gonna get very abstract — but there’s a certain set of magical stuff that fiction can do for us.  There’s maybe thirteen things, of which who even knows which ones we can talk about.  But one of them has to do with the sense of, the sense of capturing, capturing what the world feels like to us, in the sort of way that I think that a reader can tell “Another sensibility like mine exists.”  Something else feels this way to someone else.  So that the reader feels less lonely.

There’s really really shitty avant-garde, that’s coy and hard for its own sake.  That I don’t think it’s a big accident that a lot of what, if you look at the history of fiction — sort of, like, if you look at the history of painting after the development of the photograph — that the history of fiction represents this continuing struggle to allow fiction to continue to do that magical stuff.  As the texture, as the cognitive texture, of our lives changes.  And as, um, as the different media by which our lives are represented change.  And it’s the avant-garde or experimental stuff that has the chance to move the stuff along.  And that’s what’s precious about it.

And the reason why I’m angry at how shitty most of it is, and how much it ignores the reader, is that I think it’s very very very very precious.  Because it’s the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.

Maybe it is ill-advised to make this comparison, but I think what Wallace has to say here, or at least the spirit of it, can apply to academic work as well. It can also be a way of representing what it feels like to be alive. I tend to hold literature in rather high esteem, so I don’t think that non-fiction can really replicate the experience of more literary writing, but it can be useful in its own way. It can help make sense of experience. It can generate self-understanding. It can suggest new possibilities for how to make one’s way in the world.

It’s too late for new year’s resolutions, but I’m hoping to keep this goal more clearly in focus as I continue to write on here. You can tell me, if you’re so inclined, how well I manage. Cheers.