Salon has an interview up with David Weinberger, a scholar at Harvard’s Berkman Center for Internet & Society and the author of Too Big to Know. Weinberger believes that the Internet is revolutionizing knowledge by creating the “networked fact”:
“Over the past couple hundred years, we’ve had this idea that knowledge is composed of facts about the world, and together we are engaged in this multigenerational enterprise of gathering facts and posting them, and ultimately we’ll have a complete picture of the world. That view of facts as the irreducible atoms of knowledge has some benefit, but we’re seeing a different type of fact emerge on the Net as well. Traditional facts are still there. Facts are facts. But we’re seeing organizations of all sorts releasing their data, their facts, onto the Web as huge clouds of triples [another word for linked data]. They’re a connection of two ideas through some relationship — that’s why they’re called triples — but not only can they be linked together by computers, they themselves consist of links. Each of the elements of a linked atom is a pointer to some resource that disambiguates it and explains what it is.”
Okay, got that? Underwhelmed? You must not be understanding the import of the shift from “traditional facts” to “triples”. Let’s try this again with a concrete illustration:
“OK, so, if the triple is “Edmonton is in Canada,” ideally each of those should link to some other spot on the Web that explains exactly which Edmonton, because there’s probably more than one, along with which Canada (though there’s probably only one). And “is in” is a very ambiguous statement [Clinton nods], so you would point to some vocabulary that defines it for geography. Each of these little facts is designed not only to be linked up by computers, but it itself consists of links. It’s a very different idea than that facts are bricks that lay a firm foundation. The old metaphor for knowledge was architectural and archaeological: foundations, bricks. Now we have clouds.”
Okay, got it? Now we have clouds! Clouds … they’re here. We have them, now.
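For readers left puzzled by the jargon rather than the grandiosity: the mechanics are easy enough to sketch. A triple really is just three links — subject, predicate, object — each a URI pointing somewhere that disambiguates it. Here is a minimal illustration in Python (the DBpedia-style URIs are my own illustrative choices, not drawn from Weinberger):

```python
# A linked-data "triple" is subject, predicate, object, where each
# element is itself a link (URI) pointing to a resource that
# disambiguates it. Illustrative DBpedia-style URIs, not from the book.
triple = (
    "http://dbpedia.org/resource/Edmonton",   # which Edmonton, exactly
    "http://dbpedia.org/ontology/country",    # which sense of "is in"
    "http://dbpedia.org/resource/Canada",     # which Canada
)

subject, predicate, obj = triple

# Because every element is a link, a machine can follow each URI to
# resolve the ambiguities Weinberger describes.
print(all(part.startswith("http://") for part in triple))  # True
```

That is the whole trick: facts that point at their own definitions. Whether that amounts to a revolution in knowledge is, of course, the question at issue.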
I really thought I was missing something until I came across Evgeny Morozov’s review of Weinberger’s book. Morozov is also unimpressed. He reminds us that Weinberger’s claims are not exactly original. Lyotard already made similar claims in his 1979 book, The Postmodern Condition. I would add that among the easier to digest elements in the work of the late Gilles Deleuze was the idea that knowledge was structured like a rhizome rather than a tree. (It is unclear whether he ever said, “Now we have rhizomes.”)
Moreover, Morozov is enough of a stickler for traditional facts to point out that what we mean by knowledge depends a great deal on our epistemic context. In his view, Weinberger’s thesis falters because he speaks of knowledge and facts as abstractions and fails to distinguish between contexts in which the truthfulness of “knowledge” counts and contexts in which it does not.
Judging from certain comments in the interview, Morozov seems on target when he claims that “Weinberger wants to be the Marshall McLuhan of knowledge management.” Here is Weinberger on knowledge and its medium:
“With the new medium of knowledge — the Internet — knowledge not only takes on properties of that medium but also lives at the level of the network. So rather than simply trying to cultivate smart people, we also need to be looking above the level of the individual to the network in which he or she is embedded to see where knowledge lives.”
It’s somewhat unfair to ask for too much depth in an interview format, but I’d be hard pressed to unpack anything meaningful out of those sentences. When McLuhan comes off as vague and gnomic you have the sense that you’re not getting something deep; in this case I have the sense that there is not much to get. A little later on Weinberger offers this further reflection:
“In 1988, Russell Ackoff, an organizational theorist, proposed a pyramid that has become really standard in many business environments. You have data at the bottom, then information, and then knowledge — and then at the top, wisdom, as if wisdom is the reduced set of knowledge. The idea is in line with our traditional idea of knowledge, which is based on the idea that there’s too much to know, there’s more than can fit into any skull, so we need to come up with strategies to deal with it. And that pyramid is the information age’s elaboration of this. In every step you get quality and value by reducing what was at lower steps, but we’ve had a reductive sense of knowledge for about 2,500 years.”
Again, after reading that a time or two, I’m still not sure what the point is. But I do know that Ackoff did not come up with that schema. In his mid-1930s composition, “Choruses from ‘The Rock'”, T.S. Eliot wrote,
“Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?”
Now that makes sense.
One last point: it is not a reduction that is entailed by the movement from data to information to knowledge and then wisdom. It is an enhancement based on the progressive derivation and application of meaning. This is a very different activity from the gradual reduction of a vast field of data to a more manageable set, and it is an activity that cannot be abstracted to some nebulous realm above individuals. It lives concretely in embodied and embedded experience.
Update: Perhaps the interview format did not suit Mr. Weinberger. In fairness, here is an edited excerpt from the book at The Atlantic that at least has the virtue of making sense: “To Know, But Not Understand.”
“If the idea of progress has the curious effect of weakening the inclination to make intelligent provision for the future, nostalgia, its ideological twin, undermines the ability to make intelligent use of the past.” — Christopher Lasch, The True and Only Heaven
Memory and nostalgia have been twin recurring themes running through various posts this past year. Both of these words name ways of being with the past. Memory generally names what we take to be a healthy ordering of our relationship to the past, while nostalgia names whatever is generally taken to be a disordered form of relating to the past. I’ve long sensed, however, without necessarily having thought much about it, that some of what is casually and dismissively labeled nostalgia may in fact belong under the category of healthy remembering or, alternatively, that the nostalgic longings in question at the very least signaled a deeper disorder of which nostalgia is but a symptom.
If we step back and look at some of our behaviors and certain trends that animate popular culture, we might conclude that we are in the thrall of some sort of madness with regard to time and the past. We obsessively document ourselves, visually and textually, creating massive archives stuffed with memory, virtual theaters of memory for our own lives. Facebook’s evolving architecture reflects this obsession as it now aims to make it easier to chronologically and (geo)spatially order (a database gesture toward narrative) and access our stored memories. Simultaneously, vintage and retro remain the stylistic order of the day. Hyperrealistic period dramas populate our entertainment options. T-shirt design embraces the logos and artifacts of the pre-digital past. Social critics suggest that we are aesthetically stuck, like a vinyl record (vinyl being, incidentally, quite hip again) skipping incessantly.
What do we make of it? How do we understand all of these gestures, some of them feverish, toward remembering and the past? How can we discern where memory ends and nostalgia begins? For that matter, how do we even define nostalgia?
Christopher Lasch, who raised many of these same sorts of questions throughout his career, particularly in The True and Only Heaven, provides some helpful insights and categories to help us along the path to understanding. But before considering Lasch’s perspective, let me take just one more pass at clarifying the main issues that interest me here.
Approaching nostalgia we need to distinguish between the semantic and the ontological dimensions of the issue. The semantic questions revolve around the use of the word nostalgia; the ontological questions revolve around the status of the sensibilities to which the word is applied as well as their sources and roots. It would seem that the semantic question has been more or less resolved so that the connotations of the word are uniformly negative (more on this later). Nostalgia, in other words, is typically a word of opprobrium. This being the case, then, the question becomes whether or not the word is justly applied and such judgments require us to define what constitutes healthy and unhealthy, ordered and disordered modes of relating to the past. Coming back to Lasch, we can see what help he offers in thinking through these questions.
First, for Lasch, nostalgia carries entirely negative connotations. He employs the term to name disordered relationships to the past. So, in his view, nostalgia prevents us from making intelligent use of the past because it is an ahistorical phenomenon.
“Strictly speaking, nostalgia does not entail the exercise of memory at all, since the past it idealizes stands outside time, frozen in unchanging perfection. Memory too may idealize the past, but not in order to condemn the present. It draws hope and comfort from the past in order to enrich the present and to face what comes with good cheer. It sees past, present, and future as continuous. It is less concerned with loss than with our continuing indebtedness to a past the formative influence of which lives on in our patterns of speech, our gestures, our standards of honor, our expectations, our basic disposition toward the world around us.”
In this paragraph, by contrasting it with memory, Lasch lays out the contours of nostalgia as he understands it:
a. Nostalgia is primarily interested in condemning the present.
b. It fails to offer hope or otherwise enrich the present.
c. It sunders the continuity of past, present, and future.
d. It is focused on loss.
e. It fails to recognize the ongoing significance of the past in the present.
Boucher, An Autumn Pastoral, 1749
Lasch goes on to offer a genealogy of the various sources of contemporary nostalgia beginning with the historicizing of the pastoral sensibility and proceeding through the Romantic idealization of childhood, America’s romanticization of the West and later the small town, and finally nostalgia’s coming into self-awareness as such in the 1920s.
The recurring theme in these earlier iterations of the nostalgic sensibility is the manner in which, with the exception of childhood, an initially spatial displacement — of the countryside for example — becomes temporalized. So, for instance, the long-standing contrast between town and country that animated pastoral poetry since the classical age became, in the eighteenth and nineteenth centuries with the advent of industrialization, a matter not of here and there, but of then and now. The First World War had a similar effect on the trope of childhood’s lost innocence by marking off the whole of history before the war as a time of relative innocence compared with the generalized loss of innocence that characterized the time after the war.
The First World War, in Lasch’s telling of nostalgia’s history, also gave a specific form to a tendency that first appears in the early nineteenth century: “In an ‘age of change,’ as John Stuart Mill called it in his 1831 essay ‘The Spirit of the Age,’ the ‘idea of comparing one’s own age with former ages’ had for the first time become an inescapable mental habit; Mill referred to it as the ‘dominant idea’ of the nineteenth century.”
It would seem to me that this tendency is not entirely novel in Mill’s day; after all, Renaissance culture made much of its contrast with the so-called Dark Ages and its recovery of classical civilization. But it seems safe to credit Mill’s estimation that in his day it becomes for the first time “an inescapable mental habit.” This would seem to correspond roughly with the emergence of historical consciousness and the discipline of history in its modern form — which is to say as a “science” of the past rather than as a branch of moral philosophy.
Following the First World War, this comparative impulse took on a specific form focused on the generation as the preferred unit of analysis. First, Lasch writes, “For those who lived through the cataclysm of the First World War, disillusionment was a collective experience — not just a function of the passage from youth to adulthood but of historical events that made the prewar world appear innocent and remote.” He then notes that it was no surprise that “the concept of the generation first began to influence historical and sociological consciousness in the same decade, the twenties, in which people began to speak so widely of nostalgia.”
It was in the 1920s, according to Lasch, that nostalgia became aware of itself. In other words, it was not until the 1920s that the semantic problem we noted earlier appears, since it was not until then that the term nostalgia was applied widely to the varieties of responses to loss that had long been expressed in literature and the visual arts. Prior to the 1920s, nostalgia was mostly a medical term linked to the psychosomatic symptoms associated with severe, literal homesickness.
According to Lasch, by the mid-twentieth century, “History had come to be seen as a succession of decades and also as a succession of generations, each replacing the last at approximately ten year intervals. This way of thinking about the past had the effect of reducing history to fluctuations in public taste, to a progression of cultural fashions in which the daring advances achieved by one generation become the accepted norms of the next, only to be discarded in their turn by a new set of styles.”
This seems just about right. You can test it on yourself. First, consider our habit of talking about generations: baby-boomers, Y, X, millennials. Then, think back through the twentieth century. How is your memory of the period organized? I’m willing to bet that yours, as mine, is neatly divided up into decades even when the decades are little more than arbitrary with regard to historical development. And, further reinforcing Lasch’s point, what is the first decade for which you have a ready label and set of associations? I’m again willing to bet it is the 1920s, the “Roaring Twenties” of flappers, jazz, F. Scott Fitzgerald, and the stock market crash. Thirties: depression. Forties: World War II. Fifties: Ike and Beaver. Sixties: sex, drugs, and rock ‘n’ roll. Seventies: Nixon and disco. Eighties: big hair and yuppies. And so on.
(Interestingly, my sense is that after the 1990s we seem to have a harder time with the decade scheme, perhaps because we are still so very fresh. I wonder, though, if we will have as neat a caricature of the first and subsequent decades of the twenty-first century as we do for their immediate predecessors.)
But this manner of thinking evidences the chief problem Lasch identifies with nostalgia. It has the effect of hermetically sealing off the past from the present. It represents the past as a series of discrete eras that, once superseded inevitably and on schedule by the next, cease to affect the present. Moreover, “Once nostalgia became conscious of itself, the term rapidly entered the vocabulary of political abuse.” For a society still officially allied to a progressivist ideology (as in Progress, not necessarily progressive politics), the charge of nostalgia “had attained the status of a political offense of the first order.” And here again is the semantic problem. When a word becomes a lazy term of abuse, then it is in danger of swallowing up all sorts of realities that for whatever reason do not sit well with the person doing the labeling.
So as Lasch begins to draw his social history of nostalgia to a close with the 1960s, “denunciations of nostalgia had become a ritual, performed, like all rituals, with a minimum of critical reflection.” In his 1965 The Paranoid Style in American Politics, for example, Richard Hofstadter “referred repeatedly to the ‘nostalgia’ of the American right and of the populist tradition from which it supposedly derived.” And yet, the “nostalgia wave of the seventies” was still ahead:
“Time, Newsweek, US News and World Report, Saturday Review, Cosmopolitan, Good Housekeeping, Ladies’ Home Journal, and the New Yorker all published reports on the ‘great nostalgia kick.’ ‘How much nostalgia can America take?’ asked Time in 1971. The British journalist Michael Wood, citing the revival of the popular music of the fifties, the commercial appeal of movies about World War II, and the saturation of the airwaves with historical dramas — ‘Upstairs, Downstairs,’ ‘The Pallisers,’ ‘The Forsyte Saga’ — declared, ‘The disease, if it is a disease, has suddenly become universal.'”
Maggie Smith in Downton Abbey
Note for just a moment how easy it would be to update the British journalist’s comments to fit contemporary circumstances. Just add Hipster Revivalism and replace the television dramas with Mad Men, Boardwalk Empire, Downton Abbey, and Pan Am. (See also Chuck Klosterman and Kurt Andersen.) More on this in just a moment, but first back to Lasch.
Lasch concludes his analysis of memory and nostalgia by elaborating on the idea that “Nostalgia evokes the past only to bury it alive.” From this perspective, nostalgia and the ideology of Progress have a great deal in common. They both evince “an eagerness to proclaim the death of the past and to deny history’s hold over the present … Both find it difficult to believe that history still haunts our enlightened, disillusioned maturity.”
He goes on to add that the “nostalgic attitude” and belief in progress also share “a tendency to represent the past as static and unchanging, in contrast to the dynamism of modern life … Notwithstanding its insistence on unending change, the idea of progress makes rapid social change appear to be uniquely a feature of modern life. (The resulting dislocations are then cited as an explanation of modern nostalgia.)”
Regarding that last parenthetical statement, I plead guilty as charged. I’m not sure, however, that this is entirely off the mark, particularly when we distinguish between semantic (we might even say rhetorical) matters and the underlying phenomenon. Lasch himself points to the connections between industrialization, urbanization, and the First World War and the history of the nostalgic sensibility. The psychic consequences of these phenomena were not illusory. What Lasch is concerned about, however, is the manner in which these psychic consequences were ultimately interpreted and filtered through the language of nostalgia.
The danger, in his view, is that we fail to reckon with the persistence of history. By way of contrast, Lasch offers us Anthony Brandt’s comments on historical memory and nostalgia. Lasch summarizes Brandt’s reflections on Henry Ford’s Greenfield Village, colonial Williamsburg, and Disney’s “Main Street” this way: “the passion for ‘historical authenticity’ seeks to recapture everything except the one thing that matters, the influence of the past on the present.” Real knowledge of the past, in Brandt’s view, “requires something more than knowing how people used to make candles or what kind of bed they slept in. It requires a sense of the persistence of the past: the manifold ways in which it penetrates our lives.”
Disney’s Main Street
This is Lasch’s chief concern and he is certainly right about it. If we define nostalgia as a dehistoricizing impulse that undermines our ability to think about the past and its enduring consequences, then it is certainly to be resisted. Lasch is advocating a way of being with the past that takes it seriously by refusing to romanticize it and by recognizing its continuing influence on the present. In this sense, he is advancing a posture similar to the one I attempted to stake out in a review of Woody Allen’s meditation on nostalgia, Midnight in Paris: we are not to live in the past, but we are to live with it. It is a position that is neatly summed up by Faulkner’s line, “The past is never dead. It’s not even past.”
But how does Lasch’s analysis help us understand the contemporary burst of nostalgia? For one thing, it reminds us that such bursts have a history. Ours is not necessarily novel. However, recognizing that a variety of sensibilities and responses are grouped, sometimes indiscriminately, under the heading of nostalgia, it is worth asking how our present fixations depart from their antecedents.
I’m going to venture this schema in response. There appear to be two prior large-scale waves of sensibilities that have subsequently been identified as nostalgic, the first running throughout the middle part of the nineteenth century and another materializing subsequent to the First World War. These waves, at the expense of mixing metaphors, also yielded subsequent ripples of nostalgia that shared their essential orientation. The former wave and its ripples appear to have been generated by spatial or physical displacements related to industrialization and urbanization. The latter appears to have been generated by a temporal dislocation occasioned by the First World War that created a psychic chronological rupture. The semantic history of nostalgia tracks with this two-step generalization. Places were idealized, and then times.
So what are we idealizing today? I’m suggesting that it is neither a place nor a time (even though it is necessarily related to the chronological past). Contemporary nostalgia is fixated on the materiality of the past. Take Mad Men, for example: it’s not for the time period that we are nostalgic; the series, after all, gives us a rather bleak view of the era. No, it is for the stuff of the era that we are nostalgic — the fedoras, the martini glasses, the furniture, the typewriters, the ordinary accouterments of daily life. Consider all of those lingering close-ups on the objects of the past that are characteristic of the early seasons. Remember too how often such shows are praised for their “attention to detail,” which is to say for the way they capture the material conditions of the era.
The same holds true for the hipster revivalism linked above. It is focused on the equipment of the past, not its values or its places. This is why Pottery Barn offers a faux rotary phone. It’s why vinyl records are now on sale again at big box retailers like Best Buy and Target. Our nostalgia is neither spatial nor temporal; it is tactile. And it is a response — conscious or not, ill-advised as it may be — to digital/virtual culture.
Taking one last cue from Lasch, nostalgia for the material risks missing the ways in which the material persists in its significance in much the same way that nostalgia for the past misses the way the past persists into the present. The danger is that we begin to think about life in terms of immaterial abstractions like “the cloud” and “Information” or false dichotomies such as the “online/offline” distinction while ignoring the underlying, persistently material realities. It also threatens to distract us from the persistence (and significance) of technologies that do not fit neatly in the digital category. This same material nostalgia which is blind to the materiality of the present is what leads us to myopically and misleadingly focus analysis of contemporary events on abstractions such as the “Twitter Revolution” or “social media campaigns.” It is not that these are insignificant, it is rather that the rhetoric obscures the ongoing significance of the material realities. It fashions a false dichotomy between the virtual present and the material past. And our thinking will be all the worse for it.
To rephrase Lasch, tactile nostalgia undermines our ability to make intelligent use of the material.
_________________________________________________
Update: Another instance of tactile nostalgia — The Book Club.
Early in the life of this blog, I linked to a very useful post by Adam Thierer at Technology Liberation Front mapping out a spectrum of attitudes to the Internet ranging from optimism to pessimism with a pragmatic middle in between. The post helpfully positioned a wide variety of contemporary writers and summarized their positions on the social consequences of the Internet. It remains a great starting point for anyone wanting to get their bearings on the public debate over the Internet and its consequences.
Adam subsequently included a link to my post on technology Sabbaths in his list of resources for further reading and he has since then, and on more than one occasion, been generous enough to kindly mention this blog via Twitter. He’s now writing regularly at Forbes and offers excellent commentary on the legal and regulatory issues related to Internet policy.
On the aforementioned spectrum, I believe that Adam positioned himself on the optimistic side of the pragmatic middle. I’ve generally been content to occupy the more pessimistic side. Precisely because of this propensity, I make a point of reading folks like Adam for balance and perspective. I was not surprised then to read Adam’s recent upbeat article, “10 Things Our Kids Will Never Worry About Thanks to the Information Revolution.”
I trust he won’t mind if I offer a view from just the other side of the pragmatic middle. This will work best if you’ve read his post; so click through, give it a read, and then come back to consider my take below.
So here are my more contrarian/pessimistic assessments. The boldfaced numbered items are Adam’s list of things kids will never worry about thanks to digital technology. My thoughts are below each.
1. Taking a typing class.
I took a typing class in ninth grade and as much as I disliked it at the time, I’m extremely grateful for it now. It made college and grad school much less arduous, and has served me well given the countless professional uses of typing (on a computer, of course). Kids may figure out a rough and ready method of typing on their own, but in my experience, this is not nearly as efficient as mastering traditional typing skills. Unless the PC vanishes, expert typing skills will remain an advantage.
2. Paying bills by writing countless checks.
I too write very few checks and have been using online bill pay for years now. But here’s what’s lost: the sense of money as a limited resource that derives from both the use of cash on hand and the arithmetical practice of balancing a checkbook.
3. Buying an expensive set of encyclopedias
I remember rather fondly when the encyclopedia set arrived at my house; over the years I spent quite some time with it. Yes, I was that kid. Never mind that, this is a point that easily segues into the larger debate about digital media and print. Too much to reduce to a brief note; suffice it to say that digital texts have not exactly been linked to a renaissance of secondary education. That price tag for the set was a bit stiff though, which is probably why I got a World Book set instead of the gold-standard Britannica.
4. Using a pay phone or racking up a big “long distance” bill
No argument on the big bill, but consider that what has been lost here is the salubrious social instinct that conceived of the enclosed phone booth in the first place.
5. Having to pay someone else to develop photographs
Hard to argue against having to pay less, but consider the psychic consequences of the digital camera. We’re obsessive self-documenters now and have never met a scene that wasn’t a picture waiting to happen. And when was the last time you actually printed out digital photos anyway? Interestingly, vintage photographic equipment is making a comeback in some circles.
6. Driving to the store to rent a movie
Gone with it are the little rites of passage that children enjoy like being allowed to walk or ride the bike to the video store for the first time and the subsequent little adventures that such journeys could bring. Of course, it’s not just about the video store. But the trajectory here suggests that we’ll not need to leave our house for much. I think it was Jane Jacobs who noted the necessary socializing influence of the countless personal and face-to-face micro-encounters attending life in the city. Suburbs diminished their number; the convenience of the Internet has reduced them even further.
7. Buying / storing music, movies, or games on physical media
These same objects also functioned as depositories of memories … when they had their own unique, tangible form. Lost also is the art of giving as a gift the perfect album or film that fits your friend and their circumstances just right. No need; an iTunes gift card will do.
8. Having to endlessly search to find unique content
Adam tells us exactly what is lost: “I will never forget the day in the early 1980s when, after a long search, I finally found a rare Led Zeppelin B-Side (“Hey Hey What Can I Do”) on a “45” in a dusty bin at a small record store. It was like winning the lottery!”
9. Sending letters
Guess I’m a nostalgic old-timer. But seriously, tell me who wouldn’t get a thrill from receiving a letter from a friend? Lost is the expectation and the joy of waiting for and receiving a letter. Lost too is the relationship to time entailed by the practice of letter writing and the patience it cultivated.
10. Being without the Internet & instant, ubiquitous connectivity
Lost is solitude, introspection, uninterrupted time with others. But in fairness, ubiquitous connectivity also brings that unique blend of anxiety and obsessiveness that the expectation of instant communication engenders when, for whatever reason, it is not immediately successful.
Admittedly, this is hardly intended to be a rigorous sociological analysis of digital culture. The “Never” in the title is hyperbolic, of course. Many of these losses are not total and they are balanced by certain gains. But as I wrote my more pessimistic rejoinders, I did notice a pattern: the tendency to collapse the distance between desire and its fulfillment. We do this either by reducing the distance in time or else the distance in effort. Make something effortless and instant and you simultaneously make it ephemeral and trivial. The consequence is the diminishment of the satisfaction and joy that attends the fulfillment.
If this is true and this pattern holds, then what our kids may never know, or at least will know less of, thanks to the information revolution, is abiding and memorable joy and the satisfactions that attend delayed gratification and effort expended toward a desired end.
“He has perpetually occasion to rely on ideas which he has not had leisure to search to the bottom; for he is much more frequently aided by the opportunity of an idea than by its strict accuracy; and, in the long run, he risks less in making use of some false principles, than in spending his time in establishing all his principles on the basis of truth. The world is not led by long or learned demonstrations; a rapid glance at particular incidents, the daily study of the fleeting passions of the multitude, the accidents of the time, and the art of turning them to account, decide all its affairs.”
Conservative diatribe against the tenor of political discourse?
Traditionalist invective against new media and the decline of journalism?
Reactionary complaint against the culture of blogs and social media?
Curmudgeonly rant against all things digital?
Not exactly.
This was from the pen of Alexis de Tocqueville writing in the early nineteenth century about the habits of mind induced by America’s democratic society.
Apparently ours was a temperament waiting for a medium to match.
Alexis de Tocqueville arrived in America on May 10, 1831. He departed on February 20, 1832. He spent 271 days in the United States; 15 others were spent in Canada. Then, over the next several years, he wrote one of the most perceptive interpretations of American life ever written, Democracy in America.
Tocqueville touched on almost every conceivable facet of American life with a view to understanding the character of American democracy, which was still something of a novelty and certainly not guaranteed to endure. He necessarily generalized from his limited experience, but he did so brilliantly. Reading Democracy in America today, one is left with the impression that he saw through to the soul of the young republic, and many of his observations remain compelling.
The tenth chapter of the second volume (Democracy in America was a hefty work) is titled “Why The Americans Are More Addicted To Practical Than To Theoretical Science.” You may recall from Leo Marx’s essays on the emergence of the concept of technology that in Tocqueville’s day technology had not yet taken on the multifaceted semantic role it serves today. Other words and phrases, such as practical science, were invoked to discuss the various realities that we would group under technology. In this chapter, then, we find Tocqueville turning his attention to the American fascination and facility with technology.
As with all of the particularities of American society that Tocqueville discusses, he is mostly concerned to understand the consequences of the democratic political culture on the topic under consideration. At the outset of this chapter, for example, he notes that, “Equality begets in man the desire of judging of everything for himself: it gives him, in all things, a taste for the tangible and the real, a contempt for tradition and for forms.” And so it is with the culture of American science and technology.
To begin with, Tocqueville believes that the work of science may be divided into three types of endeavor:
“The first comprises the most theoretical principles, and those more abstract notions whose application is either unknown or very remote. The second is composed of those general truths which still belong to pure theory, but lead, nevertheless, by a straight and short road to practical results. Methods of application and means of execution make up the third.”
He concludes that Americans are quite good at the third, do a passable job at the second, and are least adept, though not altogether incompetent, in the first. As best as I can judge, assuming the validity of his categories, this was not far off the mark. Tocqueville connects this with the absence of a leisured class:
“Nothing is more necessary to the culture of the higher sciences, or of the more elevated departments of science, than meditation; and nothing is less suited to meditation than the structure of democratic society. We do not find there, as amongst an aristocratic people, one class which clings to a state of repose because it is well off; and another which does not venture to stir because it despairs of improving its condition.”
Instead Americans are promiscuously active and, according to Tocqueville, “In the ages in which active life is the condition of almost everyone, men are therefore generally led to attach an excessive value to the rapid bursts and superficial conceptions of the intellect; and, on the other hand, to depreciate below their true standard its slower and deeper labors.”
Moreover, Tocqueville adds, “A desire to utilize knowledge is one thing; the pure desire to know is another.” Americans are decidedly in the former camp. Put otherwise, all of this means that America is much more likely to produce a Thomas Edison than an Albert Einstein. As a generalization, this seems about right still. I’m hard-pressed to name theoreticians whom Americans presently hold in high regard. Perhaps Stephen Hawking, but then again, he is not American. We venerate the likes of Steve Jobs and Bill Gates instead. The inventor-entrepreneur is still the preferred American icon.
Further on, Tocqueville once again leads with his familiar rough-and-ready brand of sociological analysis:
“The greater part of the men who constitute these nations are extremely eager in the pursuit of actual and physical gratification. As they are always dissatisfied with the position which they occupy, and are always free to leave it, they think of nothing but the means of changing their fortune, or of increasing it.”
From this premise, Tocqueville then launches into the following observations which strike me as more true than not:
“To minds thus predisposed, every new method which leads by a shorter road to wealth, every machine which spares labor, every instrument which diminishes the cost of production, every discovery which facilitates pleasures or augments them, seems to be the grandest effort of the human intellect. It is chiefly from these motives that a democratic people addicts itself to scientific pursuits, that it understands, and that it respects them. In aristocratic ages, science is more particularly called upon to furnish gratification to the mind; in democracies, to the body. You may be sure that the more a nation is democratic, enlightened, and free, the greater will be the number of these interested promoters of scientific genius, and the more will discoveries immediately applicable to productive industry confer gain, fame, and even power on their authors.”
And that is why Tocqueville is still in print — his analysis still resonates even if we may quibble with the details. This is as succinct a diagnosis as one could hope for of the distinct blend of technology and economics that we might label America’s techno-start-up culture.
Tocqueville, incidentally, was not being wholly critical in his observations. He notes, for instance, that, “These very Americans, who have not discovered one of the general laws of mechanics, have introduced into navigation an engine which changes the aspect of the world.”
What Tocqueville could not quite anticipate is the degree to which technology would eventually drive theoretical science. In this regard, his analysis falls short. But his insight into the American temperament and its stance toward technology has proved remarkably durable.
One final note: given recent posts, it is worth observing that Tocqueville’s analysis is also an argument against a hard technological determinism. The history of technology in America takes its unique shape, at least in part, from America’s political economy. But, mindful of the reciprocal nature of technology’s relationship to society, it would be only fair to note that technology has since rather significantly reshaped America’s political economy.
Alexis de Tocqueville by Théodore Chassériau, 1850