Technological Enchantments and the End of Modernity

Dissatisfied with existing theories of secularization, Charles Taylor proposed his own account in his much-discussed 2007 book, A Secular Age. Taylor argued that traditional secularization stories were at best inadequate. They were inadequate because they were what Taylor called “subtraction stories.” According to such stories, secularization is what you get when belief in God goes away, or when the Church loses its cultural power, or when religious language is excised from the public square, etc. Taylor did not believe that secular society is simply what you have left when you slough off religious belief and institutions. Rather, he argued for the rise of something novel, exclusive humanism, which could fill the role that religious experience had once played in providing “fullness” to people’s lives. Additionally, he argued that secularism as he understood it was not something that characterized only the unbelievers in a society. It was also the context for and conditions of belief, and thus no one escaped its consequences.

I find Taylor’s work compelling and instructive, but I bring it up here only to make use of a small part of his multi-faceted and nuanced argument: his understanding of disenchantment.

The enchanted world was one of the features of pre-modern society that had to be overcome in order for a secular world, in Taylor’s sense, to emerge. The enchanted world as he describes it yields a particular experience of the self, what Taylor calls the “porous self,” which later gives way to the modern “buffered self.”

Before moving on to explain what Taylor means by the porous self, I think it is useful to emphasize that Taylor is not after a theory of the self that pre-modern people may or may not have held. Rather, he is after something more subjective, the background of lived experience or what was naively taken for granted. Taylor describes what he is trying to get at as “the construal we just live in, without ever being aware of it as a construal, or–for the most of us–without ever even formulating it.”

This is a useful way of approaching these matters because rarely do we carry around with us a fully developed theory we could articulate to explain our beliefs and actions. Much of what we say and do arises from a tacit understanding of the world and our place in it, an understanding we might be hard pressed to put into words.

This is helpful because when I talk about technological re-enchantment, I don’t mean to suggest that anyone today would claim that there are spirits in the iPhone as a medieval peasant might have believed there were dryads in the forest. Nonetheless, we may experience our technology in a way that is functionally similar or analogous to the premodern experience of enchantment. And we may do so naively, that is, without reflection and in a taken-for-granted manner.

Taylor’s discussion of enchantment unfolds as a theory of the self, and his understanding of enchantment begins with the question of meaning. In our modern disenchanted world, meaning arises only from mind, and the human mind is the only kind of mind there is. Nothing external to the human mind bears any meaning in itself. Moreover, there are no non-human agents in the world, either of matter or spirit.

By contrast, things (and spirits) in the enchanted world have the “power of exogenously inducing or imposing meaning,” a meaning that is independent of the perceiver, one that someone may be forced to reckon with whether they would like to or not. Additionally, in the enchanted world objects can have a causal power. The “charged” objects, Taylor explains, “have what we usually call ‘magic’ powers.” Crucially, this power may be either benevolent or malevolent. The objects in question may bring blessing or trouble, cure or disease, rescue or danger.

“Thus in the enchanted world,” Taylor concludes, “charged things can impose meanings, and bring about physical outcomes proportionate to their meanings.” He calls these “influence” and “causal power,” respectively.

The corresponding experience of the self that arises from this state of affairs is key for my purposes. Boundaries in the enchanted world are decidedly fuzzy. Taylor writes that the enchanted world “shows a perplexing absence of certain boundaries which seem to us essential.” In particular, “the boundary between mind and world is porous.” The porous self that corresponds to an enchanted world is “vulnerable, to spirits, demons, cosmic forces. And along with this go certain fears which can grip it in certain circumstances.”

The buffered self characteristic of the disenchanted world is, by contrast, “invulnerable” and “master of the meanings of things for it.” It is also immune to the fears that may grip the porous self. It is sealed off from the world; its boundaries are not at all fuzzy; meaning resides neatly within its own mind; it occupies a world of inert matter. It is autonomous and self-possessed. It is, in other words, a thoroughly modern individual.

It would be fair to ask at this point what any of this has to do with technology. My working hypothesis is something along these lines: contemporary technologies have taken on attributes that render their presence in our lived understanding of the world analogous to that of the “charged” objects of the enchanted world Taylor describes.

Two clarifications. First, I don’t mean to suggest that contemporary technology is in any literal sense magical. I am no more committed to that conclusion than a contemporary historian is committed to attributing real power to medieval relics when she describes them as enchanted.

Secondly, I don’t have in mind every kind of contemporary technology. Chiefly, I have in view technologies and objects that appear to be animated (as I’ve described elsewhere), and also processes, real or rhetorical, such as AI, automation, algorithms, and Big Data, which constitute something like an immaterial field of often inscrutable forces within which we conduct our lives.

In this technologically enchanted world we inhabit, then, we encounter objects and forces that, to borrow Taylor’s terminology, both influence us and exert causal power over our affairs. Some of these objects and forces appear also to have an agency independent of any human actor. I want to reiterate that I am not talking about what some, including myself, would want to argue is actually the case: that technology is never wholly independent of human agency. Rather, like Taylor, I’m after what our unreflective experience of the world feels like.

Our technologically enchanted objects confront us with meaning that imposes itself on us and with which we must reckon. We turn to our technologies for help and invest our hope in their power. We also fear our technologies and see them as the cause of our troubles. The technological forces we encounter are sometimes benevolent but just as often malevolent forces undermining our efforts and derailing our projects.

It is not only that technological objects have the potential to empower us and sometimes even fill us with wonder. It is also that we experience these objects and forces as important determiners of our weal and woe and that they act upon us independently of our control and without our understanding. We are, in other words, vulnerable, and our autonomy is compromised by the lines of technologically distributed agency that intersect our will and desires.

This means, then, that the experience of the self that emerges out of this technologically enchanted milieu more resembles the porous self of the previously enchanted world than the buffered self that corresponded to disenchanted modernity. This is the key point at the end of this line of thought: a technologically enchanted world is inhospitable to the characteristically modern self. Postmodernity, then, at least the postmodern experience of the self, may be understood as an effect of our technological milieu.

“We are as gods,” Stewart Brand famously declared, “and might as well get good at it.” I suspect, though, that while the technologically enchanted world may tempt us with that possibility, most will experience it in a decidedly more creaturely and thus precarious mode. And not unlike the previous age of enchantment, our age will yield its own forms of serfdom, its own clerical class to whose esoteric knowledge we turn to navigate the promise and perils of enchantment, and its own eschatological hope.



A Man Walks Into A Bank

I walked into my bank a few days ago and found that the lobby had a different look. The space had been rearranged to highlight a new addition: an automated teller. While I was being helped, I overheard an exchange between a customer in line behind me and a bank worker whose new role appeared to be determining whether customers could be served by the automated teller and directing them to it.

She was upbeat about the automated teller and how it would speed things up for customers. The young man talking with her posed a question that occurred to me as I listened but that I’m not sure I would have had the temerity to raise: “Aren’t you afraid that pretty soon they’re not going to need you guys anymore?”

The bank employee was entirely unperturbed, or at least she pretended to be. “No, I’m not worried about that,” she said. “I know they’re going to keep us around.”

I hope they do, but I don’t share her optimism. I was reminded of a passage from Neil Postman’s Technopoly: The Surrender of Culture to Technology. Writing in the early ’90s about the impact of television on education, Postman commented on teachers who enthusiastically embraced the transformations wrought by television. Believing the modern school system, and thus the teacher’s career, to be the product of print culture, Postman wrote,

[…] surely, there is something perverse about schoolteachers’ being enthusiastic about what is happening. Such enthusiasm always calls to my mind an image of some turn-of-the-century blacksmith who not only sings the praises of the automobile but also believes that his business will be enhanced by it. We know now that his business was not enhanced by it; it was rendered obsolete by it, as perhaps the clearheaded blacksmiths knew. What could they have done? Weep, if nothing else.

We might find it in us to weep, too, or at least acknowledge the losses, even when the gains are real and important, which they are not always. Perhaps we might also refuse a degree of personal convenience from time to time, or every time if we find it in us to do so, in order to embody principles that might at least, if nothing else, demonstrate a degree of solidarity with those who will not be the winners in the emerging digital economy.

Postman believed that computer technology created a similar situation to that of the blacksmiths, “for here too we have winners and losers.”

“There can be no disputing that the computer has increased the power of large-scale organizations like the armed forces, or airline companies or banks or tax-collecting agencies. And it is equally clear that the computer is now indispensable to high-level researchers in physics and other natural sciences. But to what extent has computer technology been an advantage to the masses of people? To steelworkers, vegetable-store owners, teachers, garage mechanics, musicians, bricklayers, dentists, and most of the rest into whose lives the computer now intrudes? Their private matters have been made more accessible to powerful institutions. They are more easily tracked and controlled; are subjected to more examinations; are increasingly mystified by the decisions made about them; are often reduced to mere numerical objects. They are inundated by junk mail. They are easy targets for advertising agencies …. In a word, almost nothing that they need happens to the losers. Which is why they are the losers.

It is to be expected that the winners will encourage the losers to be enthusiastic about computer technology. That is the way of winners … They also tell them that their lives will be conducted more efficiently. But discreetly they neglect to say from whose point of view the efficiency is warranted or what might be its costs.”

The religion of technology is a secular faith, and as such it should, at least, have the decency of striking a tragic note.

Algorithms Who Art in Apps, Hallowed Be Thy Code

If you want to understand the status of algorithms in our collective imagination, Ian Bogost proposes the following exercise in his recent essay in the Atlantic: “The next time you see someone talking about algorithms, replace the term with ‘God’ and ask yourself if the sense changes any?”

If Bogost is right, then more often than not you will find the sense of the statement entirely unchanged. This is because, in his view, “Our supposedly algorithmic culture is not a material phenomenon so much as a devotional one, a supplication made to the computers we have allowed to replace gods in our minds, even as we simultaneously claim that science has made us impervious to religion.” Bogost goes on to say that this development is part of a “larger trend” whereby “Enlightenment ideas like reason and science are beginning to flip into their opposites.” Science and technology, he fears, “have turned into a new type of theology.”

It’s not the algorithms themselves that Bogost is targeting; it is how we think and talk about them that worries him. In fact, Bogost’s chief concern is that how we talk about algorithms is impeding our ability to think clearly about them and their place in society. This is where the god-talk comes in. Bogost deploys a variety of religious categories to characterize the present fascination with algorithms.

Bogost believes “algorithms hold a special station in the new technological temple because computers have become our favorite idols.” Later on he writes, “the algorithmic metaphor gives us a distorted, theological view of computational action.” Additionally, “Data has become just as theologized as algorithms, especially ‘big data,’ whose name is meant to elevate information to the level of celestial infinity.” “We don’t want an algorithmic culture,” he concludes, “especially if that phrase just euphemizes a corporate theocracy.” The analogy to religious belief is a compelling rhetorical move. It vividly illuminates Bogost’s key claim: the idea of an “algorithm” now functions as a metaphor that conceals more than it reveals.

He prepares the ground for this claim by reminding us of earlier technological metaphors that ultimately obscured important realities. The metaphor of the mind as computer, for example, “reaches the rank of religious fervor when we choose to believe, as some do, that we can simulate cognition through computation and achieve the singularity.” Similarly, the metaphor of the machine, which is really to say the abstract idea of a machine, yields a profound misunderstanding of mechanical automation in the realm of manufacturing. Bogost reminds us that bringing consumer goods to market still “requires intricate, repetitive human effort.” Manufacturing, as it turns out, “isn’t as machinic nor as automated as we think it is.”

Likewise, the idea of an algorithm, as it is bandied about in public discourse, is a metaphorical abstraction that obscures how various digital and analog components, including human action, come together to produce the effects we carelessly attribute to algorithms. Near the end of the essay, Bogost sums it up this way:

“the algorithm has taken on a particularly mythical role in our technology-obsessed era, one that has allowed it to wear the garb of divinity. Concepts like ‘algorithm’ have become sloppy shorthands, slang terms for the act of mistaking multipart complex systems for simple, singular ones. Of treating computation theologically rather than scientifically or culturally.”

But why does any of this matter? It matters, Bogost insists, because this way of thinking blinds us in two important ways. First, our sloppy shorthand “allows us to chalk up any kind of computational social change as pre-determined and inevitable,” allowing the perpetual deflection of responsibility for the consequences of technological change. The apotheosis of the algorithm encourages what I’ve elsewhere labeled a Borg Complex, an attitude toward technological change aptly summed up by the phrase, “Resistance is futile.” It’s a way of thinking about technology that forecloses the possibility of thinking about and taking responsibility for our choices regarding the development, adoption, and implementation of new technologies. Secondly, Bogost rightly fears that this “theological” way of thinking about algorithms may cause us to forget that computational systems can offer only one, necessarily limited perspective on the world. “The first error,” Bogost writes, “turns computers into gods, the second treats their outputs as scripture.”

______________________

Bogost is right to challenge the quasi-religious reverence sometimes exhibited toward technology. It is, as he fears, an impediment to clear thinking. Indeed, he is not the only one calling for the secularization of our technological endeavors. Jaron Lanier has spoken at length about the introduction of religious thinking into the field of AI. In a recent interview, Lanier expressed his concerns this way:

“There is a social and psychological phenomenon that has been going on for some decades now:  A core of technically proficient, digitally-minded people reject traditional religions and superstitions. They set out to come up with a better, more scientific framework. But then they re-create versions of those old religious superstitions! In the technical world these superstitions are just as confusing and just as damaging as before, and in similar ways.”

While Lanier’s concerns are similar to Bogost’s, it may be worth noting that Lanier’s use of religious categories is rather more concrete. As far as I can tell, Bogost deploys a religious frame as a rhetorical device, and rather effectively so. Lanier’s criticisms, however, have been aroused by religiously intoned expressions of a desire for transcendence voiced by denizens of the tech world themselves.

But such expressions are hardly new, nor are they relegated to the realm of AI. In The Religion of Technology: The Divinity of Man and the Spirit of Invention, David Noble rightly insisted that “modern technology and modern faith are neither complements nor opposites, nor do they represent succeeding stages of human development. They are merged, and always have been, the technological enterprise being, at the same time, an essentially religious endeavor.”

So that no one would misunderstand his meaning, he added,

“This is not meant in a merely metaphorical sense, to suggest that technology is similar to religion in that it evokes religious emotions of omnipotence, devotion, and awe, or that it has become a new (secular) religion in and of itself, with its own clerical caste, arcane rituals, and articles of faith. Rather it is meant literally and historically, to indicate that modern technology and religion have evolved together and that, as a result, the technological enterprise has been and remains suffused with religious belief.”

Along with chapters on the space program, atomic weapons, and biotechnology, Noble devoted a chapter to the history of AI, titled “The Immortal Mind.” Noble found that AI research had often been inspired by a curious fixation on the achievement of god-like, disembodied intelligence as a step toward personal immortality. Many of the sentiments and aspirations that Noble identifies in figures as diverse as George Boole, Claude Shannon, Alan Turing, Edward Fredkin, Marvin Minsky, Daniel Crevier, Danny Hillis, and Hans Moravec–all of them influential theorists and practitioners in the development of AI–find their consummation in the Singularity movement. The movement envisions a time, 2045 is frequently suggested, when the distinction between machines and humans will blur and humanity as we know it will be eclipsed. Before Ray Kurzweil, the chief prophet of the Singularity, wrote about “spiritual machines,” Noble had astutely anticipated how the trajectories of AI, Internet, Virtual Reality, and Artificial Life research were all converging on the age-old quest for immortal life. Noble, who died in 2010, must have read the work of Kurzweil and company as a remarkable validation of his thesis in The Religion of Technology.

Interestingly, the sentiments that Noble documented alternated between the heady thrill of creating non-human Minds and non-human Life, on the one hand, and, on the other, the equally heady thrill of pursuing the possibility of radical life-extension and even immortality. Frankenstein meets Faust, we might say. Humanity plays god in order to bestow god’s gifts on itself. Noble cites one Artificial Life researcher who explains, “I feel like God; in fact, I am God to the universes I create,” and another who declares, “Technology will soon enable human beings to change into something else altogether [and thereby] escape the human condition.” Ultimately, these two aspirations come together into a grand techno-eschatological vision, expressed here by Hans Moravec:

“Our speculation ends in a supercivilization, the synthesis of all solar system life, constantly improving and extending itself, spreading outward from the sun, converting non-life into mind …. This process might convert the entire universe into an extended thinking entity … the thinking universe … an eternity of pure cerebration.”

Little wonder that Pamela McCorduck, who has been chronicling the progress of AI since the early 1980s, can say, “The enterprise is a god-like one. The invention–the finding within–of gods represents our reach for the transcendent.” And, lest we forget where we began, a more earth-bound, but no less eschatological hope was expressed by Edward Fredkin in his MIT and Stanford courses on “saving the world.” He hoped for a “global algorithm” that “would lead to peace and harmony.” I would suggest that similar aspirations are expressed by those who believe that Big Data will yield a God’s-eye view of human society, providing wisdom and guidance that would be otherwise inaccessible to ordinary human forms of knowing and thinking.

Perhaps this should not be altogether surprising. As the old saying has it, the Grand Canyon wasn’t formed by someone dragging a stick. This is just a way of saying that causes must be commensurate to the effects they produce. Grand technological projects such as space flight, the harnessing of atomic energy, and the pursuit of artificial intelligence are massive undertakings requiring stupendous investments of time, labor, and resources. What kind of motives are sufficient to generate those sorts of expenditures? You’ll need something more than whim, to put it mildly. You may need something akin to religious devotion. Would we have attempted to put a man on the moon apart from the ideological frame provided by the Cold War, which cast space exploration as a field of civilizational battle for survival? Consider, as a more recent example, what drives Elon Musk’s pursuit of interplanetary space travel.

______________________

Without diminishing the criticisms offered by either Bogost or Lanier, Noble’s historical investigation into the roots of divinized or theologized technology reminds us that the roots of the disorder run much deeper than we might initially imagine. Noble’s own genealogy traces the origin of the religion of technology to the turn of the first millennium. It emerges out of a volatile mix of millenarian dreams, apocalyptic fervor, mechanical innovation, and monastic piety. Its evolution proceeds apace through the Renaissance, finding one of its most ardent prophets in the Elizabethan statesman Francis Bacon. Even through the Enlightenment, the religion of technology flourished. In fact, the Enlightenment may have been a decisive moment in the history of the religion of technology.

In the essay with which we began, Ian Bogost framed the emergence of techno-religious thinking as a departure from the ideals of reason and science associated with the Enlightenment. This is not altogether incidental to Bogost’s argument. When he talks about the “theological” thinking that plagues our understanding of algorithms, Bogost is not working with a neutral, value-free, all-purpose definition of what constitutes the religious or the theological; there’s almost certainly no such definition available. It wouldn’t be too far from the mark, I think, to say that Bogost is working with what we might classify as an Enlightenment understanding of Religion, one that characterizes it as Reason’s Other, i.e. as a-rational if not altogether irrational, superstitious, authoritarian, and pernicious. For his part, Lanier appears to be working with similar assumptions.

Noble’s work complicates this picture, to say the least. The Enlightenment did not, as it turns out, vanquish Religion, driving it far from the pure realms of Science and Technology. In fact, to the degree that the radical Enlightenment’s assault on religious faith was successful, it empowered the religion of technology. To put this another way, the Enlightenment–and, yes, we are painting with broad strokes here–did not do away with the notions of Providence, Heaven, and Grace. Rather, the Enlightenment re-named these Progress, Utopia, and Technology, respectively. To borrow a phrase, the Enlightenment immanentized the eschaton. If heaven had been understood as a transcendent goal achieved with the aid of divine grace within the context of the providentially ordered unfolding of human history, it became a Utopian vision, a heaven on earth, achieved by the ministrations of Science and Technology within the context of Progress, an inexorable force driving history toward its Utopian consummation.

As historian Leo Marx has put it, the West’s “dominant belief system turned on the idea of technical innovation as a primary agent of progress.” Indeed, the further Western culture proceeded down the path of secularization as it is traditionally understood, the greater the emphasis on technology as the principal agent of change. Marx observed that by the late nineteenth century, “the simple republican formula for generating progress by directing improved technical means to societal ends was imperceptibly transformed into a quite different technocratic commitment to improving ‘technology’ as the basis and the measure of — as all but constituting — the progress of society.”

When the prophets of the Singularity preach the gospel of transhumanism, they are not abandoning the Enlightenment heritage; they are simply embracing its fullest expression. As Bruno Latour has argued, modernity has never perfectly sustained the purity of the distinctions that were the self-declared hallmarks of its own superiority. Modernity characterized itself as a movement of secularization and differentiation, what Latour, with not a little irony, labels processes of purification. Science, politics, law, religion, ethics–these are all sharply distinguished and segregated from one another in the modern world, distinguishing it from the primitive pre-modern world. But it turns out that these spheres of human experience stubbornly resist the neat distinctions modernity sought to impose. Hybridization unfolds alongside purification, and Noble’s work has demonstrated how technology, sometimes reckoned the most coldly rational of human projects, is deeply contaminated by religion, often regarded by the same people as the most irrational of human projects.

But not just any religion. Earlier I suggested that when Bogost characterizes our thinking about algorithms as “theological,” he is almost certainly assuming a particular kind of theology. This is why it is important to classify the religion of technology more precisely as a Christian heresy. It is in Western Christianity that Noble found the roots of the religion of technology, and it is in the context of a post-Christian world that it has presently flourished.

It is Christian insofar as its aspirations are like those nurtured by the Christian faith, such as the conscious persistence of a soul after the death of the body. Noble cites Daniel Crevier, who, referencing the “Judeo-Christian tradition,” suggested that “religious beliefs, and particularly the belief in survival after death, are not incompatible with the idea that the mind emerges from physical phenomena.” This is noted on the way to explaining that a machine-based material support could be found for the mind, which leads Noble to quip, “Christ was resurrected in a new body; why not a machine?” Reporting on his study of the famed Santa Fe Institute in Santa Fe, New Mexico, anthropologist Stefan Helmreich observed, “Judeo-Christian stories of the creation and maintenance of the world haunted my informants’ discussions of why computers might be ‘worlds’ or ‘universes,’ …. a tradition that includes stories from the Old and New Testaments (stories of creation and salvation).”

It is a heresy insofar as it departs from traditional Christian teaching regarding the givenness of human nature, the moral dimensions of humanity’s brokenness, the gracious agency of God in the salvation of humanity, and the resurrection of the body, to name a few. Having said as much, it would seem that one could perhaps conceive of the religion of technology as an imaginative account of how God might fulfill purposes that were initially revealed in incidental, pre-scientific garb. In other words, we might frame the religion of technology not so much as a Christian heresy, but rather as (post-)Christian fan-fiction, an elaborate imagining of how the hopes articulated by the Christian faith will materialize as a consequence of human ingenuity in the absence of divine action.

______________________

Near the end of The Religion of Technology, David Noble forcefully articulated the dangers posed by a blind faith in technology. “Lost in their essentially religious reveries,” Noble warned, “the technologists themselves have been blind to, or at least have displayed blithe disregard for, the harmful ends toward which their work has been directed.” Citing another historian of technology, Noble added, “The religion of technology, in the end, ‘rests on extravagant hopes which are only meaningful in the context of transcendent belief in a religious God, hopes for a total salvation which technology cannot fulfill …. By striving for the impossible, [we] run the risk of destroying the good life that is possible.’ Put simply, the technological pursuit of salvation has become a threat to our survival.” I suspect that neither Bogost nor Lanier would disagree with Noble on this score.

There is another significant point at which the religion of technology departs from its antecedent: “The millenarian promise of restoring mankind to its original Godlike perfection–the underlying premise of the religion of technology–was never meant to be universal.” Instead, the salvation it promises is limited finally to the very few who will be able to afford it; it is for neither the poor nor the weak. Nor, it would seem, is it for those who have found a measure of joy or peace or beauty within the bounds of the human condition as we now experience it, frail as it may be.

Lastly, it is worth noting that the religion of technology appears to have no doctrine of final judgment. This is not altogether surprising given that, as Bogost warned, the divinizing of technology carries the curious effect of absolving us of responsibility for the tools that we fashion and the uses to which they are put.

I have no neat series of solutions to tie all of this up; rather, I will give the last word to Wendell Berry:

“To recover from our disease of limitlessness, we will have to give up the idea that we have a right to be godlike animals, that we are potentially omniscient and omnipotent, ready to discover ‘the secret of the universe.’ We will have to start over, with a different and much older premise: the naturalness and, for creatures of limited intelligence, the necessity, of limits. We must learn again to ask how we can make the most of what we are, what we have, what we have been given.”



Silencing the Heretics: How the Faithful Respond to Criticism of Technology

I started to write a post about a few unhinged reactions to an essay published by Nicholas Carr in this weekend’s WSJ, “Automation Makes Us Dumb.” Then I realized that I already wrote that post back in 2010. I’m republishing “A God that Limps” below, with slight revisions, followed by a discussion of the reactions to Carr.

Our technologies are like our children: we react with reflexive and sometimes intense defensiveness if either is criticized. Several years ago, while teaching at a small private high school, I forwarded an article to my colleagues that raised some questions about the efficacy of computers in education. This was a mistake. The article appeared in a respectable journal, was judicious in its tone, and cautious in its conclusions. I didn’t think then, nor do I now, that it was at all controversial. In fact, I imagined that given the setting it would be of at least passing interest. However, within a handful of minutes (minutes!)—hardly enough time to skim, much less read, the article—I was receiving rather pointed, even angry replies.

I was mystified, and not a little amused, by the responses. Mostly though, I began to think about why this measured and cautious article evoked such a passionate response. Around the same time I stumbled upon Wendell Berry’s essay titled, somewhat provocatively, “Why I am Not Going to Buy a Computer.” More arresting than the essay itself, however, were the letters that came in to Harper’s. These letters, which now typically appear alongside the essay whenever it is anthologized, were caustic and condescending. In response, Berry wrote,

The foregoing letters surprised me with the intensity of the feelings they expressed. According to the writers’ testimony, there is nothing wrong with their computers; they are utterly satisfied with them and all that they stand for. My correspondents are certain that I am wrong and that I am, moreover, on the losing side, a side already relegated to the dustbin of history. And yet they grow huffy and condescending over my tiny dissent. What are they so anxious about?

Precisely my question. Whence the hostility, defensiveness, agitation, and indignant, self-righteous anxiety?

I’m typing these words on a laptop, and they will appear on a blog that exists on the Internet.  Clearly I am not, strictly speaking, a Luddite. (Although, in light of Thomas Pynchon’s analysis of the Luddite as Badass, there may be a certain appeal.) Yet, I do believe an uncritical embrace of technology may prove fateful, if not Faustian.

The stakes are high. We can hardly exaggerate the revolutionary character of certain technologies throughout history:  the wheel, writing, the gun, the printing press, the steam engine, the automobile, the radio, the television, the Internet. And that is a very partial list. Katherine Hayles has gone so far as to suggest that, as a species, we have “codeveloped with technologies; indeed, it is no exaggeration,” she writes in Electronic Literature, “to say modern humans literally would not have come into existence without technology.”

We are, perhaps because of the pace of technological innovation, quite conscious of the place and power of technology in our society and in our own lives. We joke about our technological addictions, but it is sometimes a rather nervous punchline. It makes sense to ask questions. Technology, it has been said, is a god that limps. It dazzles and performs wonders, but it can frustrate and wreak havoc. Good sense seems to suggest that we avoid, as Thoreau put it, becoming tools of our tools. This doesn’t entail burning the machine; it may only require a little moderation. At a minimum, it means creating, as far as we are able, a critical distance from our toys and tools, and that requires searching criticism.

And we are back where we began. We appear to be allergic to just that kind of searching criticism. So here is my question again:  Why do we react so defensively when we hear someone criticize our technologies?

And so ended my earlier post. Now consider a handful of responses to Carr’s article, “Automation Makes Us Dumb.” Better yet, read the article, if you haven’t already, and then come back for the responses.

Let’s start with a couple of tweets by Joshua Gans, a professor of management at the University of Toronto.

Then there was this from entrepreneur Marc Andreessen:

Even better are some of the replies attached to Andreessen’s tweet. I’ll transcribe a few of those here for your amusement.

“Why does he want to be stuck doing repetitive mind-numbing tasks?”

“‘These automatic jobs are horrible!’ ‘Stop killing these horrible jobs with automation!'” [Sarcasm implied.]

“by his reasoning the steam engine makes us weaklings, yet we’ve seen the opposite. so maybe the best intel is ahead”

“Let’s forget him, he’s done so much damage to our industry, he is just interested in profiting from his provocations”

“Nick clearly hasn’t understood the true essence of being ‘human’. Tech is an ‘enabler’ and aids to assist in that process.”

“This op-ed is just a Luddite screed dressed in drag. It follows the dystopian view of ‘Wall-E’.”

There you have it. I’ll let you tally up the logical fallacies.

Honestly, I’m stunned by the degree of apparently willful ignorance exhibited by these comments. The best I can say for them is that they are based on a glance at the title of Carr’s article and nothing more. It would be much more worrisome if these individuals had actually read the article and still managed to make these comments that betray no awareness of what Carr actually wrote.

More than once, Carr makes clear that he is not opposed to automation in principle. The last several paragraphs of the article describe how we might go forward with automation in a way that avoids some serious pitfalls. In other words, Carr is saying, “Automate, but do it wisely.” What a Luddite!

When I wrote in 2010, I had not yet formulated the idea of a Borg Complex, but this inability to rationally or calmly abide any criticism of technology is surely pure, undistilled Borg Complex, complete with Luddite slurs!

I’ll continue to insist that we are in desperate need of serious thinking about the powers that we are gaining through our technologies. It seems, however, that there is a class of people who are hell-bent on shutting down any and all criticism of technology. If the criticism is misguided or unsubstantiated, then it should be refuted. Dismissing criticism while giving absolutely no evidence of having understood it, on the other hand, helps no one at all.

I come back to David Noble’s description of the religion of technology often, but only because of how useful it is as a way of understanding techno-scientific culture. When technology is a religion, when we embrace it with blind faith, when we anchor our hope in it, when we love it as ourselves–then any criticism of technology will be understood as either heresy or sacrilege. And that seems to be a pretty good way of characterizing the responses to tech criticism I’ve been discussing: the impassioned reactions of the faithful to sacrilegious heresy.