On the Merits of Inconclusive Debates

On social media, criticism too often takes the form of aggressively ironic derision performed for those who are already prone to agree. It can also be challenging, although not impossible, to find sustained discussions that are both civil and well-reasoned. Relatedly, one of the complaints I frequently hear about online debates, one that I’ve made myself, is that no one ever changes their mind as a result of their online exchanges, no matter how prolonged or passionate those exchanges might be.

Of course, there are many reasons for this. For instance, we are, it seems to me, much less likely to surrender our positions, particularly our cherished convictions, in a public forum. Most of us are not so humble. Moreover, shifts in perspective or intellectual reversals tend to happen gradually. So much so that one may not even be able to pinpoint the moment of conversion. In any case, they rarely happen in the heat of intellectual battle. And that last metaphor is also part of the problem. There’s a tendency to characterize our intellectual life as a quest to vanquish all foes rather than as a mutual, dialectical pursuit of knowledge and wisdom. A change of mind, then, is experienced as a defeat rather than a step toward better understanding.

All of this is really just a way of introducing the following passage from Oliver O’Donovan’s Self, World, and Time. O’Donovan reminds us, reminds me, that there is value even in an inconclusive debate or conversation, because, again, the point is not to be proven right.

“Let us suppose that I disapprove of the death penalty, and take up the cudgels against someone who defends it. As our discussion proceeds, certain things will become clear. One is that there are various reasons for disapproving of the death penalty, some of which may plausibly claim a perennial moral truth, while others are more circumstantial. If my opponent forces me to think hard, I shall understand better what social and historical conditions have made the death penalty appear reasonable to past generations, and I shall have to ask if those conditions could ever recur. I shall come to see that my view of the matter is part and parcel of a wider philosophy of penal justice and governmental responsibility, and I shall be forced to elucidate that philosophy more fully and to test its capacity to shed illumination on other questions, too. None of this could I have gained from talking to those who agreed with me. What it amounts to is that if at the end of the argument I still say, ‘I disapprove of the death penalty!’ I know much better than before what I mean by it.”

Thanks to Alastair Roberts for drawing my attention to it. 

Beyond Analog v. Digital

“If it’s not Scottish, it’s crap!” This was the slogan of a store called All Things Scottish featured in a recurring Mike Myers skit on SNL in the mid-90s. Sometimes discussion of digital technology takes a similar turn. “If it’s not digital, it’s crap!” proponents seem to say. “If it’s not analog, it’s crap!” critics might retort.

In fact, concluding that all things digital are superior to all things analog turns out to be just as misguided as the opposite judgment. After we’ve concluded that digital practices are no less “real” than analog practices and that a preference for analog practices is not necessarily an instance of naive or pretentious fetishism, the harder, more interesting work begins. That work involves figuring out what tools best fit the job and the person. As an interesting instance of this process, I submit an exchange that unfolded yesterday on Twitter following this tweet by Jeremy Antley:

As someone preparing to write a dissertation in the coming months, I found this really interesting. I’ll let you follow the exchange as it took off from that tweet, but the gist of it is that the idea of writing out a long draft by hand resonated with a few (myself included), piqued the curiosity of others, and seemed entirely unworkable to some.

The point illustrated by this exchange is, to begin with, that writing by hand and writing with a word processing tool are not, at the experiential level, the same thing. There are things you can do with one method that you cannot do with the other, and vice versa. One medium limits you in ways that the other does not, and vice versa. But that is just the first step.

There’s a tendency to stop with that realization and then generalize one’s own preferences to apply to all people under all circumstances. It would be better to continue by asking how the relative strengths and weaknesses of a given medium fit with the requirements of the task at hand and with one’s own proclivities, tendencies, and idiosyncrasies. We are diverse enough, and the things we set out to accomplish are sufficiently varied, that it would be impossible to generalize about the superiority of digital tools or analog tools as a class.

I’ll leave you with one more related item to consider. In his article for Wired, Brandon Keim explores a variety of studies addressing the differences between reading on paper and reading on digital screens. The piece struck me as evenhanded and judicious in its conclusions.

Here’s the closing paragraph, and do note the good counsel to refuse Borg Complex claims of inevitability:

“‘We should be wary of saying, “That’s the way we’re going to read in the future anyways, so why resist?”’ said Mangen. ‘There is something to deep reading and deep thinking that is worth making an effort to preserve.’ Whether we need paper to do that remains to be seen. For now, though, there’s still plenty of life in those dead trees.”

One last point: as Keim’s article suggests, the differences between digital and analog reading experiences are often linked to the sometimes overlooked fact that we are bodies, and that our bodies are a consequential aspect of how we experience this world. As we evaluate our digital and analog tools, we do well not to lose sight of this fact.

 

The Treadmill Always Wins

I recently suggested that it’s good to occasionally ask ourselves, “Why do we read?” That question was prompted in part by the unhealthy tendencies that I find myself struggling to resist in the context of online reading. These tendencies are best summed up by the phrase “reading to have read,” a phrase I borrowed from Alan Jacobs’ excellent The Pleasures of Reading in an Age of Distraction. Incidentally, telling you to read this book is only the first piece of advice that I’ll offer in this post, but it might be the best.

As it turns out, Jacobs revisited his comments on this score in a post discussing a relatively new speed reading app called Spritz. The app sells itself as “reading reimagined,” rather brutally so if you ask me. The app flashes words at rates up to 600 words per minute featuring its “patent-pending Redicle” technology. In any case, you can follow the links to learn more about it if you’re so inclined. Near the close of his post, after citing some trenchant observations by Julie Sedivy, Jacobs observed,

“It’s all too easy to imagine people who are taken with Spritz making decisions about what to read based on what’s amenable to ‘spritzing.’ But that’s inevitable as long as [they are thinking] of reading as something to have done, something to get through.”

Sedivy and Jacobs both are pointing out one of the more insidious temptations technology poses, the temptation to fit ourselves to the tool. In this case, the temptation is to prefer the kind of reading that lends itself to the questionable efficiencies afforded by Spritz. As Sedivy puts it, these are “texts with simpler sentences, sentences in which the complexity of information is evenly distributed, sentences that avoid unexpected twists or turns, and sentences which explicitly lay out their intended meanings, without requiring readers to mentally fill in the blanks through inference.”

In his post, Jacobs also linked to Ian Bogost’s insightful take on Spritz, which was titled, interestingly enough, “Reading to Have Read.” Bogost questions the supposedly scientific claims made for Spritz by its creators. More importantly, though, he takes Spritz to be symptomatic of a larger problem:

“In today’s attention economy, reading materials (we call it ‘content’ now) have ceased to be created and disseminated for understanding. Instead, they exist first (and primarily) for mere encounter. This condition doesn’t necessarily signal the degradation of reading; it also arises from the surplus of content we are invited and even expected to read. But it’s a Sisyphean task. We can no longer reasonably hope to read all our emails, let alone our friends’ Facebook updates or tweets or blog posts, let alone the hundreds of daily articles and listicles and quizzes and the like. Longreads may offer stories that are best enjoyed away from your desk, but what good are such moments when the #longreads queue is so full? Like books bought to be shelved, articles are saved for a later that never comes.”

Exactly. And a little further on, Bogost adds,

“Spritzing is reading to get it over with. It is perhaps no accident that Spritze means injection in German. Like a medical procedure, reading has become an encumbrance that is as necessary as it is undesirable. “Oh God,” we think. “Another office email thread. Another timely tumblr. Another Atlantic article.” We want to read them—really to read them, to incorporate them—but the collective weight of so much content goes straight to the thighs and guts and asses of our souls. It’s too much to bear. Who wouldn’t want it to course right through, to pass unencumbered through eyeballs and neurons just to make way for the deluge behind it?”

Bob Brown’s Reading Machine

That paragraph eloquently articulates, better than I could, the concerns that motivated my earlier post. I have nothing to add to what Sedivy, Jacobs, and Bogost have already said about Spritz except to mention that I’m surprised no one, to my knowledge, has alluded to Bob Brown’s Readies. In his 1930 manifesto, Brown declared, “The written word hasn’t kept up with the age,” and he developed a mechanical reading device to meet that challenge. Brown’s reading machine, which you can read about here, was envisioned as an escape from the page, not unlike Spritz. But as Abigail Thomas puts it, “It is evident that through the materiality of the page acting as the imagined machine, that the reader becomes the machine themselves.” Of course, I wouldn’t know of Brown were it not that one of my grad school profs, Craig Saper, was deeply interested in Brown’s work.

That said, I do have one more thing to add. Spritz illustrates yet another temptation posed by modern technologies. We might call it the challenge of the treadmill. When I was in my early twenties and still in my more athletic phase, I took a stress test on a treadmill. The cardiologist told me to keep pace as long as I could, but, he added, “the treadmill always wins.” Of course, being modestly competitive and not a little prideful, I took that as a challenge. I ran hard on that machine, but, no surprise, the treadmill won.

So much of our response to the quickening pace induced by modern technologies is to quicken our own stride in response or to find other technologies that will help us do things more quickly, more efficiently. But again, the treadmill always wins. Maybe the answer to the challenge of the treadmill is simply to get off the thing.

But that decision doesn’t come easily for us. We have a hard time acknowledging our limitations. In fact, so much of the rhetoric surrounding technology in the western tradition involves precisely the promise of transcending our bodily limitations. Exhibit A, of course, is the transhumanist project.

In response, however, I submit the more humane vision of the agrarian and poet Wendell Berry. In “Faustian Economics,” Berry, speaking of the “fantasy of human limitlessness” that animates so much of our political and economic life, reminds us that we are “coming under pressure to understand ourselves as limited creatures in a limited world.” But this, he adds, should not be cause for despair:

“[O]ur human and earthly limits, properly understood, are not confinements but rather inducements to formal elaboration and elegance, to fullness of relationship and meaning. Perhaps our most serious cultural loss in recent centuries is the knowledge that some things, though limited, are inexhaustible. For example, an ecosystem, even that of a working forest or farm, so long as it remains ecologically intact, is inexhaustible. A small place, as I know from my own experience, can provide opportunities of work and learning, and a fund of beauty, solace, and pleasure — in addition to its difficulties — that cannot be exhausted in a lifetime or in generations.”

I would suggest that Berry’s wisdom is just as applicable to the realm of reading and the intellectual life as it is to our economic life.

With regard to reading, the first step may be coming to the realization, once again perhaps, that we cannot read it all. According to Joseph Epstein, “Gertrude Stein said that the happiest moment of her life was that moment in which she realized that she wouldn’t be able to read all the books in the world.” May Stein be our model, although I admit the happiness on this score is sometimes hard to muster.

I’ll leave you with two questions to consider. The first is from Len Kendall: “Is your day composed of reading 10% of 100 articles or 100% of 10 articles?”

The second is a set of related questions from Adam Gurri:

“Ask yourself: what conversations matter to you? Which are relevant to your life, and which are relevant to your interests? After figuring that out, be stricter about excluding stories that fall outside of those conversations. Be selective about the publications you read regularly, and seek to go deeper rather than broader in the conversations you follow.”

 

Why Do We Read?

I sat down, as I usually do on Friday mornings, to catch up on the week’s online reading. Unlike most weeks, though, I had let my RSS feed get backed up until it had become an unwieldy mess, and, consequently, I was faced with the prospect of combing through hundreds of items or … deleting them all. Following my own advice from a few months back, I marked all as read. “Are you sure you would like to mark all items as read?” I was politely asked. Yes, yes I am.

There was, of course, a moment of hesitation. What might I miss? Was there a really brilliant piece that I might not otherwise see? Was there fodder for a blog post? Something that would fit nicely with my dissertation research? Perspectives that I needed to read in order to remain “informed”?

Maybe, maybe not. The better question that eventually came to mind was simpler and of greater significance: What was I reading for?

That old venerable guide, Adler and van Doren’s How to Read a Book, suggests three possible answers to that question: one may read for information, understanding, or pleasure. Heuristically speaking, that’s not a bad start. But it doesn’t work very well in digital contexts, nor, for that matter, in pre-print contexts either.

The idea of reading for information and understanding, with pleasure thrown in “somewhat apologetically,” as Alan Jacobs puts it in The Pleasures of Reading in an Age of Distraction, has a certain Age of Reason feel to it. Consider, by contrast, how the monk and scholar Hugh of St. Victor framed the aims of reading in the twelfth century. Hugh wrote what might be the earliest guide to reading, the Didascalicon, and quite early in the life of this blog I posted a series of excerpts from Ivan Illich’s wonderful study of Hugh and twelfth-century reading technology.

According to Illich, for Hugh “the reader is one who has made himself into an exile in order to concentrate his entire attention and desire on wisdom, which thus becomes the hoped-for home.” But wisdom for Hugh was not merely knowledge applied to living well. “As with Augustine,” Illich explains,

“wisdom was for Hugh not something but someone. Wisdom in the Augustinian tradition is the second person of the Trinity, Christ . . . . The wisdom Hugh seeks is Christ himself.  Learning and, specifically, reading, are both simply forms of a search for Christ the Remedy, Christ the Example and Form which fallen humanity, which has lost it, hopes to recover. The need of fallen humanity for reunion with wisdom is central to Hugh’s thought.”

Of course, it is not all that surprising to find that, historically, the act of reading has been incorporated into larger cultural frameworks of meaning and purpose. Hugh’s understanding of reading was decidedly theological. Adler and van Doren’s vision for reading we might call democratic, both in the sense that they wanted the benefits of reading to be widely distributed and in the sense that such reading was supposed to cultivate responsible, informed citizens.

It is worth noting that these cultural shifts were not independent of developments in the available technologies of reading. For instance, Illich’s In the Vineyard of the Text is concerned with understanding how twelfth-century developments in the apparatus of the book and the layout of the page were already sundering spiritual reading from scholarly reading, leading to the emergence of the university. And, of course, the printing press and, perhaps even more so, the later availability of cheap paper were preconditions for the age of democratic reading.

Illich, writing as the digital age was dawning, understood that “the thought of an ultimate goal of all readings is not meaningful to us.” Our motives for reading are diverse and varied, and I would hesitate to reach for a generalization that would characterize the motives of our age the way we might speak of the theological reading of the middle ages or the democratic reading of the more recent age of print. But there are a few more modest observations that we might make.

My own online reading experience is too often characterized by what Jacobs has called reading “to have read.” In the context of his discussion, Jacobs was referring to those who dutifully read through a list of “must read” books or a list of “great works” merely out of a sense of duty. In other words, they were not driven to read by the intrinsic pleasure of reading; rather, they read as one does chores around the house, to be done with them. Or worse, they read in order to be the sort of people who can say that they have read certain works and cash in whatever cultural currency that may earn them.

I find that the phrase “reading to have read” covers a lot of the kind of reading I end up doing online. There is some vague sense that there are things I need to keep up with, things others are talking about that I should look into, things that by the very fact of their piling up in my RSS feed are asking to be read, and it is a relief to go through them, just as it is to get the Inbox down to zero. So on and so forth; this is nothing new.

We might also note a variation on the theme, “reading to be seen to have read.” This kind of reading is not so much a function of digital texts per se as of networked reading environments. This sort of reading gets sucked into the construction of what Rob Horning has recently called the “post-authentic viral self.” You would do well to read the whole of Horning’s essay, but the gist of it for the purposes of this post is that the viral self reads in search of what will fuel the sharing metrics by which it registers its state of being. I am retweeted, therefore I am.

“One adopts a ‘viral self,’ anchored in continual demonstrations of its reach, based on ingenious appropriation and aggregation of existing content, not,” Horning explains, “in its fidelity to a static inner truth or set of tastes. It is defined by its ability to circulate, not by the content of what it circulates.”

In a much shorter, less sophisticated, but similarly insightful post, Len Kendall sums up the motives driving the reading the viral self undertakes: “Today we’re driven less by the words on a page (or screen) inspiring thoughts in our minds, and more by how a title or topic trigger other people to validate, praise, and fight us.”

The reading of the viral self is reading in search of what can be shared, and this need not imply actual reading. Recall the joke NPR pulled off for April Fool’s Day earlier this year. They posted an article to their Facebook page titled “Why Doesn’t America Read Anymore?” The article generated hundreds of comments and was shared thousands of times. There was, of course, no article. Those who clicked to read the piece were immediately informed that the article was a hoax, based on NPR’s sense that more than a few folks were commenting on their stories without actually taking the time to read them.

Of course, there is nothing particularly novel about the relationship between reading and the “self” and its presentation. There is only the question of the nature of that relationship. In the age of print, we also read, in part, so that we might be seen reading. For the bookish sorts, a bookshelf could be a kind of self-portrait; aspirational perhaps, but a self-portrait nonetheless. Even in the distant world of Hugh of St. Victor, reading and identity were intertwined.

“That which we mean today when, in ordinary conversation, we speak of the ‘self’ or the ‘individual,’ is one of the great discoveries of the twelfth century,” according to Illich.  Hugh of St. Victor, Illich continues, “wants the reader to face the page so that by the light of wisdom he shall discover his self in the mirror of the parchment.  In the page the reader will acknowledge himself not in the way others see him or by the titles or nicknames by which they call him, but by knowing himself by sight.”

All of this began with a simple, straightforward question: why do I read?

My answer will not be your answer, of course. In fact, it may be that neither of us has a very good answer to that question at all, or we may find that our answer to that question evolves over time.

The point, I think, is to occasionally ask the question.

The World of Tomorrow 75 Years Later

April 30th will mark the 75th anniversary of the opening of the 1939 New York World’s Fair. With a decade of Depression behind them and a world war looming ahead, 44 million visitors came to catch a hopeful glimpse of the future. The essay below, an earlier version of which first appeared on this site two years ago, explores the convergence of technology, utopian aspirations, and corporate power that animated the vision of the future visitors encountered 75 years ago. 

__________________________________________

“The World of Tomorrow”—that was the theme of the 1939 New York world’s fair. Prior to 1939, the American fairs had been characterized by what historian Astrid Böger has aptly called a “bifocal nature.” Janus-faced, they looked back to a glorified past and forward to an idealized future. The fairs were both “patriotic commemorations of central events in American history” and they “envisioned the nation’s bright future.” During the 1930s, however, the fairs turned their gaze decidedly toward the future.

The ’39 New York fair offered an especially grandiose and compelling glimpse of a techno-utopian society poised to materialize within a generation. Its most popular exhibits featured Cities of Tomorrow—Zions that were to be realized through technological expertise deployed by corporate power and supported by benign government planning. And little wonder these exhibits were so popular: the nation had been through a decade of economic depression and rumors of war swept across the Atlantic. “To catch the public imagination,” historian David Nye has explained, “the fair had to address this uneasiness. It could not do so by mere appeals to patriotism, by displays of goods that many people had no money to buy, or by the nostalgic evocation of golden yesterdays. It had to offer temporary transcendence.”

This link between technology and the realization of religiously intoned utopian visions did not, however, appear out of nowhere in the 1930s. In fact, the late cultural historian David Noble has argued convincingly that this religiously inspired techno-utopianism has been integral to the Western scientific project since at least the late middle ages; it was the central tenet of faith for what Noble called the “religion of technology.”

The planners of the 1939 fair instructed the industrial designers, who “looked not with the pragmatic eye of the engineer but with the visionary gaze of the utopian,” to weave technology throughout the fabric of the whole fair. In previous fairs and expositions, science had occupied a prominent but localized place among the multiple exhibits and attractions. The ‘39 fair intentionally broke with this tradition. As world’s fair historian Robert Rydell put it, “Instead of building a central shrine to house scientific displays,” the designers decided “to saturate the fair with the gospel of scientific idealism.” With nearly a decade of economic depression behind them and a looming international conflagration before them, the fair planners remained committed to the religion of technology and they were intent on creating a fair that would rekindle America’s waning faith. It may not be entirely inappropriate, then, to see the 1939 New York world’s fair as a revival meeting calling the faithful to renewed hope in the religion of technology. But the call to renewed faith in 1939 also contained variations on the theme. The presentation of the religion of technology took a liturgical turn and it was alloyed with the spirit of the American corporation.

GM Building designed by Albert Kahn and Norman Bel Geddes

Ritual Fairs

Historians of the world’s fair typically focus on the explicit message fair designers intended to communicate. They have studied the fairs as texts laid out for analysis. But it’s debatable whether this tells us much about the experience of fairgoers. Böger suggests a better way of understanding how the fairs made their impression. “World’s fairs,” she tells us, “are performative events in that they present a vision of national culture in the form of spectacle, which visitors are invited to participate in and, thus, help create.” Writing of the Ferris Wheel at the 1893 Columbian Exposition in Chicago, Böger explained that it was the “striking example of the sensual–primarily visual–experience of the fair, which seems to precede both understanding of the exhibit’s technology and, more importantly, appreciation of it as an American achievement.” What Böger picks up on in these observations is the distinction between the fair’s intellectual content and the embodied experience of attending the fair. It is the difference between reading the fairs as a “text” with an explicit message and constructing a meaning through the experience of “taking in” the fair. The planners intended an intellectualized, chiefly cognitive experience. Fairgoers processed the fair in an embodied and mostly affective manner. It is this distinction that leads to the observation that the religion of technology, as it appeared at the ‘39 fair, was a liturgical religion.

The genius of the two most popular exhibits at the fair was the embodiment of their message in a ritual experience. Democracity, housed inside the Perisphere, and General Motors’ Futurama both solved the problem of the impertinent walkers by miniaturizing the idealized world and carefully choreographing the fairgoer’s experience. Earlier fairs presented themselves as idealized cities, but this risked the diffusion of the message as fairgoers crafted their own fair itineraries or otherwise remained oblivious to the implicit messages. Democracity and Futurama mitigated this risk by crafting not only the world, but the experience itself–by providing a liturgy for the ritual. And the ritual was decidedly aimed at the cultivation of hope in a future techno-utopian society, giving ritual expression to the religion of technology.

As David Nye observed, “the most successful [exhibits] were those that took the form of dramas with covertly religious overtones.” In fact, Nye describes the whole fair as “a quasi-religious experience of escape into an ideal future equally accessible to all … The fair was a shrine of modernity.” Nowhere was the “quasi-religious” aspect of the fair more clearly evident than in Democracity, the miniature city of the future housed within the fair’s iconic Perisphere.

Fairgoers filed into the sphere and were able to gaze down upon the city of the future from two balconies. When the five-and-a-half minute show began, the narrator described the idealized miniature landscape featuring the city of the future at its center. Emanating outward from the central city were towns and farm country. The towns would each be devoted to specific industries, and they would be home to both workers and management. As the show progressed and the narrator extolled the virtues of central planning, the lighting in the sphere simulated the passage of day and night. Nye summarizes what followed:

“Once the visitors had contemplated this future world, they were presented with a powerful vision that one commentator compared to ‘a secular apocalypse.’ Now the lights of the city dimmed. To create a devotional mood, a thousand-voice choir sang on a recording that André Kostelanetz had prepared for the display. Movies projected on the upper walls of the globe showed representatives of various professions working, marching, and singing together. The authoritative voice of the radio announcer H. V. Kaltenborn announced: ‘This march of men and women, singing their triumph, is the true symbol of the World of Tomorrow.’”

What they sang was the theme song of the fair that proclaimed:

“We’re the rising tide coming from far and wide
Marching side by side on our way
For a brave new world,
That we shall build today.”

Kihlstedt believes Democracity’s designer, Henry Dreyfuss, modeled this culminating scene on the Flemish painter Jan van Eyck’s Ghent Altarpiece, featuring “a great multitude … of all nations and kindreds, and people” as described in the book of Revelation. “In this well-known painting,” Kihlstedt explains, “the saints converge toward the altar of the Lamb from the four corners of the world. As they reveal the unity and the ‘ultimate beatitude of all believing souls,’ these saints define by their presence a heaven on earth.” Ritual and interpretation were thus fused together in one visceral, affective liturgy.

Democracity, inside the Perisphere

Corporate Liturgies

Earlier fairs were driven by a variety of ideologies. Robert Rydell, arguably the leading historian of world’s fairs, has emphasized the imperial and racial ideologies driving the design of the Victorian Era fairs. These fairs also promoted political ideals and patriotism. Additionally, they sought to educate the public in the latest scientific trends (dubious as they may have been, as in the case of Social Darwinism for instance). But in the 1930s the emphasis shifted decidedly. Böger notes, for example, “the early American expositions have to be placed in the context of nationalism and imperialism, whereas the world’s fairs after 1915 went in the direction of globalism and the ensuing competition of opposing ideological systems rather than of individual nation states.” More specifically the fairs of the 1930s, and the 1939 fair especially, sought to buttress the legitimacy of democracy and the free market in the face of totalitarian and socialist alternatives.

“From the beginning,” Rydell observed, “the century-of-progress expositions were conceived as festivals of American corporate power that would put breathtaking amounts of surplus capital to work in the field of cultural production and ideological representation.” Kihlstedt put it this way: “whereas most nineteenth-century utopias were socialist, based on cooperative production and distribution of goods, the twentieth-century fairs suggested that utopia would be attained through corporate capitalism and the individual freedom associated with it.” He added, “the organizers of the NYWF were making quasi-propagandistic use of utopian ideas and imagery to equate utopia with capitalism.” For his part, Nye drew on Roland Marchand to connect the evolution of the world’s fairs with the development of corporate marketing strategies: “corporations first tried only to sell products, then tried to educate the public about their business, and finally turned to marketing visions of the future.” Nye also tied the ritual nature of the fairs with the corporate turn: “Such exhibits might be compared to the sacred places of tribal societies … Each inscribed cultural meanings in ritual … And who but the corporations took the role of the ritual elders in making possible such a reassuring future, in exchange for submission.”

In this way, the religion of technology was effectively incorporated. American corporations presented themselves as the builders of the techno-utopian city. With the cooperation of government agencies, corporations would wield the breathtaking power of technology to create a rationally planned yet democratic consumer society. Thus was the religion of technology enlisted by the marketing departments of American corporations.

Framing the 1939 New York World’s fair as an embodiment of the religion of technology highlights the convergence of technology, utopian aspirations, and corporate power at this pivotal cultural moment in American history. This convergence was taking shape before 1939, but at the New York fair it announced itself in memorable and compelling fashion. Through its imaginative liturgical experience, the fair renewed the faith of a generation of Americans in the religion of technology, and it was this generation that went on to build post-war American society.