Promethean Shocks

Consider the image below. It was created in 1952 by Alexander Leydenfrost for the 50th anniversary issue of Popular Mechanics.

Alexander Leydenfrost, “March of Science”

I thought of this image as I read Thomas Misa’s brief discussion of the widespread perception that the pace of technological change is ever-quickening. “At least since Alvin Toffler’s best-selling Future Shock (1970),” Misa writes, “pundits perennially declare that the pace of technology is somehow quickening and that technology is forcing cultural changes in its wake, that our plunge into the future is driven by technology gone out of control.”

Misa, a historian of technology, is not altogether certain that the pace of technological change has in fact quickened. He is certainly opposed to the “crude technological determinism” inherent in the idea that technology is forcing cultural changes. He does, however, grant some merit to the experience that this language is so often used to describe. He attributes the perception of quickening to “a split between our normative expectations for technology and what we observe in the world around us.” “It is not so much that our technologies are changing especially quickly,” he explains, “but that our sense of what is ‘normal,’ about technology and society, cannot keep pace.”

According to Misa, developments in cloning, biotechnology, surveillance technologies, and nanotechnology are outstripping regulatory laws and ethical norms. This seems true enough. We might even describe the condition of modernity (and/or post-modernity, or hyper-modernity, or reflexive modernity, or whatever) as one of perpetual Promethean shock. This way of putting it seems more apt than “future shock.” One way of telling the story of modernity, after all, is to order it as a series of startling realizations of what we suddenly acquired the power to do. As Auden wrote of the modern Mind,

“Though instruments at Its command
Make wish and counterwish come true,
It clearly cannot understand
What It can clearly do.”

Clearly, certain technological innovations yield this dizzying experience of Promethean shock more than others. The advent of human flight, the atomic bomb, and putting a man on the moon are among the examples that were at once startling, awe-inspiring, and disconcerting. When these technologies arrived, in rather quick succession, they rattled and unsettled the reigning cultural and moral frameworks, tacit as they may have been, for incorporating new technologies into existing society.

These normative cultural expectations for technology were set, Misa suggests, during the “longer-duration ‘eras’” of technology which he identifies in his history of the relationship between technology and culture. These eras included, for example, the age of the Renaissance, when the demands of Renaissance court culture set the agenda for technological change, and the age of imperialism, when the imperatives of empire dictated the dominant agenda. The former era, in Misa’s analysis, spanned nearly two centuries, and the latter the better part of one; but the 20th century is home to at least four different eras, which Misa labels the eras of systems, modernism, war, and global culture respectively.

Misa is on to something here, but he seems to push the question back rather than answering it directly. Why then are these eras, tentative and suggestive as he acknowledges them to be, getting progressively shorter? He is closer to the mark, I think, when he also suggests that the sense of quickening is linked to another “quickening” in the “self-awareness of societies,” which he further defines as “our capacities to recognize and comprehend change.”

His brief illustration of this development is initially compelling. Two centuries elapsed before the Renaissance was named. Within 50 years of the appearance of factories in Manchester, the phrase industrial revolution was in use. Just four years after the telephone arrived in Moscow, Chekhov published a short story, “On the Telephone,” about the travails of an early adopter trying to make a restaurant reservation over the phone (fodder for a Seinfeld plot, if you ask me). Most recently, William Gibson gave us the term cyberspace “almost before the fact of cyberspace.” This sequence suggests to Misa that Western society has become increasingly self-aware of technological change and its consequences.

I think this claim has merit. The modern Mind that, according to Auden, “cannot understand / What It can clearly do” is the same Mind he described as the “self-observed, observing Mind.” But, again, we might be tempted to ask why our “self-awareness” is accelerating in this way. Might it not be attributed, at least in part, to an increase in the rapidity of technological development? The two seem inseparable. I’m reminded of Walter Ong’s dictum about writing: “Writing heightens consciousness.” In a slightly different way, might we not argue that technological change heightens societal self-awareness?

Consider again the Leydenfrost image above. It captures an important aspect of how we’ve come to understand our world: we have aligned our reckoning of the passage of time with the development of technology. We have technologies that mark time, but in another sense new technologies themselves mark time by their appearance, iteration, and obsolescence.

Human beings have long sought markers to organize the experience of time, of course. For a day, sunrise and sunset served just fine. The seasons, too, helped order the experience of a year. For longer periods, however, cultural rather than natural markers were needed. Consider, for instance, the common practice in the ancient world of reckoning time by the reigns of monarchs. “In the xth year of so-and-so’s reign” is a recurring temporal formula.

Without achieving that sort of verbal formality or precision, I’d suggest that the development of technology–the reign, if you will, of certain technologies and artifacts–now does similar work for modern societies. Records, 8-tracks, tapes, CDs, MP3s. Desktops, laptops, tablets. Landline, portable landline, cellphone, smartphone. Black and white TV, color TV, projection TV, flatscreen TV. Pre-Internet/post-Internet. Dial-up/broadband/wireless. I suspect you can supply similar artifactual chronologies that have structured our recollection of the past. We seem to have synchronized our perception and experience of time to the cycles of technological innovation.

The Leydenfrost image also reminds us that insofar as the notion of progress exists at all today, it is clearly bound up with the advance of technology. All other forms of progress that we might imagine or aspire to–moral, economic, social–are subsumed under the notion of technological progress. For that reason, rumors or suggestions that technological innovation might be slowing down unnerve us. We need the next big thing to keep coming on schedule, however trivial that next big thing might be, to distract us from our economic, political, and personal woes.

This was illustrated nicely by the minor tech-media freakout occasioned recently by a Christopher Mims piece at Quartz arguing that 2013 was a lost year for tech. It was received as a heretical claim, and it was promptly and roundly condemned. “I, too, constantly yearn for mind-blowing new tech,” Farhad Manjoo tells us in his reply to Mims before reassuring us: “I think we’re witnessing the dawn of a new paradigm in machine-human cooperation: Combining machine intelligence with biological intelligence will always trump one or the other. Machines make us better, and we make machines better. There’s still hope for us. Welcome to the bionic future.”

The world may be falling apart around us, but we can bear it so long as we can project our hopes on the amorphous promise of technological advance. The prospect of a host of Promethean shocks that we seem poised to receive–from drones, robotics, AI, bioengineering, geoengineering, nanotechnology–makes us nervous and unsettles our moral frameworks; but their absence would worry us more, I suspect.

 

The Triumph of Time

From philosopher Edward S. Casey’s Getting Back Into Place:

“‘Time will tell’: so we say, and so we believe, in the modern era. This era, extending from Galileo and Descartes to Husserl and Heidegger, belongs to time, the time when Time came into its own. Finally. And yet, despite the fact that history, human and geological alike, occurs in time (and has been doing so for quite a while), time itself has not long been singled out for separate scrutiny …. By the middle of the eighteenth century, time had become prior to space in metaphysics as well as in physics. If, in Leibniz’s elegant formula, ‘space is the order of co-existing things,’ time proved to be more basic: as ‘the order of successive things,’ it alone encompasses the physical world order. By the moment when Kant could assert this, time had won primacy over space. We have been living off this legacy ever since, not only in philosophy and physics but in our daily lives as well.

“These lives are grasped and ordered in terms of time. Scheduled and overscheduled, we look to the clock or the calendar for guidance and solace, even judgment! But such time-telling offers precious little guidance, no solace whatsoever, and a predominantly negative judgment (‘it’s too late now’) … We are lost because of our conviction that time, not only the world’s time but our time, the only time we have, is always running down.”

Casey’s project may be described as a phenomenologically inflected recovery of place. But he begins by describing the triumph of time. Tell me whether this does not resonate deeply with your experience:

“The pandemic obsession with time from which so many moderns have suffered — and from which so many postmoderns still suffer — is exacerbated by the vertiginous sense that time and the world-order, together constituting the terra firma of modernist solidity, are subject to dissolution. Not surprisingly, we objectify time and pay handsome rewards … to those who can tie time down in improved chronometry. Although the modern period has succeeded brilliantly in this very regard, it has also fallen into the schizoid state of having made objective, as clock-time and world-time, what is in fact most diaphanous and ephemeral, most ‘obscure’ in human experience. We end by obsessing about what is no object at all. We feel obligated to tell time in an objective manner; but in fact we have only obliged ourselves to do so by our own sub rosa subreptions, becoming thereby our own pawns in the losing game of time.”

Twitter Time

“Twitter relies on people’s desire to be the same.” At least that’s what A. C. Goodall claims in a recent New Statesman article, “Is Twitter the Enemy of Self-Expression?”  This is, it would seem, a rather vague and unsubstantiated claim.  In his brief comments, Alan Jacobs writes that Goodall’s piece amounts to “assertions without evidence.”  Jacobs goes on to argue that it is unhelpful to make sweeping claims about something like Twitter which is “a platform and a medium,” rather than an organized, coherent unit with an integral “character.”  A medium or platform is subject to countless implementations by users, and, as the history of technology has shown, these uses are often surprising and unexpected.

On the whole I’m sympathetic to Jacobs’s comments.  His main point echoes Michel de Certeau’s insistence that we pay close attention to the use that consumers make of products.  In his time, the critical focus had fallen on the products and producers; consumers were tacitly assumed to be passive and docile recipients/victims of the powers of production.  De Certeau made it a point, especially in The Practice of Everyday Life, to throw light on the multifarious, and often impertinent, uses to which consumers put products.  Likewise, Jacobs is reminding us that generalizations about a medium can be misleading and unhelpful because users put any medium to widely disparate ends.

This is a fair point.  However (and if there weren’t a “however” I wouldn’t be writing this), I’m a bit of a recalcitrant McLuhanist and tend to think that the medium may have its influence regardless of the uses to which it is put.  Perhaps I might better label myself an Aristotelian McLuhanist, which is to say that I tend toward localizing the impact of a medium in the realm of habit and inclination.  The use of a medium over time creates certain habits of mind and body.  These habits of mind and body together yield, in my own way of using this language, a habituated sensibility.  The difficulty this influence poses to critique is that, precisely because it is habituated, it tends to operate below the level of conscious awareness.

I don’t think the focus on use and the attention to the effects of a medium are necessarily mutually exclusive.  Habits after all are only formed through significant and repeated use.   Perhaps they are two axes of a grid on which the impact of technology may be plotted. In any case, it would help to provide an example.

Consider our experience of time.  It seems that the human experience of time, how we sense and process the passage of time, is not a fixed variable of human nature.  My sense is that we habituate ourselves to a certain experience of time and it is difficult to immediately adjust to another mode.  Consider those rare moments when we find ourselves having nothing to do.  How often do we then report that we were unable to just relax; we had the urge to do something, anything.  We were restless precisely at the moment when we could have taken a rest.   Or, at a wider scale, consider the various ways cultures approach time.  We tend to naturalize the Western habits of precise time keeping and partitioning until we enter another culture which operates by a very different set of attitudes toward time.  It would take something much longer than a blog post to explore this fully, but it would seem plausible that certain technologies — some, like the mechanical clock, very old — mold our experience of time.

Bernard Stiegler has commented along similar lines on the media environment and the consequent experience of time fostered by television.  To begin with, he notes, going back to the establishment of the first press agency in Paris in 1835 near a new telegraph, that the “value of information as commodity drops precipitously with time …”  He goes on to describe industrial time in the following context:

“…. an event becomes an event — it literally takes place — only in being ‘covered.’  Industrial time is always at least coproduced by the media.  ‘Coverage’ — what is to be covered — is determined by criteria oriented toward producing surplus value.  Mass broadcasting is a machine to produce ready-made ideas, ‘cliches.’  Information must be ‘fresh’ and this explains why the ideal for all news organs is the elimination of delay in transmission time.”

To be sure, more than the logic of the medium is at play here, but it may be difficult and beside the point to parse out the logic of the medium from other factors.

The ability to eliminate the delay between event and transmission that characterized industrial time has been radically democratized by digital media.  We are all operating under these conditions now.  You may vaguely remember, by contrast, the time that elapsed between snapping a picture, getting it developed, and finally showing it to others.  That time has collapsed, not only for large news organizations, but for anyone with an internet-enabled smartphone.  In the interest of creating catchy labels, perhaps we may call this, not industrial time, but Twitter time.  “Twitter” here is just a synecdoche for the ability to immediately capture and broadcast information, an ability that is now widely available.  My guess is that this capacity, admittedly used in various ways, will affect the sensibility that we label our “experience of time.”

Stiegler continues (with my apologies for subjecting you to the rather dense prose):

“With an effect of the real (of presence) resulting from the coincidence of the event and its seizure and with the real-time or ‘live’ transmission resulting from the coincidence of the event and its reception, a new experience of time, collective as well as individual, emerges.  This new time betokens an exit from the properly historical epoch, insofar as the latter is defined by an essentially deferred time — that is, by a constitutive opposition, posited in principle, between the narrative and that which is narrated.  This is why Pierre Nora can claim that the speed of transmission of analog and digital transmissions promotes ‘the immediate to historical status’:

‘Landing on the moon was the model of the modern event.  Its condition remained live retransmission by Telstar . . . . What is proper to the modern event is that it implies an immediately public scene, always accompanied by the reporter-spectator, who sees events taking place.  This ‘voyeurism’ gives to current events both their specificity with regard to history and their already historical feel as immediately out of the past.’”

There is a lot to unpack in all of that.  We are all reporter-spectators now.  Deferred time, time between event and narration, is eclipsed. Everything is immediately “out of the past,” or, at least as I understand it, the whole of the past is collapsed into a moment that is not now.  The earthquake and tsunami in Japan, just two months past, might as well have taken place five years ago.  The killing of bin Laden, likewise, will very soon appear to be buried in the indiscriminate past.

Twitter as a medium, used to the point of fostering a habituated sensibility (but regardless of particularized uses), would seem to accelerate this economy of time and expand its province into private life.  It doesn’t create this economy of time, but it does heighten and reinforce its trajectory.  In fact, the relentless flow of the Twitter “timeline” (not an insignificant designation), or better, our effort to keep up with it and make sense of it, may be an apt metaphor for our overall experience of time.

All of this to say that while a medium or platform can be used variously and flexibly, it is not infinitely malleable; a certain underlying logic is more or less fixed, and this logic has its own consequences.  Of course, none of this necessarily amounts to saying Twitter is “bad,” only to noting that its use can have consequences.

Speaking of habit, I’m curious whether anyone felt the urge to click the “1 New Tweet” image.