Illuminating Contrasts

Patrick Leigh Fermor was widely regarded as the finest travel writer of the last century. He lived what was by all accounts a remarkably full life that included, to mention just two of the more striking episodes, a dalliance with a Romanian princess and the successful kidnapping of a German general in occupied Crete. Leigh Fermor struck up a lasting friendship with that same general when, as if his life were an Errol Flynn film, he completed in Latin a line from Horace that the German had begun in the midst of his capture.

Among Leigh Fermor’s storied travels, there was a period of several months in 1957 spent as a guest in a series of monasteries while he worked on his writing. He tells of his time in the monasteries in a small book, considered by many to be his best, A Time to Keep Silence.

Leigh Fermor began with a visit to the Abbey of St. Wandrille de Fontanelle in the northwest of France. While there, he experienced a profound recalibration of the rhythm and pace of life. Of his first days he writes:

“My first feelings in the monastery changed: I lost the sensation of circumambient and impending death, of being by mistake locked up in a catacomb. I think the alteration must have taken about four days. The mood of dereliction persisted some time, a feeling of loneliness and flatness that always accompanies the transition from urban excess to a life of rustic solitude. Here in the Abbey, in absolutely unfamiliar surroundings, this miserable bridge-passage was immensely widened.”

We have vague ideas about the monastic life, when we think of it at all, and we know that it must be quite different from our own, but, Leigh Fermor insists, “only by living for a while in a monastery can one quite grasp its staggering difference from the ordinary life that we lead.”

In his view, “The two ways of life do not share a single attribute; and the thoughts, ambitions, sounds, light, time and mood that surround the inhabitants of a cloister are not only unlike anything to which one is accustomed, but in some curious way, seem its exact reverse. The period during which normal standards recede and the strange new world becomes reality is slow, and, at first, acutely painful.”

“To begin with,” he continues, “I slept badly at night and fell asleep during the day, felt restless alone in my cell and depressed by the lack of alcohol, the disappearance of which had caused a sudden halt in the customary monsoon.”

He then goes on to describe a remarkable alteration in his circadian rhythm. It's worth quoting him at length at this point. Consider this carefully:

“The most remarkable preliminary symptoms were the variations of my need of sleep. After initial spells of insomnia, nightmare and falling asleep by day, I found that my capacity for sleep was becoming more and more remarkable: till the hours I spent in or on my bed vastly outnumbered the hours I spent awake; and my sleep was so profound that I might have been under the influence of some hypnotic drug. For two days, meals and the offices in the church — Mass, Vespers and Compline — were almost my only lucid moments. Then began an extraordinary transformation: this extreme lassitude dwindled to nothing; night shrank to five hours of light, dreamless and perfect sleep, followed by awakenings full of energy and limpid freshness. The explanation is simple enough: the desire for talk, movements and nervous expression that I had transported from Paris found, in this silent place, no response or foil, evoked no single echo; after miserably gesticulating for a while in a vacuum, it languished and finally died for lack of any stimulus or nourishment. Then the tremendous accumulation of tiredness, which must be the common property of all our contemporaries, broke loose and swamped everything. No demands, once I had emerged from that flood of sleep, were made upon my nervous energy: there were no automatic drains, such as conversation at meals, small talk, catching trains, or the hundred anxious trivialities that poison everyday life. Even the major causes of guilt and anxiety had slid away into some distant limbo and not only failed to emerge in the small hours as tormentors but appeared to have lost their dragonish validity.”

Reading Leigh Fermor’s account of his transition from the rhythms of ordinary life to the rhythms of monastic life reminded me of the testimonials one hears every so often from those who have undertaken a digital fast, a technology Sabbath, or a digital detox. There is, after all, something rather ascetic about most of that terminology. I was especially reminded of a 2010 story about five neuroscientists who made an experiment of a week-long rafting trip in Utah. David Strayer, one of the scientists in the party, coined the term “third-day syndrome” to describe the shifts in attitude and behavior that begin to manifest themselves after three days of being “unplugged” — quite near the four days Leigh Fermor suggests it took him to acclimate to life in the monastery.

One of the neuroscientists added, “Time is slowing down.” Another wondered out loud: “If we can find out that people are walking around fatigued and not realizing their cognitive potential … What can we do to get us back to our full potential?”

Leigh Fermor put this point more eloquently when he described “the tremendous accumulation of tiredness, which must be the common property of all our contemporaries.” Remember, he was writing in the late 1950s.

Whatever we think of the peculiar forms of fatigue that might be associated with the character of “digital life,” Leigh Fermor’s account of his journey into the silence of the monastic houses more than half a century ago reminds us that, if our bodies and minds are to be trusted, there has been something amiss with the way we order our lives since long before the advent of the Internet and smartphones.

I write that with some trepidation. Too often the identification of some historical precedent or antecedent to a modern malaise leads some to conclude something like, “Well, you see, people in the past have felt this too and they survived, so this must not really be a problem.” There ought to be a name for that sort of fallacy. Perhaps there is one and I’m unaware of it. I should, in fact, list the deployment of this line of reasoning as a symptom of a Borg Complex. In any case, it is shallow solace.

I rather think that such comparisons point us not to non-problems, but to perennial problems that take historically specific form and that each generation must address for itself. Leigh Fermor’s monastic retreat provided a ground against which the figure of mid-twentieth century urban life came into sharper relief.

Spans of time spent materially disconnected from the Internet may serve a similar function today. They may heighten our understanding of what is now the character of our ordinary life. It is its ordinariness, after all, that keeps us from seeing it clearly. Without a contrasting ground the figure does not appear at all, and the contrast may reveal where we’ve drifted into patterns and habits that work against our well-being.

Writing, Academic and Otherwise

Passing through the process of academic professionalization is, in part, not unlike the process of learning a new language. It is, for example, the sort of process that might lead me to write “discourse” in place of “language” to conclude that first sentence. Learning this new language can be both an infuriating and an exhilarating experience. At first, the new language mystifies, baffles, and frustrates; later, if one sticks with it and if this new language is not utter nonsense (as it may sometimes be), there is a certain thrill in being able to see and name previously unseen (because unnamed) and poorly understood dimensions of experience.

I suspect the younger one happens to be when this initiating process takes place, the more zealously one may take to this new language, allowing it to become the grid through which all experience is later comprehended. This is, on the whole, an unfortunate tendency. Another unfortunate tendency is that by which, over time, academics forget that theirs is a learned and often obscure language which they acquired only after months and possibly years of training. This is easily forgotten, perhaps because it is only metaphorically a new language. It is, if you are American, still English, but a peculiarly augmented (or deformed, depending on your perspective) form of the language.

This means, usually, that when academics (or academics in training) write, they write in a way that might not be easily assimilated by non-academics. This is, of course, entirely unrelated to intellect or ability (a point that is sometimes missed). A brilliant Spaniard, for instance, is no less brilliant for having never taken the time to learn Swahili. This is also a function of the tribal quality of academic life. One gets used to speaking the language of the tribe and sometimes forgets to adjust accordingly when operating in other contexts.

Again, I think this is very often simply a matter of habit and forgetfulness, although it is sometimes a matter of arrogance, self-importance, and other such traits of character.

I mention all of this because, if I were asked to verbalize why I write this blog, I would say that it was in part to translate the work of academics, critics, and theorists into a more accessible form so that their insights regarding the meaning and consequences of media and technology, so far as those insights were useful, might be more widely known. After all, the technologies I usually write about affect so many of us, academics and non-academics alike. Anyone who cares to think about how to navigate these technologies as wisely as possible should be able to encounter the best thinking on such matters in a reasonably accessible form. I don’t know, maybe there is a certain naïveté in that aspiration, but it seems worth pursuing.

I’m fairly certain, though, that I don’t always achieve this goal that I half-consciously maintain for what I do here. I’m writing this post mostly to remind myself of this aspiration and renew my commitment to it.

To be clear, I’m not talking about dumbing down what there is to know, nor am I suggesting that anything like condescension ought to be involved. The challenge is to maintain depth of insight, and to resist the over-simplification of complexity, while avoiding the characteristics of academic language that tend to make it inaccessible. It’s a matter of neither ignoring the non-academic reader nor failing to take them seriously.

I’m reminded of some comments that David Foster Wallace made regarding the purposes of literature. I’ve cited this passage before, quite some time ago, and it has stuck with me. It’s a bit long, but worth reading. Wallace is discussing literature with the interviewer, David Lipsky, and they are debating the relative merits of traditional literature and less traditional, more avant-garde writing:

Huh.  Well you and I just disagree.  Maybe the world just feels differently to us.  This is all going back to something that isn’t really clear:  that avant-garde stuff is hard to read.  I’m not defending it, I’m saying that stuff — this is gonna get very abstract — but there’s a certain set of magical stuff that fiction can do for us.  There’s maybe thirteen things, of which who even knows which ones we can talk about.  But one of them has to do with the sense of, the sense of capturing, capturing what the world feels like to us, in the sort of way that I think that a reader can tell “Another sensibility like mine exists.”  Something else feels this way to someone else.  So that the reader feels less lonely.

There’s really really shitty avant-garde, that’s coy and hard for its own sake.  That I don’t think it’s a big accident that a lot of what, if you look at the history of fiction — sort of, like, if you look at the history of painting after the development of the photograph — that the history of fiction represents this continuing struggle to allow fiction to continue to do that magical stuff.  As the texture, as the cognitive texture, of our lives changes.  And as, um, as the different media by which our lives are represented change.  And it’s the avant-garde or experimental stuff that has the chance to move the stuff along.  And that’s what’s precious about it.

And the reason why I’m angry at how shitty most of it is, and how much it ignores the reader, is that I think it’s very very very very precious.  Because it’s the stuff that’s about what it feels like to live.  Instead of being a relief from what it feels like to live.

Maybe it is ill-advised to make this comparison, but I think what Wallace has to say here, or at least the spirit of it, can apply to academic work as well. It, too, can be a way of representing what it feels like to be alive. I tend to hold literature in rather high esteem, so I don’t think that non-fiction can really replicate the experience of more literary writing, but it can be useful in its own way. It can help make sense of experience. It can generate self-understanding. It can suggest new possibilities for how to make one’s way in the world.

It’s too late for new year’s resolutions, but I’m hoping to keep this goal more clearly in focus as I continue to write on here. You can tell me, if you’re so inclined, how well I manage. Cheers.

From Memory Scarcity to Memory Abundance

The most famous section in arguably the most famous book about photography, Roland Barthes’ Camera Lucida, dwells on a photograph of Barthes’ recently deceased mother taken in a winter garden when she was a little girl. On this picture, Barthes hung his meditative reflections on death and photography. The image evoked both the “that-has-been” reality of the subject, and the haunting “this-will-die” realization. That one photograph of his mother is also the only image discussed by Barthes that was not reproduced in Camera Lucida. It was too personal. It conveyed something true about his mother, but only to him.

But what if Barthes had not a few, but hundreds or even thousands of images of his mother?

I’ve long thought that what was most consequential about social media was their status as prosthetic memories. A site like Facebook, for example, is a massive archive of externalized memories preserved as texts and images. For this reason, it seemed to me, it would be unbearably hard to abandon such sites, particularly for those who had come of age with and through them. These archives bore too precious a record of the past to be simply deleted with a few clicks. I made this argument as late as last night.

But now I’ve realized that I had not fully appreciated the most important dynamic at play. I was operating with assumptions that were formed during an age of relative memory scarcity, but digital photography and sites like Facebook have brought us to an age of memory abundance. The paradoxical consequence of this development will be the progressive devaluing of such memories and the severing of the past’s hold on the present. Gigabytes and terabytes of digital memories will not make us care more about those memories; they will make us care less.

We’ve seen the pattern before. Oral societies, which had few and relatively inefficient technologies of remembrance at their disposal, lived to remember. Their cultural lives were devoted to ritual and liturgical acts of communal remembering. The introduction of writing, a comparatively wondrous technology of remembrance, gradually released the individual from the burdens of cultural remembrance. Memory that could be outsourced, as we say, or offloaded could also be effectively forgotten by the individual, who was then free to remember their own history. And it is to this task that subsequent developments in the technology of remembrance have been put. The emergence of cheap paper coupled with rising rates of literacy gave us the diary and the box of letters. Photography and film were also put to the task of documenting our lives. But until recently, these technologies were subject to important constraints. The recording devices were bulky and cumbersome, and they were limited in capacity by the number of exposures on a roll of film and the length of tape on a reel. There were also important practical constraints on storage and access. Digital technologies have burst through these constraints, and they have not yet reached their potential.

Now we carry relatively unobtrusive devices of practically unlimited recording capacity, and these are easily linked to archives that are likewise virtually unlimited in their capacity to store and organize these memories. If we cast our vision into the not altogether distant nor fantastical future, we can anticipate individuals engaging with the world through devices (e.g., Google Glass) that will both augment the physical world by layering it with information and generate a near continuous audio-visual record of our experience.

Compared to these present and soon-to-be technologies, the 35mm camera at my disposal through the ’80s and ’90s seems primitive. On a spectrum measuring the capacity to document and archive memories, I was then closer to my pre-modern predecessors than to the generation that will succeed me.

Roland Barthes’ near mystical veneration of his mother’s photograph, touching as it appears to those of us who lived in the age of memory scarcity, will seem quixotic and quaint to those who have known only memory abundance. Barthes will seem to them as the medievals who venerated the physical book seem to us. They will be as indifferent to the photograph, and the past it encodes, as we are to the cheap paperback.

It may seem, as it did to me, that social media revived the significance of the past by reconnecting us with friends we would have mostly forgotten and reconstituting habits of social remembering. I’d even expressed concerns that social media might allow the past to overwhelm the present rendering recollection rather than suppression traumatic. But this has only been an effect of novelty upon that transitional generation who had lived without the technology and upon whom it appeared in medias res. For those who have known only the affordances of memory abundance, there will be no reconnection with long forgotten classmates or nostalgic reminiscences around a rare photograph of their youth capturing some trivial, unremembered moment. It will all be documented and archived, but it will mean not a thing.

It will be Barthes’ contemporary, Andy Warhol, who will appear as one of us. In his biography of Warhol, Victor Bockris writes,

Indeed, Andy’s desire to record everything around him had become a mania.  As John Perrault, the art critic, wrote in a profile of Warhol in Vogue:  “His portable tape recorder, housed in a black briefcase, is his latest self-protection device.  The microphone is pointed at anyone who approaches, turning the situation into a theater work.  He records hours of tape every day but just files the reels away and never listens to them.”

Andy Warhol’s performance art will be our ordinary experience, and it is that last line that we should note — “… and never listens to them.”

Reconsider Plato’s infamous critique of writing. Critics charge Plato with shortsightedness because he failed to see just how much writing would in fact allow us to remember. But from a different perspective, Plato was right. The efficient and durable externalization of memory would make us personally indifferent to remembrance. As the external archive grows, our personal involvement with the memory it stores shrinks in proportion.

Give me a few precious photographs, a few minutes of grainy film and I will treasure them and hold them dear. Give me one terabyte of images and films and I will care not at all.

In the future, we will float in the present untethered from the past and propelled listlessly onward by the perpetual stream of documentary detritus we will emit.

Macro-trends in Technology and Culture

Who would choose cell phones and Twitter over toilets and running water? Well, according to Kevin Kelly, certain rural Chinese farmers. In a recent essay exploring the possibilities of a post-productive economy, Kelly told of the remote villages he visited in which locals owned cell phones but lived in houses without even the most rudimentary forms of plumbing. It is a choice, Kelly notes, deeply influenced by tradition and culture. Kelly’s point may not be quite unassailable, but it is a fair reminder that technology is a culturally mediated phenomenon.

There are, generally speaking, two schools of thought on the relationship between technology and culture. Those tending toward some variety of technological determinism would argue that technology drives culture. Those who tend toward a social constructivist view of technology would argue the opposite. Ultimately, any theory of technology must account for the strengths and weaknesses of both of these tendencies. In fact, the framing of the relationship is probably problematic anyway since there are important ways in which technology is always cultural and culture is always technological.

For the purposes of this post, I’d like to lean toward the social constructivist perspective. No technology appears in a vacuum. Its origins, evolution, adoption, deployment, and diffusion are all culturally conditioned. Moreover, the meaning of any technology is always culturally determined; it is never simply given in the form of the technology itself. Historians of technology have reminded us of this reality in numerous fascinating studies — studies of the telephone, for example, and the airplane, the electric grid, household technologies, and much else besides. When a new technology appears, it is interpreted and deployed within an already existing grid of desires, possibilities, necessities, values, symbols, expectations, and constraints. That a technology may re-order this grid in time does not negate the fact that it must first be received by it. The relationship is reciprocal.

If this is true, then it seems to me that we should situate our technologies not only within the immediate historical and social context of their genesis, but also within broader and more expansive historical trajectories. Is our use of computer technology, for example, still inflected by Baconian aspirations? What role do Cartesian dualisms play in shaping our relationship with the world through our technologies? To what degree does Christian eschatology inform technological utopianism? These seem to be important questions, the answers to which might usefully inform our understanding of the place of technology in contemporary society. Of course, these particular questions pertain especially to the West. I suspect another set of questions would apply to non-Western societies and still further questions would be raised within the context of globalization. But again, the basic premise is simply this: a given technology’s social context is not necessarily bounded by its immediate temporal horizons. We ought to be taking the long view as well.

But the rhythms of technological change (and the logic of the tech industry) would seem to discourage us from taking the long view, or at least the long view backwards in time. The pace of technological change over the last two hundred years or so has kept us busy trying to navigate the present, and its trajectory, real and ideal, casts our vision forward in the direction of imagined futures. But what if, as Faulkner quipped, the past with regards to technology is not dead or even past?

I’m wondering, for instance, about these large motive forces that have driven technological innovation in the West, such as the restoration of Edenic conditions or the quest for rational mastery over the natural world leading to the realization of Utopia. These early modern and Enlightenment motive forces directed and steered the evolution of technology in the West for centuries, and I do not doubt that they continue to exert their influence still. Yet, over the last century and a half, Western society has undergone a series of profound transformations. How have these shaped the evolution of technology? (The inverse question is certainly valid as well.) This is, I suppose, another way of asking about the consequences of post-modernity (which I distinguish from postmodernism) for the history of technology.

In a provocative and compelling post a few months ago, Nick Carr drew an analogy between the course of technological innovation and Maslow’s hierarchy of needs:

“The focus, or emphasis, of innovation moves up through five stages, propelled by shifts in the needs we seek to fulfill. In the beginning come Technologies of Survival (think fire), then Technologies of Social Organization (think cathedral), then Technologies of Prosperity (think steam engine), then technologies of leisure (think TV), and finally Technologies of the Self (think Facebook, or Prozac).”

I continue to find this insightful, and I think the angle I’m taking here dovetails with Carr’s analysis. The technologies of Prosperity and Leisure correspond roughly to the technologies of modernity. Technologies of the Self correspond roughly to the technologies of post-modernity. Gone is our faith in les grands récits that underwrote a variety of utopian visions and steered the evolution of technology. We live in an age of diminished expectations; we long for the fulfillment of human desires writ small. Self-fulfillment is our aim.

This is, incidentally, a trajectory that is nicely illustrated by Lydia DePillis’ suggestion that the massive Consumer Electronics Show “is what a World’s Fair might look like if brands were more important than countries.” The contrast between the world’s fairs and the CES is telling. The world’s fairs, especially those that preceded the 1939 New York fair, were quite obviously animated by thoroughly modern ideologies. They were, as President McKinley put it, “timekeepers of progress,” and one might as well capitalize Progress. On the other hand, whatever we think of the Consumer Electronics Show, it is animated by quite different and more modest spirits. The City of Tomorrow was displaced by the entertainment center of tomorrow before giving way to the augmented self of tomorrow.

Why did technological innovation take this path? Was it something in the nature of technology itself? Or was it rather a consequence of larger sea changes in the character of society? Maybe a little of both, but probably more of the latter. It’s possible, of course, that this macro-perspective on the co-evolution of culture and technology can obscure important details and result in misleading generalizations, but if those risks can be mitigated, it may also unveil important trends and qualities that would be invisible to more narrowly focused analysis.

Sex Sells Tech, Still

Yesterday, The New Republic posted “Baudrillard and Babes at the Consumer Electronics Show,” Lydia DePillis’ account of her time at this year’s Consumer Electronics Show. I found the piece slightly less engaging than Matt Honan’s reflections from around this time last year on the 2012 show; reflections which, because they homed in on the fabrication of desire, spoke to something more abiding than the electronics on display. (For links to Honan’s piece, Kevin Kelly’s response, and my own two cents, see “Hole In Our Hearts.”)

But DePillis’ reporting was not without its own commentary on desire. Interestingly, she suggested that, “CES is what a World’s Fair might look like if brands were more important than countries.” This is not far from the mark, but, in fact, as early as 1939, this was already true of the world’s fairs; pavilions by Ford, GE, GM, and other large companies were the real stars of the show. But the fairs provided another interesting/depressing antecedent to the CES that DePillis could have noted.

DePillis devotes a few paragraphs to the infamous Booth Babes at CES, and I’ll quote her at length:

MUCH OF THE CES sales force is made up of company staff, who’ve flown in for the show. A good chunk of it, however, is employed for three days only. This latter group, it can safely be said, are hired more for their come-hither qualities than their knowledge of electronics. CES is only marginally less male dominated today than it was in the 1970s. Booth Babes (“brand ambassadors,” in industry parlance) have been an integral part of the show for decades. Regulars had noticed a downturn in recent years, and criticism reached a crescendo in 2012, as the head of the CEA failed to even acknowledge that there might be something mildly distasteful and backwards about using women as props—especially for an industry ostensibly forging into the future.

This year, however, the babes were out in force. At the Chinese phonemaker ZTE, they wore white fur and flesh-colored heels over silver sheath dresses; one stood by the door of a tiny room, luring showgoers in to dance around in front of a green screen that demonstrated a new video technology. At xi3’s booth, women in black catsuits and heavy eye makeup explained the virtues of their employer’s compact servers over pounding hip-hop. Some of the babes aren’t even stationed at booths, instead roaming around in guerrilla bands promoting products.

If CES’s geeky male regulars are so inured to electronics promotions that they don’t crane their necks, it’s not apparent. “I need a t-shirt like that,” leered one portly man to his companion upon spotting a troop of women, their midriff-baring t-shirts emblazoned with #me on the front and a web service’s URL on the back. “I need what’s in the t-shirt,” his friend cracked back.

Sometimes, it’s not the babes’ job to talk at all. Two statuesque women in red gowns simply stood at the entrance to LG’s show floor all day, their faces impassive. At SIGMA Photo, a model teetered on high pumps and gyrated, her long hair disheveled, as men took turns taking her picture with a giant camera. At Hyper, which sells well-designed battery chargers and external hard drives, I wondered what the four models wearing only g-strings and body paint thought about for hours on end, as people posed for pictures in front of them.

“We just zone out,” said the orange-and-blue girl as she left the booth in sweatpants at the end of the day. “We’re used to being looked at,” explained her silver-painted companion.

It’s a living. The ladies—and some men—list themselves with agencies or just on free sites like Craigslist or ModelMayhem.com, which lets prospective employers shop by waist and cup size and whether you’re willing to pose nude. Trade shows pay between $20 and $50 an hour, they told me. CES has such demand that companies fly models in from California and further afield, putting them up at downmarket places like the Excalibur. “If money’s no object, they want who they want,” said one smokey-eyed platinum blonde from Indianapolis, who’d been told to wear “business attire” for her gig with a maker of protective gadget cases. “The Supertooth booth, over there, they hired the Fantasy Strippers.”

In this respect, the CES is exactly like the world’s fairs.* Here is historian Robert Rydell describing the role played by world’s fairs in transitioning American society from an economy of production to one of consumption:

“Fundamental to this effort was an assault on remaining vestiges of values that were associated with what some historians have called a ‘culture of production.’ To hasten the dissolution of this older emphasis on restraint and inhibition, already under siege by world’s fairs at the beginning of the century and by the steady barrage of advertising that saturated the country during the 1920s, world’s fair builders injected their fantasies of progress with equally heavy doses of technological utopianism and erotic stimulation.”

We’re more familiar with the technological utopianism of the world’s fairs; the manner in which this technological utopianism was alloyed to erotic representations is less commonly noted. For example, Norman Bel Geddes, who famously designed Futurama, the fair’s most popular exhibit, also designed “Crystal Lassies,” “A Peep Show of Tomorrow.” Rydell continues:

“As if to liberate these fantasies from their Victorian moorings, exposition promoters gave increasing prominence to female striptease performances on exposition midways that, by the end of the decade, gave way to fully nude female performers in shows replete with world-of-tomorrow themes.”

Of course, this makes a great deal of sense. Chastity is to sexual desire what thrift is to economic desire. Rydell goes on:

“By suffusing the world of tomorrow with highly charged male sexual fantasies, the century-of-progress expositions not only reconfirmed the status of women as objects of desire, but represented their bodies as showcases that perfectly complemented displays of futuristic consumer durables everywhere on exhibit at the fairs.”

We know sex sells. This is a commonplace in our society. But we often think it operates by association. Pair the model with the car and somehow the attraction to the model will infuse the car. Perhaps. But some marketers appear to have understood the relationship somewhat differently. Eliminate restraint in one domain and you will eliminate it in the other as well.

In any case, sex and technology have been paired in the modern (predominantly male) imagination for quite some time. A dynamic, I might add, which was also on display, though less commented on, in the photo spread for Kevin Kelly’s much discussed essay on our robotic future in Wired. In this respect, DePillis, like Honan, found something quite universal at play among the ephemera of the Consumer Electronics Show.

__________________________________________________

*Most of the preceding three paragraphs are taken from an earlier post.

Photo by Peter Yang