From Memory Scarcity to Memory Abundance

The most famous section in arguably the most famous book about photography, Roland Barthes’ Camera Lucida, dwells on a photograph of Barthes’ recently deceased mother taken in a winter garden when she was a little girl. On this picture, Barthes hung his meditative reflections on death and photography. The image evoked both the “that-has-been” reality of the subject and the haunting “this-will-die” realization. That one photograph of his mother is also the only image discussed by Barthes that was not reproduced in Camera Lucida. It was too personal. It conveyed something true about his mother, but only to him.

But what if Barthes had not a few, but hundreds or even thousands of images of his mother?

I’ve long thought that what was most consequential about social media was their status as prosthetic memories. A site like Facebook, for example, is a massive archive of externalized memories preserved as texts and images. For this reason, it seemed to me, it would be unbearably hard to abandon such sites, particularly for those who had come of age with and through them. These archives bore too precious a record of the past to be simply deleted with a few clicks. I made this argument as late as last night.

But now I’ve realized that I had not fully appreciated the most important dynamic at play. I was operating with assumptions formed during an age of relative memory scarcity, but digital photography and sites like Facebook have brought us into an age of memory abundance. The paradoxical consequence of this development will be the progressive devaluing of such memories and the severing of the past’s hold on the present. Gigabytes and terabytes of digital memories will not make us care more about those memories; they will make us care less.

We’ve seen the pattern before. Oral societies, which had few and relatively inefficient technologies of remembrance at their disposal, lived to remember. Their cultural lives were devoted to ritual and liturgical acts of communal remembering. The introduction of writing, a comparatively wondrous technology of remembrance, gradually released the individual from the burdens of cultural remembrance. Memory that could be outsourced, as we say, or offloaded could also be effectively forgotten by the individual, who was then free to remember their own history. And it is to this task that subsequent developments in the technology of remembrance have been put. The emergence of cheap paper, coupled with rising rates of literacy, gave us the diary and the box of letters. Photography and film were likewise put to the task of documenting our lives. But until recently, these technologies were subject to important constraints. The recording devices were bulky and cumbersome, and they were limited in capacity by the number of exposures on a roll of film and the length of tape on a reel. There were also important practical constraints on storage and access. Digital technologies have burst through these constraints, and they have not yet reached their full potential.

Now we carry relatively unobtrusive devices of practically unlimited recording capacity, and these are easily linked to archives that are likewise virtually unlimited in their capacity to store and organize these memories. If we cast our vision into a future neither altogether distant nor fantastical, we can anticipate individuals engaging with the world through devices (e.g., Google Glass) that will both augment the physical world by layering it with information and generate a near-continuous audio-visual record of our experience.

Compared to these present and soon-to-be technologies, the 35mm camera at my disposal through the ’80s and ’90s seems primitive. On a spectrum measuring the capacity to document and archive memories, I was then closer to my pre-modern predecessors than to the generation that will succeed me.

Roland Barthes’ near-mystical veneration of his mother’s photograph, touching as it appears to those of us who lived in the age of memory scarcity, will seem quixotic and quaint to those who have known only memory abundance. Barthes will seem to them as those medievals who venerated the physical book seem to us. They will be as indifferent to the photograph, and the past it encodes, as we are to the cheap paperback.

It may seem, as it did to me, that social media revived the significance of the past by reconnecting us with friends we would have mostly forgotten and reconstituting habits of social remembering. I’d even expressed concerns that social media might allow the past to overwhelm the present, rendering recollection, rather than suppression, traumatic. But this has only been an effect of novelty upon a transitional generation that had lived without the technology and upon whom it appeared in medias res. For those who have known only the affordances of memory abundance, there will be no reconnection with long-forgotten classmates, no nostalgic reminiscence around a rare photograph of their youth capturing some trivial, unremembered moment. It will all be documented and archived, but it will mean not a thing.

It will be Barthes’ contemporary, Andy Warhol, who will appear as one of us. In his biography of Warhol, Victor Bockris writes,

Indeed, Andy’s desire to record everything around him had become a mania. As John Perrault, the art critic, wrote in a profile of Warhol in Vogue: “His portable tape recorder, housed in a black briefcase, is his latest self-protection device. The microphone is pointed at anyone who approaches, turning the situation into a theater work. He records hours of tape every day but just files the reels away and never listens to them.”

Andy Warhol’s performance art will be our ordinary experience, and it is that last line that we should note — “… and never listens to them.”

Reconsider Plato’s infamous critique of writing. Critics charge Plato with shortsightedness because he failed to see just how much writing would in fact allow us to remember. But from a different perspective, Plato was right. The efficient and durable externalization of memory would make us personally indifferent to remembrance. As the external archive grows, our personal involvement with the memory it stores shrinks in proportion.

Give me a few precious photographs, a few minutes of grainy film and I will treasure them and hold them dear. Give me one terabyte of images and films and I will care not at all.

In the future, we will float in the present untethered from the past and propelled listlessly onward by the perpetual stream of documentary detritus we will emit.

Macro-trends in Technology and Culture

Who would choose cell phones and Twitter over toilets and running water? Well, according to Kevin Kelly, certain rural Chinese farmers. In a recent essay exploring the possibilities of a post-productive economy, Kelly told of the remote villages he visited in which locals owned cell phones but lived in houses without even the most rudimentary forms of plumbing. It is a choice, Kelly notes, deeply influenced by tradition and culture. Kelly’s point may not be quite unassailable, but it is a fair reminder that technology is a culturally mediated phenomenon.

There are, generally speaking, two schools of thought on the relationship between technology and culture. Those tending toward some variety of technological determinism would argue that technology drives culture. Those who tend toward a social constructivist view of technology would argue the opposite. Ultimately, any theory of technology must account for the strengths and weaknesses of both of these tendencies. In fact, the framing of the relationship is probably problematic anyway since there are important ways in which technology is always cultural and culture is always technological.

For the purposes of this post, I’d like to lean toward the social constructivist perspective. No technology appears in a vacuum. Its origins, evolution, adoption, deployment, and diffusion are all culturally conditioned. Moreover, the meaning of any technology is always culturally determined; it is never simply given in the form of the technology itself. Historians of technology have reminded us of this reality in numerous fascinating studies — studies of the telephone, for example, and the airplane, the electric grid, household technologies, and much else besides. When a new technology appears, it is interpreted and deployed within an already existing grid of desires, possibilities, necessities, values, symbols, expectations, and constraints. That a technology may re-order this grid in time does not negate the fact that it must first be received by it. The relationship is reciprocal.

If this is true, then it seems to me that we should situate our technologies not only within the immediate historical and social context of their genesis, but also within broader and more expansive historical trajectories. Is our use of computer technology, for example, still inflected by Baconian aspirations? What role do Cartesian dualisms play in shaping our relationship with the world through our technologies? To what degree does Christian eschatology inform technological utopianism? These seem to be important questions, the answers to which might usefully inform our understanding of the place of technology in contemporary society. Of course, these particular questions pertain especially to the West. I suspect another set of questions would apply to non-Western societies and still further questions would be raised within the context of globalization. But again, the basic premise is simply this: a given technology’s social context is not necessarily bounded by its immediate temporal horizons. We ought to be taking the long view as well.

But the rhythms of technological change (and the logic of the tech industry) would seem to discourage us from taking the long view, or at least the long view backwards in time. The pace of technological change over the last two hundred years or so has kept us busy trying to navigate the present, and its trajectory, real and ideal, casts our vision forward in the direction of imagined futures. But what if, as Faulkner quipped, the past, with regard to technology, is not dead, nor even past?

I’m wondering, for instance, about these large motive forces that have driven technological innovation in the West, such as the restoration of Edenic conditions or the quest for rational mastery over the natural world leading to the realization of Utopia. These early modern and Enlightenment motive forces directed and steered the evolution of technology in the West for centuries, and I do not doubt that they continue to exert their influence still. Yet, over the last century and a half, Western society has undergone a series of profound transformations. How have these shaped the evolution of technology? (The inverse question is certainly valid as well.) This is, I suppose, another way of asking about the consequences of post-modernity (which I distinguish from postmodernism) for the history of technology.

In a provocative and compelling post a few months ago, Nick Carr drew an analogy between the course of technological innovation and Maslow’s hierarchy of needs:

“The focus, or emphasis, of innovation moves up through five stages, propelled by shifts in the needs we seek to fulfill. In the beginning come Technologies of Survival (think fire), then Technologies of Social Organization (think cathedral), then Technologies of Prosperity (think steam engine), then technologies of leisure (think TV), and finally Technologies of the Self (think Facebook, or Prozac).”

I continue to find this insightful, and I think the angle I’m taking here dovetails with Carr’s analysis. The Technologies of Prosperity and Leisure correspond roughly to the technologies of modernity. Technologies of the Self correspond roughly to the technologies of post-modernity. Gone is our faith in les grands récits that underwrote a variety of utopian visions and steered the evolution of technology. We live in an age of diminished expectations; we long for the fulfillment of human desires writ small. Self-fulfillment is our aim.

This is, incidentally, a trajectory that is nicely illustrated by Lydia DePillis’ suggestion that the massive Consumer Electronics Show “is what a World’s Fair might look like if brands were more important than countries.” The contrast between the world’s fairs and the CES is telling. The world’s fairs, especially those that preceded the 1939 New York fair, were quite obviously animated by thoroughly modern ideologies. They were, as President McKinley put it, “timekeepers of progress,” and one might as well capitalize Progress. On the other hand, whatever we think of the Consumer Electronics Show, it is animated by quite different and more modest spirits. The City of Tomorrow was displaced by the entertainment center of tomorrow before giving way to the augmented self of tomorrow.

Why did technological innovation take this path? Was it something in the nature of technology itself? Or was it rather a consequence of larger sea changes in the character of society? Maybe a little of both, but probably more of the latter. It’s possible, of course, that this macro-perspective on the co-evolution of culture and technology can obscure important details and result in misleading generalizations, but if those risks can be mitigated, it may also unveil important trends and qualities that would be invisible to more narrowly focused analysis.

Sex Sells Tech, Still

Yesterday, The New Republic posted “Baudrillard and Babes at the Consumer Electronics Show,” Lydia DePillis’ account of her time at this year’s Consumer Electronics Show. I found the piece slightly less engaging than Matt Honan’s reflections from around this time last year on the 2012 show; reflections which, because they homed in on the fabrication of desire, spoke to something more abiding than the electronics on display. (For links to Honan’s piece, Kevin Kelly’s response, and my own two cents, see “Hole In Our Hearts.”)

But DePillis’ reporting was not without its own commentary on desire. Interestingly, she suggested that, “CES is what a World’s Fair might look like if brands were more important than countries.” This is not far from the mark, but, in fact, as early as 1939, this was already true of the world’s fairs; pavilions by Ford, GE, GM, and other large companies were the real stars of the show. But the fairs provided another interesting/depressing antecedent to the CES that DePillis could have noted.

DePillis devotes a few paragraphs to the infamous Booth Babes at CES, and I’ll quote her at length:

MUCH OF THE CES sales force is made up of company staff, who’ve flown in for the show. A good chunk of it, however, is employed for three days only. This latter group, it can safely be said, are hired more for their come-hither qualities than their knowledge of electronics. CES is only marginally less male dominated today than it was in the 1970s. Booth Babes (“brand ambassadors,” in industry parlance) have been an integral part of the show for decades. Regulars had noticed a downturn in recent years, and criticism reached a crescendo in 2012, as the head of the CEA failed to even acknowledge that there might be something mildly distasteful and backwards about using women as props—especially for an industry ostensibly forging into the future.

This year, however, the babes were out in force. At the Chinese phonemaker ZTE, they wore white fur and flesh-colored heels over silver sheath dresses; one stood by the door of a tiny room, luring showgoers in to dance around in front of a green screen that demonstrated a new video technology. At xi3’s booth, women in black catsuits and heavy eye makeup explained the virtues of their employer’s compact servers over pounding hip-hop. Some of the babes aren’t even stationed at booths, instead roaming around in guerrilla bands promoting products.

If CES’s geeky male regulars are so inured to electronics promotions that they don’t crane their necks, it’s not apparent. “I need a t-shirt like that,” leered one portly man to his companion upon spotting a troop of women, their midriff-baring t-shirts emblazoned with #me on the front and a web service’s URL on the back. “I need what’s in the t-shirt,” his friend cracked back.

Sometimes, it’s not the babes’ job to talk at all. Two statuesque women in red gowns simply stood at the entrance to LG’s show floor all day, their faces impassive. At SIGMA Photo, a model teetered on high pumps and gyrated, her long hair disheveled, as men took turns taking her picture with a giant camera. At Hyper, which sells well-designed battery chargers and external hard drives, I wondered what the four models wearing only g-strings and body paint thought about for hours on end, as people posed for pictures in front of them.

“We just zone out,” said the orange-and-blue girl as she left the booth in sweatpants at the end of the day. “We’re used to being looked at,” explained her silver-painted companion.

It’s a living. The ladies—and some men—list themselves with agencies or just on free sites like Craigslist or ModelMayhem.com, which lets prospective employers shop by waist and cup size and whether you’re willing to pose nude. Trade shows pay between $20 and $50 an hour, they told me. CES has such demand that companies fly models in from California and further afield, putting them up at downmarket places like the Excalibur. “If money’s no object, they want who they want,” said one smokey-eyed platinum blonde from Indianapolis, who’d been told to wear “business attire” for her gig with a maker of protective gadget cases. “The Supertooth booth, over there, they hired the Fantasy Strippers.”

In this respect, the CES is exactly like the world’s fairs.* Here is historian Robert Rydell describing the role played by world’s fairs in transitioning American society from an economy of production to one of consumption:

“Fundamental to this effort was an assault on remaining vestiges of values that were associated with what some historians have called a ‘culture of production.’ To hasten the dissolution of this older emphasis on restraint and inhibition, already under siege by world’s fairs at the beginning of the century and by the steady barrage of advertising that saturated the country during the 1920s, world’s fair builders injected their fantasies of progress with equally heavy doses of technological utopianism and erotic stimulation.”

We’re more familiar with the technological utopianism of the world’s fairs; the manner in which this technological utopianism was alloyed to erotic representations is less commonly noted. For example, Norman Bel Geddes, who famously designed Futurama, the fair’s most popular exhibit, also designed “Crystal Lassies,” “A Peep Show of Tomorrow.” Rydell continues:

“As if to liberate these fantasies from their Victorian moorings, exposition promoters gave increasing prominence to female striptease performances on exposition midways that, by the end of the decade, gave way to fully nude female performers in shows replete with world-of-tomorrow themes.”

Of course, this makes a great deal of sense. Chastity is to sexual desire what thrift is to economic desire. Rydell goes on:

“By suffusing the world of tomorrow with highly charged male sexual fantasies, the century-of-progress expositions not only reconfirmed the status of women as objects of desire, but represented their bodies as showcases that perfectly complemented displays of futuristic consumer durables everywhere on exhibit at the fairs.”

We know sex sells. This is a commonplace in our society. But we often think it operates by association. Pair the model with the car and somehow the attraction to the model will infuse the car. Perhaps. But some marketers appear to have understood the relationship somewhat differently. Eliminate restraint in one domain and you will eliminate it in the other as well.

In any case, sex and technology have been paired in the modern (predominantly male) imagination for quite some time. A dynamic, I might add, that was also on display, though less commented on, in the photo spread for Kevin Kelly’s much-discussed essay on our robotic future in Wired. In this respect, DePillis, like Honan, found something quite universal at play among the ephemera of the Consumer Electronics Show.

__________________________________________________

*Most of the three paragraphs following the asterisk were taken from an earlier post.

Pensées: An Imaginative Thought Experiment

Imagine a not too distant future in which there exists a café of the sort that you would expect to find in a trendy urban district where young professionals and aspiring artists gather to work, to socialize, and, of course, to be seen.

This café is different, though. There are no tables or couches; no bar stools or lounge chairs. There is, however, a series of numbered doors lining the interior walls. Above each door, a digital clock counts down from assorted and seemingly random times. Occasionally, a faint thumping can be heard, but it is indistinct and barely noticeable.

Customers enter the café and approach the service bar. They order a latte or an espresso or chai and they sign some papers. They pay while they wait for their drink, and, when it is ready, they take it along with a bracelet and a plastic card.

Drink in hand, they make their way to one of the numbered doors, swipe the card to unlock it, and walk inside. Behind them the door closes and locks automatically. Outside, the digital clock above the door begins to count down from three hours. Inside, the patron sets his coffee down on a bare table and pulls out the lone chair. The acoustic foam lining the walls makes every sound palpable: the unzipping of the laptop bag; the placing, gently, of the ultra-slim computer on the table; the first few keystrokes.

A glance at the signal strength indicator confirms what has been agreed upon: no wireless signal. So too does a tug on the door handle: locked, from the outside. And it is as quiet as promised, except for the surprisingly audible thumping of the heart.

Clever proprietors had discovered that people were now willing to pay to be kept, for a period of time, in an enforced state of un-distractedness. Years earlier, certain applications had promised something similar. They offered Freedom from distraction by preventing a device from connecting to the Internet for a predetermined period of time. But this extension of the will proved too easily circumvented. A more radical cure was needed.
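As an aside, the mechanism such applications relied on is simple enough to sketch. What follows is a minimal, hypothetical illustration in Python of a self-binding timer that cuts network access for a set interval; it assumes a Linux machine with NetworkManager’s nmcli utility on hand, and the function name and 25-minute default are my own inventions, not any actual product’s implementation:

```python
import subprocess
import time

def focus_session(minutes: float) -> None:
    """Disable networking for the given interval, then restore it.

    A hypothetical sketch: assumes NetworkManager's nmcli is available.
    """
    # Sever the connection: the digital equivalent of locking the door.
    subprocess.run(["nmcli", "networking", "off"], check=True)
    try:
        time.sleep(minutes * 60)  # the period of enforced quiet
    finally:
        # Restore connectivity even if the user aborts with Ctrl-C.
        subprocess.run(["nmcli", "networking", "on"], check=True)

if __name__ == "__main__":
    focus_session(25)  # roughly one sitting's worth of un-distractedness
```

Note that the finally block restores the connection even when the user interrupts the script, which is precisely the weakness the story turns on: an extension of the will that the will itself can revoke is hardly an extension at all.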

Having signed the appropriate legal waivers, customers at Pensées were securely locked into their cells so that they might work, without interruption, on whatever task needed their undivided attention. The bracelets monitored their vital signs in the event of a medical emergency. Barring such an emergency, the proprietors pledged to keep the door closed without exception. (Patrons were aware that cameras monitored the inside of each cell; only legal and legitimate work was to be done within, naturally.) It was not uncommon, then, for some patrons eventually to demand, by sometimes frantic gesticulations, that they be allowed to exit.

Such requests were always denied as a matter of course. It was for this denial of their misguided desire, after all, that they had paid their good money.

Those who came to Pensées, and to similar establishments, had discovered by then that their unaided will could not be trusted. They came to be productive: to finish their papers or work on their manuscripts and screenplays. Some came simply to sit and think. The more religious came to pray or to meditate.

Such acts may have been possible outside the soundproof walls of Pensées’ cells, but this was merely a theoretical possibility to most. (Of course, those who ran Pensées never suggested that, even within the walls of their cells, the possibility remained thoroughly theoretical.)

Inside their cells, the experience of patrons proceeded along a surprisingly predictable path. With eager hopefulness they set up their workspace just so and launched, almost giddy, into their work. Within minutes, sometimes seconds, they would casually laugh off the urge they suddenly felt to check their smartphones for some incoming message or alert. They had no signal, they knew, but the urge persisted. They felt silly when they took out their phones to confirm what they already knew to be the case. Then they put the phones away with a self-knowing smirk; or, rather, set them down just within peripheral vision. No harm done, since there was no signal, but, annoyingly, the glances followed.

Between glances, eyes would flit toward that place on the screen where numbers in parentheses would signal new items of various sorts requiring attention. But there were none of these either, just as had been hoped for and paid for. Still, it was increasingly frustrating to catch oneself repeatedly looking anyway.

After a few minutes of this, work would resume, but in bursts punctuated by periods of wandering thoughts, random observations, and disjointed inner monologues. Perhaps decaf would be better next time. It’s hard to focus when a muscle twitches involuntarily. The inability to voluntarily direct one’s attention was bad enough; that the body would now prove equally unruly was dispiriting.

It was not unusual for some then to reach for their smartphones, almost unconsciously, and to handle them as if they were rosary beads.* Or they might stand up and pace about the cramped but comfortable cell, not yet anxiously, only to get the blood flowing before sitting down to work with renewed focus. And so they did, for a short while, before they began to wonder whether it was not absurd to pay to be locked in a room. And how much time had gone by, they wondered? The phone, at least, was still good for that: registering the fact that hardly any time at all had passed.

Some then grew anxious as they fixated on time, which advanced glacially. Silence, for which they had been willing, just minutes before, to pay, now seemed oppressive. No work was being accomplished, and most thoughts that were thought turned out to be depressingly banal; those that were not were disconcerting.

They turned to the camera and wondered just how serious the proprietors were about refusing to allow patrons to exit their cells. Quite serious, it always turned out, despite the desperate pounding of some. Anxiety attacks did not, according to the terms agreed to, constitute a medical emergency.

Some eventually fell asleep. Some went back to their desks to eke out some semblance of work so that they would have something, at least, to show for their ordeal. Others stared blankly at the door, straining to hear the gentle tone that would signal the end of their time in the cell.

When it came, some patrons exited hurriedly and others stumbled out, bewildered. A few tried to make a good show of it, walking off with whatever air of accomplishment they could feign. Most were seen eagerly staring at their smartphones, waiting for them to come online. Surprisingly, every so often, there were some who walked away looking as if they’d learned something rather important. About what, exactly, it was never clear.

_______________________________________

*An image I owe to Ian Bogost.

Downton Abbey and Technology

Tonight American audiences rekindle their fascination with Downton Abbey, the popular ITV series set in a stately English manor during the early twentieth century. It is a series that dramatizes the decline and dissolution of a world torn apart by the violent winds of social change. In the series, the Great War, the women’s movement, socialism, and other contemporaneous developments chip away at the old order. But the opening scene of the pilot episode also strongly suggests that this older world is giving way to the forces of technological change. Consider the first two minutes:

The tapping of the telegraph, the whistle of the locomotive, and the curves of power lines all feature prominently in these opening shots. And so too does the sinking of the Titanic, a near-mythical case study in the dangers of technological hubris. Strikingly, the telegraph lines and the progress of the train are juxtaposed with idyllic country scenes. It is a filmic version of a prominent nineteenth-century literary convention.

Consider the following passage from Nathaniel Hawthorne’s journal. While enjoying the enchantments of the natural (and cultural) world around him, Hawthorne is startled:

“But, hark! there is the whistle of the locomotive — the long shriek, harsh, above all other harshness, for the space of a mile cannot mollify it into harmony. It tells a story of busy men, citizens, from the hot street, who have come to spend a day in a country village, men of business; in short of all unquietness; and no wonder that it gives such a startling shriek, since it brings the noisy world into the midst of our slumbrous peace.”

This passage is a point of departure for Leo Marx’s classic study of technology in the American literary tradition, The Machine in the Garden. Similar vignettes were a recurring feature in the literature of the nineteenth century. For Hawthorne, Emerson, and many of their contemporaries, the train whistle signaled the industrial machine’s disruption of a pastoral ideal in which human culture blended harmoniously with nature.

The opening scenes of Downton Abbey fit neatly within this genre. Of course, Hawthorne was writing about what were for him contemporary realities. We are far removed from the historical setting of Downton Abbey. For us the train whistle signals little more than a break in traffic and the telegraph is merely quaint. But I wonder whether the popularity of Downton Abbey stems, at least in part, from its evocation of the specter of disruptive technological change. It offers the anxieties of a safely distant age as a proxy for our own, and, perhaps, in doing so it also offers something like a cathartic experience for viewers.

Perhaps it merely traffics in nostalgia for an idyllic age, but I doubt it. We know from the outset that there is a snake in the garden. The train, the power lines, the Titanic — they are so many memento mori littering the scene. More likely we are like the Angel of History in Benjamin’s reading of Klee’s Angelus Novus:

“His face is turned toward the past. Where we perceive a chain of events, he sees one single catastrophe which keeps piling wreckage upon wreckage and hurls it in front of his feet. The angel would like to stay, awaken the dead, and make whole what has been smashed. But a storm is blowing from Paradise; it has got caught in his wings with such violence that the angel can no longer close them. The storm irresistibly propels him into the future to which his back is turned, while the pile of debris before him grows skyward. This storm is what we call progress.”

As the storm blows us onward we can’t help but glance back at the wreckage of the past piling up behind us … or some often romanticized, frequently commodified representation of it that may or may not bear any resemblance to historical realities.

Of course, it may just be the memes.