The World Will Be Our Skinner Box

Writing about two recent patents by Google, Sidney Fussell makes the following observations about the possibility of using “smart” environments to create elaborate architectures for social engineering:

For reward systems created by either users or companies to be possible, the devices would have to know what you’re doing at all times. The language of these patents make it clear that Google is acutely aware of the powers of inference it has already, even without cameras, by augmenting speakers to recognize the noises you make as you move around the house. The auditory inferences are startling: Google’s smart home system can infer “if a household member is working” from “an audio signature of keyboard clicking, a desk chair moving, and/or papers shuffling.” Google can make inferences on your mood based on whether it hears raised voices or crying, when you’re in the kitchen based on the sound of the fridge door opening, your dental hygiene based on “the sounds and/or images of teeth brushing.”

I read Fussell’s article right after I read this tweet from Kevin Bankston: “Ford’s CEO just said on NPR that the future of profitability for the company is all the data from its 100 million vehicles (and the people in them) they’ll be able to monetize. Capitalism & surveillance capitalism are becoming increasingly indistinguishable (and frightening).”

I thought about that tweet for a while. It really is the case that data is to the digital economy something like what oil has been to the industrial economy. Jathan Sadowski notes as much in a comment on the same tweet: “Good reminder that as the ‘operations of capital’ adapt to the digital age, they maintain the same essential features of extraction, exploitation, and accumulation. Rather than a disruption, surveillance capitalism is more like old wine poured into new ‘smart’ bottles.”

It also recalls Evgeny Morozov’s discussions of data mining or data extractivism, and Nick Carr’s suggestion that we think instead in terms of a factory metaphor: “The factory metaphor makes clear what the mining metaphor obscures: ‘We work for the Facebooks and Googles of the world, and the work we do is increasingly indistinguishable from the lives we lead.’”

Taking all of this together, I found myself asking what it might look like if we try to extrapolate into the future (a risky venture, I concede). There are two interrelated trends here, as far as I can see: one toward surveillance and one toward conditioning. The trends, in other words, lead toward a world turned into a Skinner box.

I think it was from Carr that I first encountered the Skinner box analogy applied to the internet a few years ago. Now that digital connectedness has extended beyond the information on our screens to the material environment, the Skinner box, too, has grown.

As Rick Searle points out in his discussion of B. F. Skinner’s utopian novel, Walden Two,

We are the inheritors of a darker version of Skinner’s freedomless world—though by far not the darkest. Yet even should we get the beneficent paternalism of contemporary Skinnerites—such as Richard Thaler and Cass R. Sunstein, who wish to “nudge” us this-way-and-that—it would harm freedom not so much by proving that our decisions are indeed in large measure determined by our environment as by the fact that the shape of that environment would be in the hands of someone other than ourselves, individually and collectively.

This, of course, is also the theme of Frischmann and Selinger’s Reengineering Humanity. It recalls, as well, a line from Hannah Arendt: “The trouble with modern theories of behaviorism is not that they are wrong but that they could become true.”

But there is something else Arendt wrote that I’ve kept coming back to whenever I’ve thought about these trajectories toward ever more sophisticated mechanisms for the extraction of data to fuel what Frischmann and Selinger have called the project of engineered determinism. “There is only one thing,” Arendt claimed, “that seems discernible: we may say that radical evil has emerged in connection with a system in which all men have become equally superfluous.” What we must remember about “our” data for the purposes of social engineering is that the least important thing about it is that it is “ours.” It matters chiefly as an infinitesimally small drop in a vast ocean of data, which fuels the tools of prediction and conditioning. You and I, in our particularity, are irrelevant, superfluous.

Arendt described the model citizen of a totalitarian state in this way: “Pavlov’s dog, the human specimen reduced to the most elementary reactions, the bundle of reactions that can always be liquidated and replaced by other bundles of reactions that behave in exactly the same way.”

Ominously, she warned, “Totalitarian solutions may well survive the fall of totalitarian regimes in the form of strong temptations which will come up whenever it seems impossible to alleviate political, social, or economic misery in a manner worthy of man.”

In Arendt’s time, totalitarianism emerged in what we might think of as Orwellian guise. The threat we face, however, will come in Huxleyan guise: we will half-knowingly embrace the regime that promises us a reasonable measure of stability, control, and predictability, what we mistake for the conditions of happiness. Paradoxically, the technologies we increasingly turn to in order to manage the chaotic flux of life are also the technologies that have generated the flux we seek to tame.


If you’ve appreciated what you’ve read, consider supporting the writer: Patreon/Paypal.
