The Political Perils of “Big Data”

In “Every Little Byte Counts,” a recent review of two books on “advances in our ability to store, analyze and profit from vast amounts of data generated by our gadgets” (otherwise known as Big Data), Evgeny Morozov makes two observations to which I want to draw your attention. 

The first of these he makes with the help of the Italian philosopher Giorgio Agamben. Here are Morozov’s first two paragraphs:

In “On What We Can Not Do,” a short and pungent essay published a few years ago, the Italian philosopher Giorgio Agamben outlined two ways in which power operates today. There’s the conventional type that seeks to limit our potential for self-development by restricting material resources and banning certain behaviors. But there’s also a subtler, more insidious type, which limits not what we can do but what we can not do. What’s at stake here is not so much our ability to do things but our capacity not to make use of that very ability.

While each of us can still choose not to be on Facebook, have a credit history or build a presence online, can we really afford not to do any of those things today? It was acceptable not to have a cellphone when most people didn’t have them; today, when almost everybody does and when our phone habits can even be used to assess whether we qualify for a loan, such acts of refusal border on the impossible.

This is a profoundly important observation, and one that is hardly ever made. In his brief but insightful book Nature and Altering It, the ethicist Allen Verhey articulates a similar concern. Verhey discusses a series of myths that underlie our understanding of nature (earlier in the book he catalogs 16 uses of the idea of “nature”). While discussing one of these myths, the myth of the project of liberal society, Verhey writes,

“Finally, however, the folly of the myth of liberal society is displayed in the pretense that ‘maximizing freedom’ is always morally innocent. ‘Maximizing freedom,’ however, can ironically increase our bondage. What is introduced as a way to increase our options can become socially enforced. The point can easily be illustrated with technology. New technologies are frequently introduced as ways to increase our options, as ways to maximize our freedom, but they can become socially enforced. The automobile was introduced as an option, as an alternative to the horse, but it is now socially enforced…. The technology that surrounds our dying was introduced to give doctors and patients options in the face of disease and death, but such ‘options’ have become socially enforced; at least one sometimes still hears, ‘We have no choice!’ And the technology that may come to surround birth, including pre-natal diagnosis, for example, may come to be socially enforced. ‘What? You knew you were at risk for bearing a child with XYZ, and you did nothing about it? And now you expect help with this child?’ Now it is possible, of course, to claim that cars and CPR and pre-natal diagnosis are the path of progress, but then the argument has shifted from the celebration of options and the maximizing of freedom to something else, to the meaning of progress.”

The second point from Morozov’s review that I want to draw your attention to involves the political consequences of tools that harness the predictive power of Big Data, a power divorced from understanding. Here Morozov is discussing Patrick Tucker, author of one of the books under review:

“The predictive models Tucker celebrates are good at telling us what could happen, but they cannot tell us why. As Tucker himself acknowledges, we can learn that some people are more prone to having flat tires and, by analyzing heaps of data, we can even identify who they are — which might be enough to prevent an accident — but the exact reasons defy us.

“Such aversion to understanding causality has a political cost. To apply such logic to more consequential problems — health, education, crime — could bias us into thinking that our problems stem from our own poor choices. This is not very surprising, given that the self-tracking gadget in our hands can only nudge us to change our behavior, not reform society at large. But surely many of the problems that plague our health and educational systems stem from the failures of institutions, not just individuals.”

Moreover, as Hannah Arendt put it in The Human Condition, politics is premised on the ability of human beings to “talk with and make sense to each other and to themselves.” Divorcing action from understanding jeopardizes the premise upon which democratic self-governance is founded: the possibility of deliberative judgment. Is it an exaggeration to speak of the prospective tyranny of the algorithm?

I’ll give Morozov the penultimate word:

“It may be that the first kind of power identified by Agamben is actually less pernicious, for, in barring us from doing certain things, it at least preserves, even nurtures, our capacity to resist. But as we lose our ability not to do — here Agamben is absolutely right — our capacity to resist goes away with it. Perhaps it’s easier to resist the power that bars us from using our smartphones than the one that bars us from not using them. Big Data does not a free society make, at least not without basic political judgment.”

I draw your attention to these concerns not because I have an adequate response to them, but because I am increasingly convinced that they are among the most pressing questions we must grapple with in the years ahead.