The Political Perils of “Big Data”

In “Every Little Byte Counts,” a recent review of two books on “advances in our ability to store, analyze and profit from vast amounts of data generated by our gadgets” (otherwise known as Big Data), Evgeny Morozov makes two observations to which I want to draw your attention. 

The first of these he makes with the help of the Italian philosopher Giorgio Agamben. Here are Morozov’s first two paragraphs: 

In “On What We Can Not Do,” a short and pungent essay published a few years ago, the Italian philosopher Giorgio Agamben outlined two ways in which power operates today. There’s the conventional type that seeks to limit our potential for self-­development by restricting material resources and banning certain behaviors. But there’s also a subtler, more insidious type, which limits not what we can do but what we can not do. What’s at stake here is not so much our ability to do things but our capacity not to make use of that very ability.

While each of us can still choose not to be on Facebook, have a credit history or build a presence online, can we really afford not to do any of those things today? It was acceptable not to have a cellphone when most people didn’t have them; today, when almost everybody does and when our phone habits can even be used to assess whether we qualify for a loan, such acts of refusal border on the impossible.

This is a profoundly important observation, and it is hardly ever made. In his brief but insightful book, Nature and Altering It, ethicist Allen Verhey articulated a similar concern. Verhey discusses a series of myths that underlie our understanding of nature (earlier he cataloged 16 uses of the idea of “nature”). While discussing one of these myths, the myth of the project of liberal society, Verhey writes,

“Finally, however, the folly of the myth of liberal society is displayed in the pretense that ‘maximizing freedom’ is always morally innocent. ‘Maximizing freedom,’ however, can ironically increase our bondage. What is introduced as a way to increase our options can become socially enforced. The point can easily be illustrated with technology. New technologies are frequently introduced as ways to increase our options, as ways to maximize our freedom, but they can become socially enforced. The automobile was introduced as an option, as an alternative to the horse, but it is now socially enforced …. The technology that surrounds our dying was introduced to give doctors and patients options in the face of disease and death, but such ‘options’ have become socially enforced; at least one sometimes still hears, ‘We have no choice!’ And the technology that may come to surround birth, including pre-natal diagnosis, for example, may come to be socially enforced. ‘What? You knew you were at risk for bearing a child with XYZ, and you did nothing about it? And now you expect help with this child?’ Now it is possible, of course, to claim that cars and CPR and pre-natal diagnosis are the path of progress, but then the argument has shifted from the celebration of options and the maximizing of freedom to something else, to the meaning of progress.”

The second point from Morozov’s review that I want to draw your attention to involves the political consequences of tools that harness the predictive power of Big Data, a power divorced from understanding:

“The predictive models Tucker celebrates are good at telling us what could happen, but they cannot tell us why. As Tucker himself acknowledges, we can learn that some people are more prone to having flat tires and, by analyzing heaps of data, we can even identify who they are — which might be enough to prevent an accident — but the exact reasons defy us.

Such aversion to understanding causality has a political cost. To apply such logic to more consequential problems — health, education, crime — could bias us into thinking that our problems stem from our own poor choices. This is not very surprising, given that the self-tracking gadget in our hands can only nudge us to change our behavior, not reform society at large. But surely many of the problems that plague our health and educational systems stem from the failures of institutions, not just individuals.”

Moreover, as Hannah Arendt put it in The Human Condition, politics is premised on the ability of human beings to “talk with and make sense to each other and to themselves.” Divorcing action from understanding jeopardizes the premise upon which democratic self-governance is founded, the possibility of deliberative judgment. Is it an exaggeration to speak of the prospective tyranny of the algorithm?

I’ll give Morozov the penultimate word:

“It may be that the first kind of power identified by Agamben is actually less pernicious, for, in barring us from doing certain things, it at least preserves, even nurtures, our capacity to resist. But as we lose our ability not to do — here Agamben is absolutely right — our capacity to resist goes away with it. Perhaps it’s easier to resist the power that bars us from using our smartphones than the one that bars us from not using them. Big Data does not a free society make, at least not without basic political judgment.”

I draw your attention to these concerns not because I have an adequate response to them, but because I am increasingly convinced that they are among the most pressing concerns we must grapple with in the years ahead.

7 thoughts on “The Political Perils of ‘Big Data’”

  1. Michael,
    What Giorgio Agamben describes is what Langdon Winner calls “the technological imperative” and what I call in my book “de facto technological autonomy.” Here’s part of what I say about it:

    “You can choose to live without a car, for example, but not without the pollution cars create. You don’t have to watch television, but the politicians who run your government will be elected by people who do. You can choose to live without a computer, but good luck finding a job. Being a committed pacifist won’t protect you if the bombs fly.

    Simply put, our entire social system runs on a vast array of technologies and would collapse within a day or two without them.”

  2. Agamben’s consideration of our human powers to embrace and renounce technology reminds me of Illich’s observation that needs are the greatest tyrant of all. As our needs are co-opted by technology (as, he wrote, thirst is transformed into a need for a Coke), it becomes more difficult to say “No” with any authority. For myself, not using Facebook, Twitter, or any social media has cost me exposure as a songwriter; apparently a songwriter cannot exist, creatively, without a website of some kind, whether free or paid. But even renouncing e-mail can be subversive in its own way.

    I don’t know if the complete renunciation of technology is the answer, since that renunciation would preclude conversations such as these, if that’s what replying to a weblog post can be considered. There’s something sectarian in that renunciation, yet something in the difficulty such postures induce reminds us that something else is possible. Neil Postman, author of Amusing Ourselves to Death, certainly didn’t suffer from his lack of access to the Internet, e-mail, and the like, nor has Wendell Berry.

    North of the Medicine Line, freedom of information requests have revealed that nine telecommunication companies released an undisclosed amount of user data to undisclosed government agencies. A telephone call to our local provider revealed that they are not sure whether my data has been released, or to whom, but that I should not worry, since any data released has been nothing more than my personal data. To call this response discouraging, not to mention disconcerting, is to put it mildly; a request to our federal privacy commissioner will follow soon.

    Doug’s response reminds me of Illich’s observation that the professionalization of care and of humane vocations has robbed us of our ability to be self-reliant, to persist in the glory of subsistence, to depend on our own abilities.

  3. Reblogged this on Life Redeux and commented:
    Can’t help but think that the advances discussed in this article would not be what they are had they not been co-opted and subverted by government and industry.

  4. Michael,

    Very good post.

    Even though each of us is the indispensable element in creating the data of our interactions with coded objects and a pervasively coded society, that data in effect does not belong to us. We can’t opt out in any meaningful way of the collection and use of our data, even though others monetize it.

    The recent decision by the European Court of Justice, establishing in concept a right to be forgotten, still relies on the data companies themselves to make the initial decision on an individual’s request to delete data the individual finds harmful. It’s unlikely such a limited approach would be broadly acceptable in the US.

    I’ve written a post on the ECJ decision here:

  5. Thank you all for the responses. I’ve been quite delinquent in my replies to comments, but I’m now working my way back through the past few weeks’ worth.

    Doug, that’s quite right, if also disconcerting.

    Joshua, the link to Illich is an important one, and one I think Doug would endorse.

    Lastly, atomicgeography, following the course of that ruling will be interesting indeed. Relatedly, I just stumbled on this lecture yesterday, and I think you’d appreciate it:

  6. About 18 months ago, I embarked on the project of reading Capital, Volume 1. I still haven’t finished, because I keep getting tugged down richly rewarding avenues of exploration. This post is one of them. Capitalism is the air we breathe, and the ideas discussed in both this post and the comments are clear evidence of that. Capitalist ideology has penetrated to the innermost recesses of our minds; it is going to be almost impossible to break free.
