Troubles We Must Not Refuse

If you’re not paying attention to Evan Selinger’s work, you’re missing out on some of the best available commentary on the ethical implications of contemporary technology. Last week I pointed you to his recent essay, “The Outsourced Lover,” on a morally questionable app designed to automate romantic messages to your significant other. In a more recent editorial at Wired, “Today’s Apps Are Turning Us Into Sociopaths,” Selinger provides another incisive critique of an app that similarly automates aspects of interpersonal relationships.

Selinger approached his piece by interviewing the app's designers in order to understand the rationale behind their product. The interviews lead into an interesting and wide-ranging discussion of technological determinism, technology's relationship to society, and ethics.

I was particularly intrigued by how assumptions of technological inevitability were deployed. Take the following, for example:

“Embracing this inevitability, the makers of BroApp argue that ‘The pace of technological change is past the point where it’s possible for us to reject it!’”

And:

“’If there is a niche to be filled: i.e. automated relationship helpers, then entrepreneurs will act to fill that niche. The combinatorial explosion of millions of entrepreneurs working with accessible technologies ensures this outcome. Regardless of moral ambiguity or societal push-back, if people find a technology useful, it will be developed and adopted.’”

It seems that these designers have a pretty bad case of the Borg Complex, my name for the rhetoric of technological determinism. Recourse to the language of inevitability is the defining symptom of a Borg Complex, but it is not the only one exhibited in this case.

According to Selinger, they also deploy another recurring trope: dismissing so-called “moral panics” on the grounds that they amount to so many cases of Chicken Little, and the sky never falls. This is another Borg Complex symptom: “Refers to historical antecedents solely to dismiss present concerns.” You can read my thoughts on that sort of reasoning here.

Do read the whole of Selinger’s essay. He’s identified an important area of concern, the increasing ease with which we may outsource ethical and emotional labor to our digital devices, and he is helping us think clearly and wisely about it.

About a year ago, Evgeny Morozov raised related concerns that prompted me to write about the inhumanity of smart technology. A touch of hyperbole, perhaps, but I do think the stakes are high. I’ll leave you with two points drawn from that older post.

The first:

“Out of the crooked timber of humanity no straight thing was ever made,” Kant observed. Corollary to keep in mind: If a straight thing is made, it will be because humanity has been stripped out of it.

The second relates to a distinction Albert Borgmann drew some time ago between troubles we accept in practice and those we accept in principle. Those we accept in practice are troubles we must cope with for now but should seek to eradicate; take cancer, for instance. Troubles we accept in principle are those that we should not seek to abolish, even if we were able. These troubles are somehow essential to the full experience of our humanity, and they are an irreducible component of those practices which bring us deep joy and satisfaction.

That’s a very short summary of a very substantial theory. You can read more about it in that earlier post and in this one as well. I think Borgmann’s point is critical. It applies neatly to the apps Selinger has been analyzing. It also speaks to the temptations of smart technology highlighted by Morozov, who rightly noted,

“There are many contexts in which smart technologies are unambiguously useful and even lifesaving. Smart belts that monitor the balance of the elderly and smart carpets that detect falls seem to fall in this category. The problem with many smart technologies is that their designers, in the quest to root out the imperfections of the human condition, seldom stop to ask how much frustration, failure and regret is required for happiness and achievement to retain any meaning.”

From another angle, we can understand the problem as a misconstrual of the relationship between means and ends. Technology, when it becomes something more than an assortment of tools, when it becomes a way of looking at the world, technique in Jacques Ellul’s sense, fixates on means at the expense of ends. Technology is about how things get done, not what ought to get done or why. Consequently, we are tempted to misconstrue means as ends in themselves, and we are also encouraged to think of means as essentially interchangeable. We simply pursue the most efficient, effective means. Period.

But means are not always interchangeable. Some means are integrally related to the ends that they aim for. Altering the means undermines the end. The apps under consideration, and many of our digital tools more generally, proceed on the assumption that means are, in fact, interchangeable. It doesn’t matter whether you took the time to write out a message to your loved one or whether it was an automated app that only presents itself as you. So long as the end of getting your loved one a message is accomplished, the means matter not.

This logic is flawed precisely because it mistakes a means for an end and sees means as interchangeable. The real end, of course, in this case anyway, is a loving relationship, not simply getting a message that fosters the appearance of a loving relationship. And the means toward that end are not easily interchangeable. The labor, or, to use Borgmann’s phrasing, the trouble required by the fitting means cannot be outsourced or eliminated without fatally undermining the goal of a loving relationship.

That same logic plays out across countless cases where a device promises to save us or unburden us from moral and emotional troubles. It is a dehumanizing logic.

When the Problem is Technology, the Answer is More Technology

In 1954, Martin Heidegger published “The Question Concerning Technology” in which he observed,

“[T]he instrumental conception of technology conditions every attempt to bring man into the right relation to technology. Everything depends on our manipulating technology in the proper manner as a means. We will, as we say, ‘get’ technology ‘spiritually in hand’. We will master it. The will to mastery becomes all the more urgent the more technology threatens to slip from our control.”

Not sure what Heidegger meant? Josh Cohen provides a contemporary illustration:

“Wang is the co-founder of Lumo BodyTech, a company that produces pioneering devices designed to enhance a user’s posture. Their lead product is the LUMOBack Posture Sensor, which triggers warning vibrations the moment you slouch. Given that poor posture is a key symptom of compulsive absorption in our laptops and phones, this product is not merely a physical corrective, says Wang, but the harbinger of a new ‘mindfulness’, a means of awakening the self from its high-tech slumber.

“So, our mortally anxious distraction by tracking devices is to be finally arrested by…a tracking device. You can only be struck at this point by Wang’s genial indifference to what he’s actually saying. Self-tracking, he declares at a conference promoting the practice, corrodes social and emotional ties, engenders helpless dependence on technology and endangers physical health. But thank goodness I’ve patented a new self-tracking device, he concludes, impervious to the irony between his catastrophic diagnosis of collective technological alienation and his proposed remedy of a posture sensor.”


Utopias of Communication

“The more any medium triumphed over distance, time, and embodied presence, the more exciting it was, and the more it seemed to tread the path of the future … And as always, new media were thought to hail the dawning of complete cross-cultural understanding, since contact with other cultures would reveal people like those at home. Only physical barriers between cultures were acknowledged. When these were overcome, appreciation and friendliness would reign.”

That is Carolyn Marvin discussing nineteenth-century assumptions about telegraphic and telephonic communication, not the similarly utopian assumptions made about the Internet. Then as now, the reality fell short of the ideal.

“Assumptions like this required their authors to position themselves at the moral center of the universe, and they did. They were convinced that it belonged to them on the strength of their technological achievements.”

More:

“The capacity to reach out to the Other seemed rarely to involve any obligation to behave as a guest in the Other’s domain, to learn or appreciate the Other’s customs, to speak his language, to share his victories and disappointments, or to change as a result of any encounter with him.”

Finally, the original electronic filter bubble:

“Predictably, the experience of contact between distant cultures met few expectations of mutual recognition. For Thomas Stevens, a British telegraph operator in Persia responsible at the most personal level for bringing the kinship of humanity closer to fruition, the telegraph was not a device to facilitate contact with a remarkably different and fascinating culture, but an intellectual and spiritual restorative in a cultural as well as physical desert. ‘How companionable it was, that bit of civilization in a barbarous country, only those who have been similarly placed know.’ … The telegraph represented ‘a narrow streak of modern civilization through all that part of Asia.’ Europeans as far apart as two thousand miles, who had never seen one another, were well acquainted.”

Quotations drawn from Marvin’s When Old Technologies Were New: Thinking About Electric Communication in the Late Nineteenth Century (1990).

Choice and the Machine

“Choice manifests itself in society in small increments and moment-to-moment decisions as well as in loud dramatic struggles; and he who does not see choice in the development of the machine merely betrays his incapacity to observe cumulative effects until they are bunched together so closely that they seem completely external and impersonal. No matter how completely technics relies upon the objective procedures of the sciences, it does not form an independent system, like the universe: it exists as an element in human culture and it promises well or ill as the social groups that exploit it promise well or ill. The machine itself makes no demands and holds out no promises: it is the human spirit that makes demands and keeps promises. In order to reconquer the machine and subdue it to human purposes, one must first understand it and assimilate it. So far, we have embraced the machine without fully understanding it, or, like the weaker romantics, we have rejected the machine without first seeing how much of it we could intelligently assimilate.

The machine itself, however, is a product of human ingenuity and effort: hence to understand the machine is not merely a first step toward re-orienting our civilization: it is also a means toward understanding society and toward knowing ourselves.”

Lewis Mumford in the introduction to Technics and Civilization (1934).

Love’s Labor Outsourced

On Valentine’s Day, The Atlantic’s tech site ran a characteristically thoughtful piece by Evan Selinger examining a new app called Romantimatic. From the app’s website:

“Even with the amazing technology we have in our pockets, we can fly through the day without remembering to send a simple ‘I love you’ to the most important person in our lives.

Romantimatic can help.”

It can help by automatically reminding you to contact the one you love and providing some helpful pre-set messages to save you the trouble of actually coming up with something to say.

Selinger has his reservations about this sort of “outsourced sentiment,” and he irenically considers the case Romantimatic’s creator makes for his app while exploring the difference between the legitimate use of “social training wheels” and the outsourcing of moral and emotional responsibility. I encourage you to read the whole thing.

“What’s really weird,” Selinger concludes, “is that Romantimatic style romance may be a small sign of more ambitious digital outsourcing to come.”

That is exactly right. Increasingly, we are able to outsource what we might think of as ethical and emotional labor to our devices and apps. But should we? I’m sure there are many for whom the answer is a resounding Yes. Why not? To be human is to make use of technological enhancements. Much of our emotional life is already technologically mediated anyway. And so on.

Others, however, might instinctively sense that the answer, at least sometimes, is No. But why exactly? Formulating a cogent and compelling response to that question might take a little work. Here, at least, is a start.

The problem, I think, involves a conflation of intellectual labor with ethical/emotional labor. For better and for worse, we’ve gotten used to the idea of outsourcing intellectual labor to our devices. Take memory, for instance. We’ve long since ceased memorizing phone numbers. Why bother when our phones can store those numbers for us? On a rather narrow and instrumental view of intellectual labor, I can see why few would take issue with it. As long as we find the solution or solve the problem, it seems not to matter how the labor is allocated between minds and machines. To borrow an old distinction, the labor itself seems accidental rather than essential to the goods sought by intellectual labor.

When it comes to our emotional and ethical lives, however, that seems not to be the case. When we think of ethical and emotional labor, it’s harder to separate the labor itself from the good that is sought or the end that is pursued.

For example, someone who pays another person to perform acts of charity on their behalf has undermined part of what might make such acts virtuous. An objective outcome may have been achieved, but at the expense of the subjective experience that would constitute the action as ethically virtuous. In fact, subjective experience, generally speaking, is what we seem to be increasingly tempted to outsource. When it comes to our ethical and emotional lives, however, the labor is essential rather than accidental; it cannot be outsourced without undermining the whole project. The value is in the labor, and so is our humanity.

____________________________________________

Further Reading

Selinger has been covering this field for a while; here is a related essay.

I touched on some of these issues here.