If you’re not paying attention to Evan Selinger’s work, you’re missing out on some of the best available commentary on the ethical implications of contemporary technology. Last week I pointed you to his recent essay, “The Outsourced Lover,” on a morally questionable app designed to automate romantic messages to your significant other. In a more recent editorial at Wired, “Today’s Apps Are Turning Us Into Sociopaths,” Selinger provides another incisive critique of an app that similarly automates aspects of interpersonal relationships.
Selinger approached his piece by interviewing the app designers in order to understand the rationale behind their product. This led into an interesting and wide-ranging discussion of technological determinism, technology’s relationship to society, and ethics.
I was particularly intrigued by how assumptions of technological inevitability were deployed. Take the following, for example:
“Embracing this inevitability, the makers of BroApp argue that ‘The pace of technological change is past the point where it’s possible for us to reject it!’”
“’If there is a niche to be filled: i.e. automated relationship helpers, then entrepreneurs will act to fill that niche. The combinatorial explosion of millions of entrepreneurs working with accessible technologies ensures this outcome. Regardless of moral ambiguity or societal push-back, if people find a technology useful, it will be developed and adopted.’”
It seems that these designers have a pretty bad case of the Borg Complex, my name for the rhetoric of technological determinism. Recourse to the language of inevitability is the defining symptom of a Borg Complex, but it is not the only one exhibited in this case.
According to Selinger, they also deploy another recurring trope: the dismissal of what are derisively called “moral panics” on the grounds that such panics amount to so many cases of Chicken Little, and the sky never falls. This is an example of another Borg Complex symptom: “Refers to historical antecedents solely to dismiss present concerns.” You can read my thoughts on that sort of reasoning here.
Do read the whole of Selinger’s essay. He’s identified an important area of concern, the increasing ease with which we may outsource ethical and emotional labor to our digital devices, and he is helping us think clearly and wisely about it.
About a year ago, Evgeny Morozov raised related concerns that prompted me to write about the inhumanity of smart technology. A touch of hyperbole, perhaps, but I do think the stakes are high. I’ll leave you with two points drawn from that older post.
“Out of the crooked timber of humanity no straight thing was ever made,” Kant observed. Corollary to keep in mind: If a straight thing is made, it will be because humanity has been stripped out of it.
The second relates to a distinction Albert Borgmann drew some time ago between troubles we accept in practice and those we accept in principle. Troubles we accept in practice are troubles we must cope with for now but should seek to eradicate; cancer, for instance. Troubles we accept in principle are those that we should not seek to abolish, even if we were able. These troubles are somehow essential to the full experience of our humanity, and they are an irreducible component of those practices that bring us deep joy and satisfaction.
That’s a very short summary of a very substantial theory. You can read more about it in that earlier post and in this one as well. I think Borgmann’s point is critical. It applies neatly to the apps Selinger has been analyzing. It also speaks to the temptations of smart technology highlighted by Morozov, who rightly noted,
“There are many contexts in which smart technologies are unambiguously useful and even lifesaving. Smart belts that monitor the balance of the elderly and smart carpets that detect falls seem to fall in this category. The problem with many smart technologies is that their designers, in the quest to root out the imperfections of the human condition, seldom stop to ask how much frustration, failure and regret is required for happiness and achievement to retain any meaning.”
From another angle, we can understand the problem as a misconstrual of the relationship between means and ends. Technology, when it becomes something more than an assortment of tools, when it becomes a way of looking at the world, technique in Jacques Ellul’s sense, fixates on means at the expense of ends. Technology is about how things get done, not what ought to get done or why. Consequently, we are tempted to misconstrue means as ends in themselves, and we are also encouraged to think of means as essentially interchangeable. We simply pursue the most efficient, effective means. Period.
But means are not always interchangeable. Some means are integrally related to the ends that they aim for. Altering the means undermines the end. The apps under consideration, and many of our digital tools more generally, proceed on the assumption that means are, in fact, interchangeable. It doesn’t matter whether you took the time to write out a message to your loved one or whether it was an automated app that only presents itself as you. So long as the end of getting your loved one a message is accomplished, the means matter not.
This logic is flawed precisely because it mistakes a means for an end and sees means as interchangeable. The real end, of course, in this case anyway, is a loving relationship, not simply getting a message that fosters the appearance of a loving relationship. And the means toward that end are not easily interchangeable. The labor, or, to use Borgmann’s phrasing, the trouble required by the fitting means cannot be outsourced or eliminated without fatally undermining the goal of a loving relationship.
That same logic plays out across countless cases where a device promises to save us or unburden us from moral and emotional troubles. It is a dehumanizing logic.