Writing about “technology and the moral dimension,” tech writer and Gigaom founder Om Malik made the following observation:
“I can safely say that we in tech don’t understand the emotional aspect of our work, just as we don’t understand the moral imperative of what we do. It is not that all players are bad; it is just not part of the thinking process the way, say, ‘minimum viable product’ or ‘growth hacking’ are.”
I’m not sure how many people in the tech industry would concur with Malik’s claim, but it is a remarkably telling admission from at least one well-placed individual. Happily, Malik realizes that “it is time to add an emotional and moral dimension to products.” But what exactly does it mean to add an emotional and moral dimension to products?
Malik’s own ensuing discussion is brief and deals chiefly with using data ethically and producing clear, straightforward terms of service. This suggests that Malik is mostly encouraging tech companies to treat their customers in an ethically responsible manner. If so, it’s rather disconcerting that Malik takes this to be a discovery that he feels compelled to announce, prophetically, to his colleagues. Leaving that unfortunate indictment of the tech community aside, I want to suggest that there is no need to add a moral dimension to technology.
Years ago, Langdon Winner famously asked, “Do artifacts have politics?” In the article that bears that title, Winner went on to argue that they most certainly do. We might also ask, “Do artifacts have ethics?” I would argue that they do indeed. The question is not whether technology has a moral dimension; the question is whether we recognize it or not. In fact, technology’s moral dimension is inescapable, layered, and multi-faceted.
When we do think about technology’s moral implications, we tend to think about what we do with a given technology. We might call this the “guns don’t kill people, people kill people” approach to the ethics of technology. What matters most about a technology on this view is the use to which it is put. This is, of course, a valid consideration. A hammer may indeed be used either to build a house or to bash someone’s head in. On this view, technology is morally neutral and the only morally relevant question is this: What will I do with this tool?
But is this really the only morally relevant question one could ask? For instance, pursuing the example of the hammer, might I not also ask how having the hammer in hand encourages me to perceive the world around me? Or what feelings having a hammer in hand arouses?
Below are a few other questions that we might ask in order to get at the wide-ranging “moral dimension” of our technologies. There are, of course, many others that we could ask, but this is a start.
- What sort of person will the use of this technology make of me?
- What habits will the use of this technology instill?
- How will the use of this technology affect my experience of time?
- How will the use of this technology affect my experience of place?
- How will the use of this technology affect how I relate to other people?
- How will the use of this technology affect how I relate to the world around me?
- What practices will the use of this technology cultivate?
- What practices will the use of this technology displace?
- What will the use of this technology encourage me to notice?
- What will the use of this technology encourage me to ignore?
- What was required of other human beings so that I might be able to use this technology?
- What was required of other creatures so that I might be able to use this technology?
- What was required of the earth so that I might be able to use this technology?
- Does the use of this technology bring me joy?
- Does the use of this technology arouse anxiety?
- How does this technology empower me? At whose expense?
- What feelings does the use of this technology generate in me toward others?
- Can I imagine living without this technology? Why, or why not?
- How does this technology encourage me to allocate my time?
- Could the resources used to acquire and use this technology be better deployed?
- Does this technology automate or outsource labor or responsibilities that are morally essential?
- What desires does the use of this technology generate?
- What desires does the use of this technology dissipate?
- What possibilities for action does this technology present? Is it good that these actions are now possible?
- What possibilities for action does this technology foreclose? Is it good that these actions are no longer possible?
- How does the use of this technology shape my vision of a good life?
- What limits does the use of this technology impose upon me?
- What limits does my use of this technology impose upon others?
- What does my use of this technology require of others who would (or must) interact with me?
- What assumptions about the world does the use of this technology tacitly encourage?
- What knowledge has the use of this technology disclosed to me about myself?
- What knowledge has the use of this technology disclosed to me about others? Is it good to have this knowledge?
- What are the potential harms to myself, others, or the world that might result from my use of this technology?
- Upon what systems, technical or human, does my use of this technology depend? Are these systems just?
- Does my use of this technology encourage me to view others as a means to an end?
- Does using this technology require me to think more or less?
- What would the world be like if everyone used this technology exactly as I use it?
- What risks will my use of this technology entail for others? Have they consented?
- Can the consequences of my use of this technology be undone? Can I live with those consequences?
- Does my use of this technology make it easier to live as if I had no responsibilities toward my neighbor?
- Can I be held responsible for the actions which this technology empowers? Would I feel better if I couldn’t?
You can subscribe to The Convivial Society, my newsletter on tech, society, and the good life, here.
42 thoughts on “Do Artifacts Have Ethics?”
I could never have come up with all those questions, but they certainly should be answered. Another thought-provoking post!
Merry Christmas wishes from
I do think companies need to be considering morals and ethics, but I think your provocation is incredibly important as well. These questions are so important to be asking. I am a teacher/teacher educator, and as I become more and more involved in the edtech dialogue, the more I question every part of it. There are amazing products and technological tools out there. There are also SO many that are very clearly intended to make money by marketing products and characters. Many companies are making things “kids like”, marketed as “educational” without a solid background in learning theory. This problem is certainly not limited to tech, but I do think it’s particularly hard for parents and teachers to be thoughtful consumers of tech because of sheer market saturation and lack of solid information with which to analyze the value of the products. There are amazing open-ended technological tools for education out there…and it just so happens that many of them are not the ones being created and marketed for kids. Teachers and parents need to be engaged in this discussion, and know how to facilitate it with children. This post provides such a strong place to start!
Powerful questions! Great post.
In addition to people asking these questions themselves, or “society” asking these questions with regard to some technology, it would be great if contemplation of these kinds of ethical questions also went into the education of those creating the technology. But perhaps it does need to come from the outside first, with people ceasing to use services that go against their values.
I’ve been pleasantly surprised by the number of people involved in creating technology, mostly digital tools, who have been tweeting this post.
Good to hear!
The other thing that comes to mind in thinking about these questions is that it may well be that one cannot really answer them ahead of time. It may be that one just doesn’t know how people will really be affected by using some new tool. It may be hard to foresee how one’s sense of space and time will really be changed or what the effects on friendships may be … etc. But in retrospect, we can see the big changes, some of which may be beneficial, some of which not.
So we should figure out ways to be able to try things out but not treat them as inevitable in case people decide they don’t like the results.
This reminds me of Shannon Vallor’s module “An Introduction to Software Engineering Ethics.”
This is reading I recommend to devs all the time.
Excellent. Thanks for the link.
Excellent list of questions, Michael. See also, by way of comparison, though I’m sure you’ve come across them before, Neil Postman’s 6 questions to ask of new technologies and Jacques Ellul’s 76 (!) reasonable questions to ask of any technology.
I’d forgotten about Postman’s questions. Thanks for the reminder!
Wow! Awesome questions. I think we should have a doc filled with these questions, and answering them would bring a clearer vision for your product.
It brings to mind an ambiguous episode in our nation’s history. When the flash of the Hiroshima explosion illuminated the world, it threw into stark silhouette this very question. The imperative to produce the atomic bomb before the Germans did (the justification for the Manhattan Project, meant to forestall the thousand-year Reich) was no longer a pressure once the Allied victory in Europe was complete. One may weigh, as many have, the number of Allied lives saved by shortening the war with Japan. But the choice of how to demonstrate this awesome device was open. At least for a few more months or years, until Iran deploys its weapons on Israel to demonstrate its primacy in the region, the U.S. is still the only country to have used this weapon.
What does that say about our moral position in the world?
Truman might have done well to remember the words in Shakespeare’s Scottish Play after Macbeth returns from the fields of honour, as the U.S. did from our part in the European victory:
We will proceed no further in this business*:
He hath honour’d me of late; and I have bought
Golden opinions from all sorts of people,
Which would be worn now in their newest gloss,
Not cast aside so soon.
*(By “business” we may read the employment of the A-bomb on the innocent, for Duncan, too, was welcomed to Macbeth’s castle and received in all innocence.)
Great post! If only these questions were asked in practice.
This is an excellent practical guide for thinking morally about technology. However, I’m always trying to imagine the view of a person unconcerned with, or uninitiated in, critical approaches to technology, and I can imagine very quick, cynical, or frustrated answers to most of these questions. “No one has time for this.” “The trade-offs are obviously worth it because of the Progress we’ve made,” etc. It seems to me that if I react dismissively toward these reactions, I am guilty of the same faults. Do you have any thoughts on circumventing these types of responses?
Excellent question. In fact, I may devote a blog post to it soon.
I would never have come up with all those questions! The approach you’re taking here is very, very interesting, though. Congrats on such a good article.
A fine essay, thank you. Many of the questions in your list remind me of questions asked by the Amish community when deciding what technologies to adopt, and how. The Amish don’t categorically reject technology; they carefully consider, debate, and reconsider how specific uses of technologies help or hurt their community based on their religious values.
“The difference between Amish people and most other Americans is the deliberation that takes place before deciding whether to embrace a new technology. Many Americans assume newer technology is always better, and perhaps even inherently good.
“The Amish don’t buy that,” says Donald Kraybill, professor at Elizabethtown College and co-author of The Amish. “They’re more cautious — more suspicious — wondering is this going to be helpful or is it going to be detrimental? Is it going to bolster our life together, as a community, or is it going to somehow tear it down?”
Another good article on Amish people and technology (recently republished in Slate):
This includes a link to Jameson Wetmore’s 2007 IEEE Technology & Society article (pdf)
I agree entirely. In fact, I wrote a post to that effect a couple of years ago, The Tech-Savvy Amish: https://thefrailestthing.com/2012/08/04/the-tech-savvy-amish/
Thanks for these links!
Reblogged this on Rob Scott's Blog and commented:
I came across this blog by @FrailestThing and immediately realized the importance of the questions he has raised. They are very thought provoking and challenge your personal and societal views.
When we ask these questions, we won’t always like the answers. Just give it an honest try with your smartphone use. Still, having asked these questions, we become more aware of some of the more negative aspects of the “connected technologies” we are introducing into our own and each other’s lives at such a fast pace.
I loved your writing.
energy follows intention .. so simple
Building or creating technology is engineering products or solutions in answer to questions and/or needs. Asking why, and what other possible uses a technology could have, after the invention is too late. The examination should be of the motivation.
Incorporating into the development process a stage devoted to assessing some of the questions posed in this list could be beneficial. I believe this could serve as a reminder to creators that their technologies have the ability to make impacts beyond their initial intent.