Fit the Tool to the Person, Not the Person to the Tool

I recently had a conversation with a student about the ethical quandaries raised by the advent of self-driving cars. Hypothetically, for instance, how would a self-driving car react to a pedestrian who stepped out in front of it? Whose safety would it be programmed to privilege?

The relatively tech-savvy student was unfazed. Obviously this would only be a problem until pedestrians were forced out of the picture. He took it for granted that the recalcitrant human element would be eliminated as a matter of course in order to perfect the technological system. I don’t think he took this to be a “good” solution, but he intuited the sad truth that we are more likely to bend the person to fit the technological system than to design the system to fit the person.

Not too long ago, I made a similar observation:

… any system that encourages machine-like behavior from its human components is a system poised to eventually eliminate the human element altogether. To give it another turn, we might frame it as a paradox of complexity. As human beings create powerful and complex technologies, they must design complex systemic environments to ensure their safe operation. These environments sustain further complexity by disciplining human actors to abide by the necessary parameters. Complexity is achieved by reducing human action to the patterns of the system; consequently, there comes a point when further complexity can only be achieved by discarding the human element altogether. When we design systems that work best the more machine-like we become, we shouldn’t be surprised when the machines ultimately render us superfluous.

A few days ago, Elon Musk put it all very plainly:

“Tesla co-founder and CEO Elon Musk believes that cars you can control will eventually be outlawed in favor of ones that are controlled by robots. The simple explanation: Musk believes computers will do a much better job than us to the point where, statistically, humans would be a liability on roadways [….] Musk said that the obvious move is to outlaw driving cars. ‘It’s too dangerous,’ Musk said. ‘You can’t have a person driving a two-ton death machine.'”

Mind you, such a development, were it to transpire, would be quite a boon for the owner of a company working on self-driving cars. And we should also bear in mind Dale Carrico’s admonition “to consider what these nonsense predictions symptomize in the way of present fears and desires and to consider what present constituencies stand to benefit from the threats and promises these predictions imply.”

If autonomous cars become the norm and transportation systems are designed to accommodate their needs, it will not have happened because of some force inherent in the technology itself; it will have happened because interested parties made it happen, with varying degrees of acquiescence from the general public.

This was precisely the case with the emergence of the modern highway system that we take for granted. Its development was not a foregone conclusion. It was heavily promoted by government and industry. As Walter Lippmann observed during the 1939 World’s Fair, “General Motors has spent a small fortune to convince the American public that if it wishes to enjoy the full benefit of private enterprise in motor manufacturing, it will have to rebuild its cities and its highways by public enterprise.”

Consider as well the film below, produced by Dow Chemical in support of the Federal-Aid Highway Act of 1956:

Whatever you think about the virtues or vices of the highway system and a transportation system premised on the primacy of the automobile, my point is that such a system did not emerge in a cultural or political vacuum. Choices were made; political will was exerted; money was spent. So it is now, and so it will be tomorrow.

When the Problem is Technology, the Answer is More Technology

In 1954, Martin Heidegger published “The Question Concerning Technology” in which he observed,

“[T]he instrumental conception of technology conditions every attempt to bring man into the right relation to technology. Everything depends on our manipulating technology in the proper manner as a means. We will, as we say, ‘get’ technology ‘spiritually in hand’. We will master it. The will to mastery becomes all the more urgent the more technology threatens to slip from our control.”

Not sure what Heidegger meant? Josh Cohen provides a contemporary illustration:

“Wang is the co-founder of Lumo BodyTech, a company that produces pioneering devices designed to enhance a user’s posture. Their lead product is the LUMOBack Posture Sensor, which triggers warning vibrations the moment you slouch. Given that poor posture is a key symptom of compulsive absorption in our laptops and phones, this product is not merely a physical corrective, says Wang, but the harbinger of a new ‘mindfulness’, a means of awakening the self from its high-tech slumber.

“So, our mortally anxious distraction by tracking devices is to be finally arrested by…a tracking device. You can only be struck at this point by Wang’s genial indifference to what he’s actually saying. Self-tracking, he declares at a conference promoting the practice, corrodes social and emotional ties, engenders helpless dependence on technology and endangers physical health. But thank goodness I’ve patented a new self-tracking device, he concludes, impervious to the irony between his catastrophic diagnosis of collective technological alienation and his proposed remedy of a posture sensor.”


To See, or To Be Seen

When we think about the consequences of a new technology, we are prone to ask about what can be done with it. We think, in other words, of the technology as a tool which is put to this use or that. We then ask whether that use is good or bad, or possibly indifferent.

So take, for example, a relatively new product like Twitter’s Vine. Vine is an app that is to video what Twitter is to text. It invites you to record and post videos, but these videos can be no more than six seconds in length. If you’re unfamiliar with Vine, you can watch a seemingly random selection of new videos at Vinepeek.

When you first hear of Vine, and you think about evaluating it (because you happen to be in a critical frame of mind), what sorts of questions do you ask? I suspect the first question that will typically come to mind is this: What will people post with this new tool? Will the videos be touching, beautiful, surprising, revealing? Will they be trashy, abusive, pornographic, violent? Or, will they be inane, predictable, mind-numbing?

Of course, videos posted to Vine will likely be all of these things in some necessarily depressing proportion. But this is only the first and most obvious question one could ask.

Here is another possible question: How does the use of Vine shape the way one perceives experience?

There are other questions, of course, but it is this question of perception I find to be really fascinating. The use of technology leads to consequential actions out there, in the world. But the use of technology also carries important consequences in here, within me. The question of perception is especially important for two reasons. First, and most obviously, our perception is the ground of pretty much everything else we do. How we “see” things leads to certain kinds of thoughts and feelings and actions. Secondly, that by which we perceive tends to fade from view; we don’t, to take the most obvious example, see the eyes through which we see everything else.

This means that one of the most important consequences of a new technology might also be the consequence we are least likely to become aware of, and this only heightens its influence.

So how does the use of Vine shape perception? Like most documentary technologies, Vine will likely encourage users to “see” potential Vines in their experience, just as a camera encourages users to “see” potential photographs. But what do we make of this new frame by which we are prompted to perceive? That depends, I think, on the degree to which users become self-aware of the medium, the possibilities it creates, and the constraints it imposes.

Reality is always out there; certain aspects are apparent to us, certain aspects are concealed. New technologies may reconfigure what is revealed and what is concealed to us. Slow-motion film, for instance, does not create a new reality; it alters perception and thereby reveals previously concealed dimensions of reality. (I think this is the sort of thing Walter Benjamin had in mind when he discussed the “optical unconscious.”)

Technologies of perception — and really all technologies impact perception — reveal and conceal. No single technology can reveal the whole of reality. If it reveals some new dimension of reality, it is because it simultaneously conceals some other dimension. A user who is conscious of how a new technology can be used to perceive experience creatively might use a medium such as Vine to make others imaginatively aware of some previously unnoticed aspect of reality.

To those who care about such things, I think that Martin Heidegger’s influence is hiding between the lines of this post. The German philosopher had a great deal to say about technology, how it affects our perception, and how it becomes a part of us. He used the phrase “standing-reserve” to describe how modern technology encourages us to reduce material reality to a fund of resources just there on stand-by, “ready-to-hand,” that is, ready to be put to use by us for our purposes. The intrinsic properties of what is rendered merely standing-reserve are obscured or lost altogether. We see, we perceive such things only as they are useful to us. We don’t see such things as they are; and “such things” are sometimes not “things” at all, but persons.

With a technology like Vine, the question may be whether it is used with a view to the world as “standing-reserve,” there merely to be exploited for our own uses (which often amount to making ourselves seen), or whether it is used as a means of revealing the world, of allowing some previously muted aspect of reality to be seen.

Of course, this question applies to much more than Vine. It is a question to ask of all technology.