Thinking about the social and personal consequences of technology often leads to a debate between those who believe technologies are more or less neutral and those who believe technologies exert some kind of formative influence over human actions. The former view I’ve taken to calling technological voluntarism, and the latter is typically identified as some variety of technological determinism. On the one hand, it seems obvious that those who use technology are making choices, and those choices could reasonably have been made otherwise. On the other hand, an influence is felt, at the individual and societal level, which cannot easily be discussed without assigning, at least rhetorically, some causal power to technology. If it is difficult to argue that technology wholly determines our situation, it is equally difficult to argue that technology does not at all condition our situation. The challenge is to characterize the nature of this non-determinative, yet formative influence.
Here is one approach to this discussion that I would like to commend: evolving mutual reciprocity. The emergence and adoption of technology are a function of human agency, the ability to choose how technology is to be used, but human agency is itself conditioned by our prior use of technology. This approach grants primacy neither to the tool nor to the act of choosing. Our choosing is always already conditioned by our tools, and this conditioning is always already a consequence of our choices. But in order to advance this argument it’s necessary to conceptualize the manner in which this reciprocal state of affairs is actualized. To do this I’m going to borrow from an Aristotelian account of moral formation with particular emphasis on the embodied, and thus pre-cognitive, dimension of human action.
In Aristotle’s Nicomachean Ethics, choice is the foundation of the moral life: “Again, we feel anger and fear without choice, but the excellences [or, virtues] are choices or involve choices.” Choices, however, eventually become habits, and habits dispose us to choose in certain ways and not others. Notice the reciprocity between doing and being in Aristotle’s account of courage: “by being habituated to despise things that are terrible and to stand our ground against them we become brave, and it is when we have become so that we shall be most able to stand our ground against them.” Act bravely to become brave that you may act bravely. Aristotle illustrates his point by comparing the cultivation of virtue to the cultivation of skill at playing the lyre or the work of building. Skill at living well is acquired analogously to these other practical skills — it is acquired by embodied practice. So, Aristotle explains,
by doing the acts that we do in our transactions with other men we become just or unjust, and by doing the acts that we do in the presence of danger, and being habituated to feel fear or confidence we become brave or cowardly…. Thus, in one word, states arise out of like activities. This is why the activities we exhibit must be of a certain kind; it is because the states correspond to the difference between these. It makes no small difference, then, whether we form habits of one kind or of another from our very youth; it makes a very great difference, or rather all the difference.
There are two directions in which we can apply this insight to the question of technology and human agency. First, it suggests that the choices we make with our tools are initially experienced as choices, but in time take on the force of habits. These habits, if sufficiently ingrained, then function as virtues or vices do in Aristotle’s schema, i.e., they condition subsequent choices. If we’re inattentive to the force of habituated action, we may be unable to fully account for the influence of technology, individually or socially.
The second direction follows Aristotle’s own examples and emphasizes the embodied dimension of habituated action. Lived experience consists of a circuit comprising mind, body, tools, and world. This circuit of perception and action typically runs so smoothly through these nodes that it may hardly be noticed at all. In fact, the tendency would be to lose sight of how deeply integrated into the experience of reality tools have become and how these tools mediate reality. To put it in a slightly different way, tools become the interface through which reality is accessed. (Putting it this way also illustrates how tools provide the metaphors by which reality is interpreted.) Katherine Hayles drew attention to this circuit when, discussing the significance of embodiment, she wrote,
When changes in [embodied] practices take place, they are often linked with new technologies that affect how people use their bodies and experience space and time. Formed by technology at the same time that it creates technology, embodiment mediates between technology and discourse by creating new experiential frameworks that serve as boundary markers for the creation of corresponding discursive systems.
New technologies, in other words, produce novel ways of using and experiencing bodies in the world. With our bodies we make our tools and our tools then shape how we understand and experience our bodies.
This often unnoticed circuit through which we experience the world is sometimes disrupted by some error in the code of a digital device or a breakdown of machinery. We typically take these sorts of disruptions as annoyances of varying degrees, but because tools are an unnoticed link in the circuit encompassing world, body, and mind, disruptions emanating from the tools also elicit flashes of illumination by breaking habituated patterns of thought and action. Let’s call this the Empty Milk Jug Effect. When you pick up an empty milk jug that you think is full, you’re caught off guard; you experience a palpable rupture between unconscious, embodied judgments and the feedback flowing back through the embodied instantiation of those judgments. Likewise, the malfunction of our tools may elicit similar instances of startled realization with regard to the countless pre-cognitive and habituated dispositions and assumptions that facilitate our experience. Thus Hayles again:
. . . unpredictable breaks occur that disrupt the smooth functioning of thought, action, and result, making us abruptly aware that our agency is increasingly enmeshed within complex networks extending beyond our ken and operating through codes that are, for the most part, invisible and inaccessible.
While not referencing Aristotle, Hayles also employs the language of habit. She writes, for example, of bodily practices that have sedimented
into habitual actions and movements, sinking below conscious awareness. At this level they achieve an inertia that can prove surprisingly resistant to conscious intentions to modify or change them. By their nature, habits do not occupy conscious thought; they are done more or less automatically, as if the knowledge of how to perform the actions resided in one’s fingers or physical mobility rather than in one’s mind.
An example: I recently switched to a MacBook Pro after years of using a variety of PCs. After a couple of weeks with my new Mac I had become fairly well accustomed to the interface. When I went back to my PC to access some old files, I found myself making Apple gestures on the PC trackpad that I knew, had you asked me, would not work there. But my fingers had already learned certain habits and sought to apply them. That is a seemingly insignificant illustration, but consider the implications if similar patterns of habituated expectation and action were consistently realized throughout the whole range of our technologically mediated experiences. I imagine, for example, that if we were to perform a careful case study of the embodied habits that have accumulated around our use of cell phones, we would come away with a string of other, more significant examples of our technologically conditioned habits of being in the world and with others.
If we return to the question of human agency and technology equipped with the categories of habit and embodiment, a mediating position emerges that transcends the impasse between voluntarism and determinism. Technology does not achieve its influence apart from the countless choices to use technology in this way or that, and those choices are never free of the habits acquired through earlier use of technology.
Technologies do not change the character of their age merely by their appearance; they do so through the use to which they are put by individuals whose perceptions, assumptions, and sensibilities are thereby re-ordered and re-calibrated. When this use becomes habitual, the new perceptions, assumptions, and sensibilities achieve a taken-for-granted status and become, as it were, a second nature. This technologically induced second nature then becomes the ground for the subsequent interplay between human agents and new technologies.
If we’re mindful of the manner in which technology exerts its influence, we may have a chance to address whatever we take to be the less desirable consequences of technology, at least on a personal level. We do well to remember Aristotle’s counsel: those who would transform their character by “taking refuge in theory” are like “patients who listen attentively to their doctors, but do none of the things they are ordered to do.” If character is formed, at least in part, by technological habits, then the use of technology must be calibrated by practices and counter-practices that will yield virtue. Merely thinking about how we would like to change won’t get us very far at all.