The best blog posts and essays are the ones that make you think, and that was exactly the effect of PJ Rey’s smart post on Cyborgology, “Trust and Complex Technology: The Cyborg’s Modern Bargain.” After doing some of that thinking, I offered some on-site comments. Because they are related to previous posts on this site (here, here and here), I’m going to also post them here. Before reading them, I recommend you click over to Rey’s post and read the whole thing, which will only take a few minutes.
In case you’re in a hurry, here’s Rey’s argument in brief, as I understand it: We live in a world of increasingly complex technologies, the workings of which the average person is, for the most part, unable to understand. Because we nonetheless use these technologies, often without so much as a second thought, we are trustingly committed to these technologies and the expert systems that generate them. This trust, which presupposes a lack of self-sufficiency, is a fundamentally social dynamic.
In light of that argument, here are my comments, edited somewhat so as to make better sense as a stand-alone statement:
I’m not sure how well this describes the actual existential experience of modern technology. Mostly, I wonder if there is not a distinction between implicit and explicit trust when we talk about the sociability of technology use. In other words, I’m not sure if the users of complex technologies such as the iPhone or social media platforms are consciously deploying what we might meaningfully call “trust” in their use of these technologies and platforms.
We could argue that their use implies a kind of de facto trust in the equipment, but does this amount to the kind of trust that operates in truly social relationships? Trusting systems and trusting persons seem to me to be two different phenomena. And while it’s possible to argue that people set up and run systems so that finally we’re trusting in them, I would suggest that the system, if it is running well, practically obscures the human element. So if we’ve outsourced our trust, as it were, to expert systems, is it still trust that we are talking about, or habituated responses to (mostly) predictable systems?
Historians of technology (and Arthur C. Clarke) have pointed to an affinity between magic and technology that can be traced back to the early modern era. I would suggest that our existential experience of using an iPad, for example, more readily approximates our experience of magic than it does trust among social relations. As I write this, it occurs to me that perhaps the analysis assumes the other half of the atomized (political) individual that is being criticized: the rational (economic) actor whose decision making is a fully conscious, calculated affair. In other words, I suspect most people are not consciously rationalizing their use of technologies by invoking trust in the persons who stand behind the technologies; rather, they use the technology as if it were a magical tool that “just worked” — Apple products, I think, are paradigmatic here. And, if the trust in persons is not explicit, can it meaningfully be called social?
One last thought. Philosopher Albert Borgmann’s “device paradigm” comes to mind here. In brief, he argues that our tools increasingly offer ease of use, effectiveness, and efficiency, but at the expense of becoming increasingly opaque to the average user — much the same dynamic that you describe. This opacity, in his view, has the effect of alienating us from the technology, and to some degree from the experiences mediated by those technologies. In part, I suspect, this is precisely because we are made to use that which we don’t even remotely understand, and this is not really trust, but a kind of gamble. Trust, in social relations, is not quite blind faith; it is risky, but not, in most cases, a gamble. Coming back to the airplane example: for many people, statistics notwithstanding, boarding an airplane feels like a gamble and involves very little trust at all.
To wrap up, I’m suggesting that we need to phenomenologically parse out “trust”: reasoned, explicit trust; reasonable, implicit trust; unreasonable, implicit trust; unreasonable, explicit trust (blind faith); explicitly untrusting acquiescence; gambles; etc. Our use of technology relies on many of these, but not all of them amount to the kind of trust that might render us meaningfully social.
There you have it; comments, as always, welcome.
Update: The conversation continues at Cyborgology. Click through to read Rey’s response, etc.