A little while ago, I cited a couple of passages from Romano Guardini’s Letters from Lake Como: Explorations in Technology and the Human Race on the theme of consciousness. Guardini was a Catholic philosopher and theologian active during the first half of the twentieth century. Although largely forgotten today, he was widely known in his day and left his mark on the thinking of several better-remembered contemporaries, including Hannah Arendt, who sat under Guardini’s teaching in her undergraduate years. Guardini’s work was also prominently cited in Pope Francis’ 2015 encyclical, Laudato Si’.
At some point in the near future I may have more to say about Guardini and his reflections on technology in the Letters, which were originally written during 1924 and 1925. Here I only want to draw your attention to parts of a 1959 address Guardini delivered to the Munich College of Technology, an address which is now included with the translation I’m reading.
It was especially interesting to read Guardini’s address to these technologists-in-training in light of the recent burst of frustration with Silicon Valley and, more broadly, with technologists and technologies that increasingly disorder our private and public experience.
At the outset, Guardini acknowledges that he does not have much to offer by way of technical know-how, so his theme would not be “the actual structure and work of machines,” but “what they mean for human existence, or more precisely, how their construction and use affect humanity as a living totality.”
In many respects, Guardini was about as friendly a critic as these technologists could’ve hoped for. What he has to say “will have the character of an existential problem, and it will thus necessarily reflect concern.” He adds that he “will have to consider primarily the negative element in the phenomenon of machines,” but he insists that his listeners are “to see here neither the pessimism that we often sense in current cultural criticism nor the resentment that comes with the end of an epoch against the new thing that is pushing out the old.”
I think Guardini was in earnest about this. The same attitude is evident in the Letters, written some thirty-five years earlier. “The concern I want to express,” he tells them, “is the positive one whether the process of technology worldwide will really achieve the great things that it can and should.”
“A healthy optimism,” he writes, “is undoubtedly part of all forceful action, but so, too, is a sense of responsibility for this action.”
Having made these preliminary comments, which, again, set about as irenic a tone as could be expected, Guardini briefly lays out a taxonomy of technology comprising tools, contrivances, and machines. He goes on to explore the human consequences of machines, focusing chiefly on the power machines grant and the ethical responsibility that power entails.
“To gain power is to experience it as it lays claim to our mind, spirit, and disposition,” Guardini claimed. “If we have power, we have to use it, and that involves conditions. We have to use it with responsibility, and that involves an ethical problem.”
“Thus dangers of the most diverse kind arise out of the power that machines give,” he elaborated. “Physically one human group subjugates another in open or concealed conflict. Mentally and spiritually the thinking and feelings of the one influence the other.”
Interestingly, Guardini noted that in order to assume ethical responsibility for our machines it must be presupposed “that we freely stand over against machines even as we use them, that we experience and treat them as something for whose operation we have to set the standards.”
“But do we do that?” Guardini wondered. “Does any such ethos exist? That remains to be seen. It is a disturbing fact that people often see the attempt to relate to machines in this way as romantic. As a rule today people find in machines and their working given realities that we cannot alter in any way.”
Near the end of the talk, Guardini cites an example from a story that had appeared in the Frankfurter Allgemeine Zeitung, which, he adds, “is certainly not against technology.”
The example is of how “machines spur us to go into areas where personal restraint would forbid us to intrude.” The article, he goes on to explain, “shows us in sharp detail what is the issue here—namely, the possibility of committing people without their even being aware of it. But that involves a basic threat to something that is essential in all human dealings—namely, trust.”
“The possibility of committing people” is an interesting phrase. It’s translated from German, and I have no idea what the underlying German word might be, nor could I make much of it if I did. I take Guardini to mean something like compromising someone without their consent or somehow gaining some advantage over them without their awareness. This seems to fit with the case Guardini goes on to describe.
As Guardini summarizes it, the article discusses the commercial availability of something like a Dick Tracy-style watch that can surreptitiously record conversations. In the article, a salesperson was asked “whether people might be bothered by them and would want them.” The response was straightforward: “Why should they bother us?” Moreover, people were already buying them, the salesman added. “We cannot prevent this,” the reporter wrote, “but we are permitted to say: ‘Shame on you, devil.'”
Guardini went on: “The reporter said, ‘Shame on you, devil,’ giving evidence of ethical judgment in the matter. But most people seem not to have such judgment. At issue here is not a romantic fear of machines but the fact that power is impinging on something that ought not to be challenged if the very essence of our humanity is to remain unthreatened.”
What struck me here was how Guardini’s concerns paralleled those being articulated today about digital media. We are at every turn committed or compromised without our full awareness or consent by the gamut of digital tools we either submit to or are otherwise subjected to. Tech companies routinely “go into areas where personal restraint would forbid us to intrude.” Trust is everywhere eroded and undermined. Those who urge responsibility and restraint are accused of being anti-technology romantics.
As he closed, Guardini acknowledged that “a kind of anxiety exists that leads to distinct distrust of active people,” but he nonetheless concluded that “we should not forget that those who take up practical tasks,” by which he meant engineers and technologists, “can indeed very easily ignore the problems. Or else they can have a belief in the power of progress, think that everything will come out right, and feel that they themselves are released from responsibility.” This, too, sounds very familiar.
I’ll give the last word to Guardini:
The fact that the machine brings a measure of freedom hitherto unknown is in the first instance a gain. The value of freedom, however, is not fixed solely by the question “Freedom from what?” but decisively by the further question “Freedom for what?”