Hole In Our Hearts

Writing for Gizmodo, Mat Honan describes his experience at the massive Consumer Electronics Show in Las Vegas, and it reads like a passage from Augustine’s Confessions had Augustine been writing in the 21st rather than the 5th century.

The quasi-religious overtones begin early on when Honan tells us, “There was ennui upon ennui upon ennui set in this amazing temple to technology.”

Then, a little further on, Honan writes:

“There is a hole in my heart dug deep by advertising and envy and a desire to see a thing that is new and different and beautiful. A place within me that is empty, and that I want to fill it up. The hole makes me think electronics can help. And of course, they can.

They make the world easier and more enjoyable. They boost productivity and provide entertainment and information and sometimes even status. At least for a while. At least until they are obsolete. At least until they are garbage.

Electronics are our talismans that ward off the spiritual vacuum of modernity; gilt in Gorilla Glass and cadmium. And in them we find entertainment in lieu of happiness, and exchanges in lieu of actual connections.

And, oh, I am guilty. I am guilty. I am guilty.

I feel that way too. More than most, probably. I’m forever wanting something new. Something I’ve never seen before, that no one else has. Something that will be both an extension and expression of my person. Something that will take me away from the world I actually live in and let me immerse myself in another. Something that will let me see more details, take better pictures, do more at once, work smarter, run faster, live longer.”

Here is the confession, the thrice-repeated mea culpa, alongside a truly Augustinian account of our disordered attachments and loves, complete with a Pascalian nod to the diversionary nature of our engagement with technology.

I call this an Augustinian account not only because of the religiously inflected language and the confessional tone. There is also the explicit frame of an unfulfilling quest to fill a primordial emptiness. Augustine’s Confessions amounts to a retrospective narrative of the spiritual quest that takes him from dissatisfaction to dissatisfaction until it culminates in his conversion. He famously frames his narrative at the outset when he prays, “You have made us for yourself, and our hearts are restless till they find their rest in you.” The restless heart knows its own emptiness and seeks, often heroically and tragically, to fill it. It loves and seeks to be loved, but it loves all the wrong things.

Pascal, writing in the shadow of Augustine’s influence, put it thus:

“What else does this craving, and this helplessness, proclaim but that there was once in man a true happiness, of which all that now remains is the empty print and trace? This he tries in vain to fill with everything around him, seeking in things that are not there the help he cannot find in those that are, though none can help, since this infinite abyss can be filled only with an infinite and immutable object; in other words by God himself.”

In a post titled “Making Holes In Our Hearts,” Kevin Kelly agrees up to a point with Honan’s diagnosis, but his interpretation is quite different and also worth quoting at length. Here is Kelly:

If we are honest, we must admit that one aspect of the technium is to make holes in our heart. One day recently we decided that we cannot live another day unless we have a smart phone, when a dozen years earlier this need would have dumbfounded us. Now we get angry if the network is slow, but before, when we were innocent, we had no thoughts of the network at all. Now we crave the instant connection of friends, whereas before we were content with weekly, or daily, connections. But we keep inventing new things that make new desires, new longings, new wants, new holes that must be filled.

Yes, this is what technology does to us. Some people are furious that our hearts are pierced this way by the things we make. They see this ever-neediness as a debasement, a lowering of human nobility, the source of our continuous discontentment. I agree that it is the source. New technology forces us to be always chasing the new, which is always disappearing under the next new, a salvation always receding from our grasp.

But I celebrate the never-ending discontentment that the technium brings. Most of what we like about being human is invented. We are different from our animal ancestors in that we are not content to merely survive, but have been incredibly busy making up new itches which we have to scratch, digging extra holes that we have to fill, creating new desires we’ve never had before.

Kelly is on to something here. Discontentment is generative. Dissatisfaction can be productive. When Cain, having murdered his brother, is cursed to be forever a wanderer alienated from God and family, he builds a city in response. Here is an allegory to match Kelly’s observation. The perpetually wandering, alienated heart builds and makes and creates.

But does it follow that we should then celebrate discontentment, dissatisfaction, and unhappiness? I don’t see how. It is hard to cheer on misery, and it is a certain misery that we are talking about here. Perhaps the more appropriate response is the kind of plaintive admiration we reserve for tragic heroes. They may possess a real nobility, but it is finally consumed in despair and destruction.

The narrator of Cain’s story tells us that he built his city in the land called Nod, a name that echoes the Hebrew word for “wandering.” This touch of literary artistry poignantly suggests that even surrounded by the accouterments of civilization the human soul wanders lost and alienated – never satisfied, never home, never secure.

There is at least one other reason why we need not celebrate generative misery. Misery is not the only fount of human creativity. Love, wonderment, compassion, kindness, curiosity, beauty — all of these might also set us to work and marvelously so.

Augustine understood that finding rest for his restless heart in the love of God did not necessarily extinguish all other loves. It merely reordered them. Having found the kind of satisfaction and happiness that our stuff (for lack of a more inclusive word) can never bring does not mean that we can never again create or enjoy the fruits of human creativity. In fact, it likely means that we may create and enjoy more fully because such creation and enjoyment will not be burdened with the unbearable weight of filling the primordial vacuum of the human heart.

The simplest, and indeed the only, way to enjoy penultimate and impermanent things is to resist the temptation to invest them with the significance and adoration that only ultimate and permanent things can sustain.

Saint Augustine by Philippe de Champaigne, c. 1645

Another Controversial “We Can, But Ought We?” Issue

A few days ago I posted links to two pieces that raised the question of whether we ought to do what new technologies may soon allow us to do. Both of those pieces, interestingly, revolved around matters of time. One explored the possibility of erasing memories and the other discussed the possibility of harnessing the predictive power of processing massive amounts of data. The former touched on our relationship to the past, the latter on our relationship to the future.

A recent article in Canada’s The Globe and Mail tackles the thorny questions surrounding genetic screening and selection. Here’s the setup:

Just as Paracelsus wrote that his recipe worked best if done in secret, modern science is quietly handing humanity something the quirky Renaissance scholar could only imagine: the capacity to harness our own evolution. We now have the potential to banish the genes that kill us, that make us susceptible to cancer, heart disease, depression, addictions and obesity, and to select those that may make us healthier, stronger, more intelligent.

The question is, should we?

That is the question. As you might imagine, there are vocal advocates and critics. I’ve steered clear of directly addressing issues related to reproductive technologies because I am not trained as a bioethicist. But these are critical issues and the vast majority of those who will make real-world decisions related to the possibilities opened up by new bio-genetic technologies will have no formal training in bioethics either. It’s best we all start thinking about these questions.

According to one of the doctors cited in the article, “We are not going to slow the technology, so the question is, how do we use it?” He added, “Twenty years from now, you have to wonder if all babies will be conceived by IVF.”

Of course, it is worth considering what is assumed when someone claims that we are not going to slow down the technology. But as is usually the case, the technical capabilities are outstripping the ethical debate — at least, the sort of public debate that could theoretically shape the social consensus.

About three paragraphs into the article, I began thinking about the film Gattaca, which I’ve long considered an excellent reflection on the kinds of issues involved in genetic screening. Two or three paragraphs further in, then, I was pleased to see the film mentioned. I’ve suggested before (here and here) that stories are often our best vehicles for ethical reflection; they make abstract arguments concrete. For example, one of the best critiques of utilitarianism that I have read was Ursula K. Le Guin’s short story “The Ones Who Walk Away from Omelas.” Gattaca is a story that serves a similar function in relation to genetic screening and selection. I’d recommend checking the movie out if you haven’t seen it.

Knowing Without Understanding: Our Crisis of Meaning?

On the off chance that David Weinberger had stumbled upon my slightly snarky remarks on an interview he gave at Salon, he might very well have been justified in concluding that I missed his point. In my defense, that point didn’t exactly get across in the interview, which, taken alone, is still decidedly underwhelming. But the excerpt from Weinberger’s book that I subsequently came across at The Atlantic did address the very point I took issue with in my initial post — the question of meaning.

In the interview, Weinberger commented on the inadequacy of a view of knowledge that conceived of the work of knowing as a series of successively reductive steps moving from data to information to understanding and finally to wisdom. As he described it, the progression was understood as a process of filtering the useful from superfluous, or finding the proverbial needle in a haystack. Earlier in the interview he had referred to our current filter problem, i.e., our filters in the digital age do not filter anything out.

At the time this reminded me of Nicholas Carr’s comments some while ago on Clay Shirky’s approach to the problem of information overload. Shirky argued that our problem is not information overload but, rather, filter failure. We just haven’t developed adequate filters for the new digital information environment. Carr, rightly I believe, argued that our problem is not that our filters are inadequate; it’s that they are too good. We don’t have a needle-in-a-haystack problem; we have a stack of needles.

This is analogous to the point Weinberger makes about filters: “Digital filters don’t remove anything; they only reduce the number of clicks that it takes to get to something.”

All of this comes across more coherently and compellingly in the book excerpt. There Weinberger deals directly with the significance of the book’s title, Too Big to Know. Science now operates with data sets that do not yield to human understanding. Computers are able to process the data and produce workable models that make the data useful, but we don’t necessarily understand why. We have models, but not theories; hence the title of the excerpt in The Atlantic, “To Know, but Not Understand.”

Returning to my initial post, I criticized the manner in which Weinberger framed the movement from information to wisdom on the grounds that it took no account of meaning. In my estimation, moving from information to wisdom was not merely a matter of reducing a sea of information to a manageable set of knowledge that can be applied; it was also a matter of deriving meaning from the information, and the construction of meaning cannot be abstracted from individual human beings and their lived experience.

Now, let me provide Weinberger’s rejoinder for him: The question of creating meaning at the level of the individual is moot. When dealing with the amount of data under consideration, meaning is no longer an option. We are in the position of being able to act without understanding; we can do what we cannot understand.

The problem, if indeed we can agree that it is a problem, of doing without understanding is not a unique consequence of the digital age and the power of supercomputers to amass and crunch immense amounts of data. As I wrote in the first of a still unfinished triptych of posts, Hannah Arendt expressed similar concerns over half a century ago:

Writing near the midpoint of the last century, Hannah Arendt worried that we were losing the ability “to think and speak about the things which nevertheless we are able to do.” The advances of science were such that representing what we knew about the world could be done only in the language of mathematics, and efforts to represent this knowledge in a publicly meaningful and accessible manner would become increasingly difficult, if not altogether impossible.  Under such circumstances speech and thought would part company and political life, premised as it is on the possibility of meaningful speech, would be undone.  Consequently, “it would be as though our brain, which constitutes the physical, material condition of our thoughts, were unable to follow what we do, so that from now on we would indeed need artificial machines to do our thinking and speaking.”

But the situation was not identical. In Arendt’s scenario, there were still a privileged few who could meaningfully understand the science. It was a problem posed by complexity, and a few brilliant minds could still grasp what the vast majority of us could not. What Weinberger describes is a situation in which no human mind is able to meaningfully understand the phenomenon. It is a matter of complexity, yes, but it is irredeemably aggravated by the magnitude and scale of the data involved.

I’ve long thought that many of our discontents stem from analogous problems of scale. Joseph Stalin allegedly claimed that “the death of one man is a tragedy, the death of millions is a statistic.” Whether Stalin in fact said this or not, the line captures a certain truth. There is a threshold of scale at which we pass from that which we can meaningfully comprehend to that which blurs into the indistinguishable. The gap, for example, between public debt of $1 trillion and $3 trillion is immense, but I suspect that for most of us the difference at this scale is no longer meaningful. It might as well be $20 trillion; it is all the same, which is to say it is equally unfathomable. And this is to say nothing of the byzantine quality of the computerized global financial industry as a whole.

In a recent post about the opacity of the banking system, Steve Waldman concluded as follows:

“This is the business of banking. Opacity is not something that can be reformed away, because it is essential to banks’ economic function of mobilizing the risk-bearing capacity of people who, if fully informed, wouldn’t bear the risk. Societies that lack opaque, faintly fraudulent, financial systems fail to develop and prosper. Insufficient economic risks are taken to sustain growth and development. You can have opacity and an industrial economy, or you can have transparency and herd goats . . . .

Nick Rowe memorably described finance as magic. The analogy I would choose is finance as placebo. Financial systems are sugar pills by which we collectively embolden ourselves to bear economic risk. As with any good placebo, we must never understand that it is just a bit of sugar. We must believe the concoction we are taking to be the product of brilliant science, the details of which we could never understand. The financial placebo peddlers make it so.”

In a brief update, Waldman added,

“I have presented an overly flattering case for the status quo here. The (real!) benefits to opacity that I’ve described must be weighed against the profound, even apocalyptic social costs that obtain when the placebo fails, especially given the likelihood that placebo peddlars will continue their con long after good opportunities for investment at scale have been exhausted.”

Needless to say, this is a less than comforting set of circumstances. Yet “apocalyptic social costs” are not my main concern at the moment. Rather, it is what we might, perhaps hyperbolically, call the apocalyptic psychic costs incurred by living in a time during which substantial swaths of experience are rendered unintelligible.

I appreciate Waldman’s placebo analogy; it gets at an important dimension of the situation, but Rowe’s analogy to magic is worth retaining. If you’ve been reading here for a while, you’ll remember a handful of posts along the way that draw an analogy between magic and technology. It is an observation registered by C. S. Lewis, Lewis Mumford, and Jacques Ellul among others, and it is considered at book length by Richard Stivers. The analogy is taken in various directions, but what strikes me is the manner in which it troubles our historical narratives.

We often think of the whole of the pre-modern era as an age dominated by magical, which is to say unscientific, thinking. Beginning with the Renaissance and continuing through the complex historical developments we gloss as the Scientific Revolution and the Enlightenment, we escape the realm of magic into the arena of science. And yet, it would seem that at the far end of that trajectory, taking it uncritically at face value, there is a reversion to magical thinking. It is, of course, true that there is a substantial difference — our magic “works.” But at the phenomenological level, the difference may be inconsequential. Our thinking may not be magical in the same way, but much of our doing proceeds as if by magic, without our understanding.

I suspect this is initially wondrous and enchanting, but over time it is finally unsettling and alienating.

We are all of us, even the brightest among us, embedded in systems we understand vaguely and partially at best. Certain few individuals understand certain few aspects of the whole, but no one understands the whole. And, it would seem, the more we are able to know, the less we are capable of understanding. Consider the much discussed essay by MIT physicist Alan Lightman in Harper’s. Take time to read the whole essay, but here is the conclusion:

“That same uncertainty disturbs many physicists who are adjusting to the idea of the multiverse. Not only must we accept that basic properties of our universe are accidental and uncalculable. In addition, we must believe in the existence of many other universes. But we have no conceivable way of observing these other universes and cannot prove their existence. Thus, to explain what we see in the world and in our mental deductions, we must believe in what we cannot prove.

Sound familiar? Theologians are accustomed to taking some beliefs on faith. Scientists are not. All we can do is hope that the same theories that predict the multiverse also produce many other predictions that we can test here in our own universe. But the other universes themselves will almost certainly remain a conjecture.

‘We had a lot more confidence in our intuition before the discovery of dark energy and the multiverse idea,’ says Guth. ‘There will still be a lot for us to understand, but we will miss out on the fun of figuring everything out from first principles.’”

Faith? “All we can do is hope…”?

We have traveled from magic to magic and from faith to faith through an interval of understanding. Of course, it is possible to conclude that we’ve always failed to understand; it is just that now we know we don’t understand. Having banked heavily on a specific type of understanding and the mastery it could yield, we appear now to have come up at the far end against barriers to understanding and meaningful action. And in this sense we may be, from a certain perspective, worse off. The acknowledged limits of our knowing and understanding in a premodern setting took their place within a context of intelligibility; the lack of understanding was itself rendered meaningful within the larger metaphysical picture of reality. The unfolding awareness of the limits of our knowing takes place within a context in which intelligibility was staked on knowing and understanding, in which there was no metaphysical space for mystery, as it were. Premodern people acted meaningfully in the context of what appears from our perspective as a deep ignorance; it now seems that we are consigned to act without meaning in the context of a surfeit of knowledge.

I’m tempted to conclude by suggesting that the last metaphysical shock to arise from the Promethean enterprise may then be the startled recognition of the Hephaestean chains. But that may be too glib, and these are serious matters. Too serious, certainly, to tie up neatly at the end of an off-the-cuff morning blog post. This was all, as they say, shot from the hip. I welcome any thoughts you might have on any of this.

Can We? Ought We?

Just because it can be done, it does not follow that it ought to be done.

This commonplace strikes me as generally reasonable and perhaps platitudinously so. So, for example, just because you can ram your car into your garage door, it doesn’t follow that you should. In ethical debates with a philosophical orientation one often hears the claim, first articulated by Hume, that you can’t get ought from is. In this case, we might say that you can’t get ought from can.

When the ought is generally established or commonsensical, as in the example above, then there is little to talk about. But there are cases when matters are not nearly as obvious. The principle is often cited in connection with new technologies and it is often articulated by those who believe that the mere ability to achieve some specified end, say human cloning, through scientific knowledge and technical manipulation tells us nothing about whether or not such an end ought to be pursued.

Two very recent articles raise the question of the ought-ness of a capability that may be on the horizon.

The first, “Should We Erase Painful Memories,” is an excerpt from Alison Winter’s new book, Memory: Fragments of a Modern History. It discusses the possibility of memory dampening or therapeutic forgetting, basically erasing certain memories à la Eternal Sunshine of the Spotless Mind.

The second, “The Future of Prediction,” discusses the possibilities opened up for more accurate forecasting by the emerging ability to crunch immense amounts of data. (Unfortunately, you’ll have to endure the Boston Globe’s atrocious formatting to read this article.)

In both cases, an ability to achieve a particular end is in view, and it is not at all obvious whether the end is unproblematically desirable. Enjoy thinking through these issues. At the moment it is an interesting, speculative debate. In the not so distant future, it may be a concrete decision.

For an interesting model of how to go about thinking about these issues, you may want to consider reading Leon Kass’s “Ageless Bodies, Happy Souls,” in which he tackles similar questions with regard to biotechnological enhancements.

A Chance to Find Yourself

At The American Scholar you can read William Deresiewicz’s lecture to the plebe class of 2009 at West Point. The lecture is titled “Solitude and Leadership” and it makes an eloquent case for the necessity of solitude, and solitary reading in particular, to the would-be leader.

Throughout the lecture, Deresiewicz draws on Joseph Conrad’s Heart of Darkness, and near the end of his talk he cites the following passage. Speaking of an assistant to the manager of the Central Station, Marlow observes:

“I let him run on, this papier-mâché Mephistopheles, and it seemed to me that if I tried I could poke my forefinger through him, and would find nothing inside but a little loose dirt. . . .

It was a great comfort to turn from that chap to . . . the battered, twisted, ruined, tin-pot steamboat. . . . I had expended enough hard work on her to make me love her. No influential friend would have served me better. She had given me a chance to come out a bit—to find out what I could do. No, I don’t like work. I had rather laze about and think of all the fine things that can be done. I don’t like work—no man does—but I like what is in the work,—the chance to find yourself. Your own reality—for yourself, not for others—what no other man can ever know.”

Much to think about in those few short lines. “Papier-mâché Mephistopheles” — what a remarkably apt image for what Arendt would later call the banality of evil. It is also worth reflecting on Conrad’s estimation of work in this passage. He evocatively captures part of what I’ve tried to describe in my posts on the discontents of the frictionless life and disposable reality.

It was, however, the line “for yourself, not for others” that struck me with peculiar force. I’ve written here before about the problems with solipsistic or misanthropic individualism. And it should go without saying that, in some important sense, we certainly ought to think and act for others. But I don’t think this is the sort of thing that Conrad had in mind. Perhaps he was driving at some proto-existentialist pour soi. In any case, what came to my mind was the manner in which a life mediated by social media and smartphones is lived “for others”.

Let me try to clarify. The mediated variety of being “for others” is a form of performance and presentation. What we are doing is constructing and offering an image of ourselves for others to consume. The pictures we post, the items we Like, the tweets we retweet, the status updates, the locations we announce on Foursquare, the music we stream, and dare I say it, the blog posts we write — all of these are “for others” and, at least potentially, “for others” without real regard for them. Others, in the worst forms of this dynamic, are merely an audience that can reflect back to us and reinforce our performance of ourselves. In being “for others” in this sense, we risk being “for ourselves” in the worst way.

There is another, less problematic way of being “for others”. At the risk of oversimplifying, let’s call this an unmediated way of being “for others”. This mode of being for others is not self-consciously focused on performance and presentation. This way of being for others does not reduce others to the status of mirrors reflecting our own image back to us. Others are in this case an end, not a means. We lose ourselves in being for others in this way. We do not offer ourselves for consumption, but we are consumed in the work of being for others. The paradox here is that those who are able to lose themselves in this way tend to have a substantial and steady sense of self. Perhaps because they have been “for themselves” in Conrad’s sense, they have nurtured their convictions and character in solitude so that they can be for others in themselves, that is, “for others” for the sake of others.

Those who are for others only by way of being for themselves finally end up resembling Conrad’s papier-mâché Mephistopheles: we could poke our fingers through them and find nothing but a little dirt. All is surface.

Altogether, we might conclude that there is an important difference between being for others for the sake of being for yourself and being for yourself for the sake of being for others.

The truth, of course, is that these modes of being “for others” are not new, and the former certainly does not owe its existence uniquely to social media. The performed self has roots in the emergence of modernity, and this mode of being for others has a family resemblance to flattery, which has an even older pedigree. But ubiquitous connectivity and social media do the work of amplifying and generalizing the condition. When their use becomes habitual, when we begin to see the world as potential material for social media, then the space to be for ourselves/by ourselves collapses and we find that we are always being for others for our own sake, preoccupied with the presentation of surfaces.

The consequences of this mode of being are good neither for us, nor for others.