The Acoustics of Place

There is a nice piece in the New York Times Magazine by Kim Tingley entitled “Is Silence Going Extinct?”. Tingley traveled to Denali National Park in Alaska to write about the work of Davyd Betchkal, who has been documenting the natural soundscape of the region. The essay focuses on the intrusion of human-made noise even into the remoteness of the Alaskan wilderness:

“since 2006, when scientists at Denali began a decade-long effort to collect a month’s worth of acoustic data from more than 60 sites across the park — including a 14,000-foot-high spot on Mount McKinley — Betchkal and his colleagues have recorded only 36 complete days in which the sounds of an internal combustion engine of some sort were absent.”

What struck me, however, were a few observations that bore upon the relationship of sound to our experience of place. This is first signaled when Tingley cites Prof. Bryan Pijanowski: “An even more critical task, he thinks, is alerting people to the way ‘soundscapes provide us with a sense of place’ and an emotional bond with the natural world that is unraveling.”

The human experience of place, because it is a necessarily embodied experience, includes more than our visual or kinesthetic experience. It is also constituted by our acoustic experience.

The following meditation on sound and place by Betchkal was especially insightful:

“Quiet is related to openness in the sense that the quieter it gets — as your listening area increases — your ability to hear reflections from farther away increases. The implication of that is that you get an immense sense of openness, of the landscape reflecting back to you, right? You can go out there, and you stand on a mountaintop, and it’s so quiet that you get this sense of space that’s unbelievable. The reflections are coming to you from afar. All of a sudden your perception is being affected by a larger area. Which is different from when you’re in your car. Why, when you’re in your car, do you feel like you are your car? It’s ’cause the car envelops you, it wraps you up in that sound of itself. Sound has everything to do with place.”

Tingley contributes this observation about the acoustic dimensions of being a body:

“Hearing arguably fixes us in time, space and our own bodies more than the other senses do. Our vitals are audible: sighing lungs, a pounding pulse, a burbling gut.”

Near the end, Tingley adds,

“The landscape enveloped me, as Betchkal said it would, and I felt I was the landscape, where mountains and glaciers rose and shifted eons before the first heartbeats came to life.

‘Standing in that place right there,’ Betchkal told me later, ‘I had a complete sense that I was standing in that place right there and not drawn or distracted from it at all.’”

This latter observation recalled Edward Casey’s admonition that we learn to take account of place: “… it is to our own peril that we do not [notice our experience of place]. For we risk falling prey to time’s patho-logic, according to which gaining is tantamount to losing.”

When time dominates our thinking, it is all about gaining and losing, a gaining that is a losing. Place calls forth the notion of being. We speak of gaining or losing time, but of being in place.

The Triumph of Time

From philosopher Edward S. Casey’s Getting Back Into Place:

“‘Time will tell’: so we say, and so we believe, in the modern era. This era, extending from Galileo and Descartes to Husserl and Heidegger, belongs to time, the time when Time came into its own. Finally. And yet, despite the fact that history, human and geological alike, occurs in time (and has been doing so for quite a while), time itself has not long been singled out for separate scrutiny …. By the middle of the eighteenth century, time had become prior to space in metaphysics as well as in physics. If, in Leibniz’s elegant formula, ‘space is the order of co-existing things,’ time proved to be more basic: as ‘the order of successive things,’ it alone encompasses the physical world order. By the moment when Kant could assert this, time had won primacy over space. We have been living off this legacy ever since, not only in philosophy and physics but in our daily lives as well.

These lives are grasped and ordered in terms of time. Scheduled and overscheduled, we look to the clock or the calendar for guidance and solace, even judgment! But such time-telling offers precious little guidance, no solace whatsoever, and a predominantly negative judgment (‘it’s too late now’) … We are lost because [of] our conviction that time, not only the world’s time but our time, the only time we have, is always running down.”

Casey’s project may be described as a phenomenologically inflected recovery of place. But he begins by describing the triumph of time. Tell me whether this does not resonate deeply with your experience:

“The pandemic obsession with time from which so many moderns have suffered — and from which so many postmoderns still suffer — is exacerbated by the vertiginous sense that time and the world-order, together constituting the terra firma of modernist solidity, are subject to dissolution. Not surprisingly, we objectify time and pay handsome rewards … to those who can tie time down in improved chronometry. Although the modern period has succeeded brilliantly in this very regard, it has also fallen into the schizoid state of having made objective, as clock-time and world-time, what is in fact most diaphanous and ephemeral, most ‘obscure’ in human experience. We end by obsessing about what is no object at all. We feel obligated to tell time in an objective manner; but in fact we have only obliged ourselves to do so by our own sub rosa subreptions, becoming thereby our own pawns in the losing game of time.”

The Internet & the Youth of Tomorrow: Highlights from the Pew Survey

The Pew Internet & American Life Project conducted “an opt-in, online survey of a diverse but non-random sample of 1,021 technology stakeholders and critics … between August 28 and October 31, 2011.” The survey presented two scenarios for the youth of 2020, asked participants to choose which they thought more likely, and then invited elaboration.

Here are the two scenarios and the responses they garnered:

Some 55% agreed with the statement:

In 2020 the brains of multitasking teens and young adults are “wired” differently from those over age 35 and overall it yields helpful results. They do not suffer notable cognitive shortcomings as they multitask and cycle quickly through personal- and work-related tasks. Rather, they are learning more and they are more adept at finding answers to deep questions, in part because they can search effectively and access collective intelligence via the internet. In sum, the changes in learning behavior and cognition among the young generally produce positive outcomes.

Some 42% agreed with the opposite statement, which posited:

In 2020, the brains of multitasking teens and young adults are “wired” differently from those over age 35 and overall it yields baleful results. They do not retain information; they spend most of their energy sharing short social messages, being entertained, and being distracted away from deep engagement with people and knowledge. They lack deep-thinking capabilities; they lack face-to-face social skills; they depend in unhealthy ways on the internet and mobile devices to function. In sum, the changes in behavior and cognition among the young are generally negative outcomes.

However, the report also noted the following:

While 55% agreed with the statement that the future for the hyperconnected will generally be positive, many who chose that view noted that it is more their hope than their best guess, and a number of people said the true outcome will be a combination of both scenarios.

In all honesty, I am somewhat surprised the results split so evenly; I would have expected the more positive scenario to perform better than it did. The most interesting aspect of the report, however, is of course the excerpts presented from the respondents’ elaborations. Here are a few, with some interspersed commentary.

A number of respondents wrote about the skills that will be valued in the emerging information ecosystem:

  • There are concerns about new social divides. “I suspect we’re going to see an increased class division around labor and skills and attention,” said media scholar danah boyd.
  • “The essential skills will be those of rapidly searching, browsing, assessing quality, and synthesizing the vast quantities of information,” wrote Jonathan Grudin, principal researcher at Microsoft. “In contrast, the ability to read one thing and think hard about it for hours will not be of no consequence, but it will be of far less consequence for most people.”

Among the more interesting excerpts was this from Amber Case, cyborg anthropologist and CEO of Geoloqi:

  • “The human brain is wired to adapt to what the environment around it requires for survival. Today and in the future it will not be as important to internalize information but to elastically be able to take multiple sources of information in, synthesize them, and make rapid decisions … Memories are becoming hyperlinks to information triggered by keywords and URLs. We are becoming ‘persistent paleontologists’ of our own external memories, as our brains are storing the keywords to get back to those memories and not the full memories themselves.”

I’m still not convinced at all by the argument against internalization. (You can read why here, here, and here.) But she is certainly correct about our “becoming ‘persistent paleontologists’ of our own external memories.” And the point was memorably put. We are building vast repositories of external memory and revisiting those stores in ways that are historically novel. We’ve yet to register the long-term consequences.

The notion of our adaptability to new information environments was also raised frequently:

  • Cathy Cavanaugh, an associate professor of educational technology at the University of Florida, noted, “Throughout human history, human brains have elastically responded to changes in environments, society, and technology by ‘rewiring’ themselves. This is an evolutionary advantage and a way that human brains are suited to function.”

This may be true enough, but what is missing from these sorts of statements is any discussion of which environments might be better or worse for human beings. To acknowledge that we adapt is to say nothing about whether or not we ought to adapt. Or, if one insists, Borg-like, that we must adapt or die, there is little discussion about whether this adaptation leaves us on the whole better off. In other words, we ought to be asking whether the environment we are asked to adapt to is more or less conducive to human flourishing. If it is not, then all the talk of adaptation is a thinly veiled fatalism.

Some, however, did make strong and enthusiastic claims for the beneficence of the emerging media environment:

  • “The youth of 2020 will enjoy cognitive ability far beyond our estimates today based not only on their ability to embrace ADHD as a tool but also by their ability to share immediately any information with colleagues/friends and/or family, selectively and rapidly. Technology by 2020 will enable the youth to ignore political limitations, including country borders, and especially ignore time and distance as an inhibitor to communications. There will be heads-up displays in automobiles, electronic executive assistants, and cloud-based services they can access worldwide simply by walking near a portal and engaging with the required method such as an encrypted proximity reader (surely it will not be a keyboard). With or without devices on them, they will communicate with ease, waxing philosophic and joking in the same sentence. I have already seen youths of today between 20 and 35 who show all of these abilities, all driven by and/or enabled by the internet and the services/technologies that are collectively tied to and by it.”

This was one of the more techno-utopian predictions in the survey. The notion of “embracing ADHD as a tool” is itself sufficiently jarring to catch one’s attention. One gets the gist of what the respondent is foreseeing — a society in which cognitive values have been radically re-ordered. Where sustained attention is no longer prized, attention deficit begins to seem like a boon. The claims about the irrelevance of geographic and temporal limits are particularly interesting (or disconcerting). They seemingly make a virtue of disembodied rootlessness. The youth of the future will, in this scenario, be temporally and spatially homeless, virtually dispersed. (The material environment of the future imagined here also invites comparison to the dystopian vision of the film WALL-E.)

Needless to say, not all respondents were nearly so sanguine. Most interestingly, many of the youngest respondents were among the most concerned:

  • A number of the survey respondents who are young people in the under-35 age group—the central focus of this research question—shared concerns about changes in human attention and depth of discourse among those who spend most or all of their waking hours under the influence of hyperconnectivity.

This resonates with my experience teaching as well. There’s a palpable unease among many of the most connected with the pace, structure, and psychic consequences of the always-on life. They appear to be discovering through experience what is eloquently put by Annette Liska:

  • Annette Liska, an emerging-technologies design expert, observed, “The idea that rapidity is a panacea for improved cognitive, behavioral, and social function is in direct conflict with topical movements that believe time serves as a critical ingredient in the ability to adapt, collaborate, create, gain perspective, and many other necessary (and desirable) qualities of life. Areas focusing on ‘sustainability’ make a strong case in point: slow food, traditional gardening, hands-on mechanical and artistic pursuits, environmental politics, those who eschew Facebook in favor of rich, active social networks in the ‘real’ world.”

One final excerpt:

  • Martin D. Owens, an attorney and author of Internet Gaming Law, wrote, “Just as with J.R.R. Tolkien’s ring of power, the internet grants power to the individual according to that individual’s wisdom and moral stature. Idiots are free to do idiotic things with it; the wise are free to acquire more wisdom. It was ever thus.”

In fact, the ring in Tolkien’s novels is a wholly corrupting force. The “wisdom and moral stature” of the wearer may only forestall its deleterious effects; the wisest avoided using it at all. I won’t go so far as to suggest that the same applies to the Internet, but I certainly couldn’t let Tolkien be appropriated in the service of a misguided view of technological neutrality.

The Quantified Self

Stephen Wolfram, of Wolfram|Alpha fame, has collected a lot of data about himself. A lot.

In “Personal Analytics of My Life” he lays out some of what he has collected:

One day I’m sure everyone will routinely collect all sorts of data about themselves. But because I’ve been interested in data for a very long time, I started doing this long ago. I actually assumed lots of other people were doing it too, but apparently they were not. And so now I have what is probably one of the world’s largest collections of personal data.

Every day—in an effort at “self awareness”—I have automated systems send me a few emails about the day before. But even though I’ve been accumulating data for years—and always meant to analyze it—I’ve never actually gotten around to doing it. But with Mathematica and the automated data analysis capabilities we just released in Wolfram|Alpha Pro, I thought now would be a good time to finally try taking a look—and to use myself as an experimental subject for studying what one might call “personal analytics”.

What follows are various visual representations of data relating to the number of emails Wolfram has received and sent since 1989 (yes, that’s right, 1989), the number of keystrokes he has typed since 2002, the number of events in his calendar, the phone calls he has made, and — since 2010 — the number of footsteps he has taken. Other graphs track the usage of certain words in his documents (including hard copies that he has assiduously kept and digitized) and the file types he has edited.
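
For a concrete sense of the mechanics, here is a minimal sketch of the email-tallying piece, assuming a hypothetical local archive named sent.mbox. This is not Wolfram’s code (he works in Mathematica); it is just the idea in miniature:

```python
# A toy version of Wolfram-style personal email analytics: tally sent
# messages by date and by hour of day from a local mbox archive.
# "sent.mbox" is a hypothetical file name, not Wolfram's actual setup.
import mailbox
from collections import Counter
from email.utils import parsedate_to_datetime

by_date = Counter()
by_hour = Counter()

for msg in mailbox.mbox("sent.mbox"):
    raw_date = msg.get("Date")
    if raw_date is None:
        continue
    try:
        sent_at = parsedate_to_datetime(raw_date)
    except (TypeError, ValueError):
        continue  # skip malformed Date headers
    by_date[sent_at.date()] += 1
    by_hour[sent_at.hour] += 1

print("busiest day:", by_date.most_common(1))
for hour in range(24):  # crude diurnal histogram, one row per hour
    print(f"{hour:02d}:00 {'#' * by_hour[hour]}")
```

Run over two decades of mail, a tally like this yields the kind of diurnal rhythm that anchors Wolfram’s plots.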

The visualization of the data is interesting to a point. At the risk of sounding a bit too cynical, however, I’m not sure that it revealed anything that wasn’t relatively banal. In his defense, Wolfram more or less acknowledges as much at points and concludes with a hopeful note:

“As personal analytics develops, it’s going to give us a whole new dimension to experiencing our lives. At first it all may seem quite nerdy (and certainly as I glance back at this blog post there’s a risk of that). But it won’t be long before it’s clear how incredibly useful it all is—and everyone will be doing it, and wondering how they could have ever gotten by before. And wishing they had started sooner, and hadn’t ‘lost’ their earlier years.”

Obviously I’m a bit skeptical. But who knows, perhaps I lack imagination.

Wolfram is not the only one at work on the “quantified self.”

Gary Wolf and Kevin Kelly, along with others, have been working on a project to make the immense amount of data collected about you in a digital environment work for you. According to Wolf, “[Y]our data is not for your boss, your teacher, or your doctor — it’s for you.” The most obvious applications are, of course, in health care. Other potential applications?

  • Facial tracking to improve happiness.
  • Cognition tracking to evaluate effects of simple dietary changes on brain function.
  • Food composition tracking to determine ideal protein/carb meal ratios for athletic performance.
  • Concentration tracking to determine effects of coffee on productivity.
  • Proximity tracking to assist in evaluation of mood swings.
  • Mapping of asthma incidents and correlation with humidity, pollen count, and temperature.
  • Gas mileage tracking to figure out if driving style matters.
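
Nearly every item on that list reduces to the same primitive operation: log two streams of numbers and ask whether they move together. A minimal sketch for the asthma example, using invented figures rather than output from any real tracker:

```python
# Toy quantified-self correlation: does a tracked symptom move with a
# tracked condition? The week of data below is invented for illustration.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# One hypothetical week: daily pollen count and asthma incidents.
pollen = [30, 55, 80, 65, 90, 40, 20]
incidents = [0, 1, 3, 2, 3, 1, 0]

print(f"pollen vs. incidents: r = {pearson(pollen, incidents):.2f}")
```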

Again, interesting … potentially useful in some respects, I’m sure. But somehow this “self-awareness” strikes me as something a little short of what the injunction “know thyself” calls for.

There is a line from a Wendell Berry poem that I am fond of quoting, “We live the given life, not the planned.” Perhaps it needs an update:  We live the given life, not the quantified.

Beware of Reporters Bearing the “Latest Studies”

Some while ago I posted a pretty funny take on the Science News Cycle courtesy of PHD Comics. Every now and then I link to it when posting about some “recent study” because it is helpful to remember the distortions that can take place as information works its way from the research lab to the evening news.

I was reminded of that comic again when I read this bit from a story in New Scientist:

“To find out if behaviour in a virtual world can translate to the physical world, Ahn randomly assigned 47 people either to inhabit a lumberjack avatar and cut down virtual trees with a chainsaw, or to simply imagine doing so while reading a story. Those who did the former used fewer napkins (five instead of six, on average) to clean up a spill 40 minutes later, showing that the task had made them more concerned about the environment.”
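
Note what the article does not report: any measure of spread. Whether a one-napkin difference between group means is signal or noise depends entirely on the variance. A quick sketch with invented samples (emphatically not the study’s data) makes the point:

```python
# Illustration only: the same one-napkin gap between two groups of ~23
# can be decisive or meaningless depending on the spread. These samples
# are invented; they are NOT data from the study New Scientist describes.
import random
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(var_a / len(a) + var_b / len(b))

random.seed(0)
for spread in (0.5, 3.0):  # tight vs. noisy napkin counts
    lumberjacks = [random.gauss(5, spread) for _ in range(23)]
    readers = [random.gauss(6, spread) for _ in range(24)]
    print(f"spread={spread}: t = {welch_t(lumberjacks, readers):.2f}")
```

With a tight spread the gap is decisive; with a noisy one it may well be chance, and the article gives the reader no way to tell which case the study falls under.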

Surely something has been lost in translation from data to conclusion, no? The author notes that this is from an “unpublished” study, which gives me renewed confidence in the peer-review process.

Well, that is until I read a somewhat alarming post by neuroscientist Daniel Bor, “The dilemma of weak neuroimaging papers”, which contains this summation and query:

“Okay, so we’re stuck with a series of flawed publications, imperfect education about methods, and a culture that knows it can usually get away with sloppy stats or other tricks, in order to boost publications. What can help solve some of these problems?”

All in all, we do well to proceed with healthy skepticism.