Virtual Communities, Anonymity, and the Risks of Self-Disclosure

In the beginning Cold War-era engineers created the Internet and then utopian, hippy enthusiasts rescued the Internet and gave it to the world. Regrettably, spammers, venture capitalists, marketers, and corporations entered into the garden of digital delights and a communitarian paradise devolved into a virtual mall.

This, more or less, is the storyline of Evgeny Morozov’s very brief history of the Internet, “Two Decades of the Web: A Utopia No More” in Prospect. The author of The Net Delusion: The Dark Side of Internet Freedom, in case it wasn’t clear from the two titles, is not exactly optimistic about the direction the Internet has taken. The plot of his short overview of Internet history moves from early promise to eventual decline; it is a tale of utopian hopes disciplined by unfortunate realities.

Morozov’s history takes the idea of “virtual community” as its theme and features a set of Internet “cheerleaders” – Stewart Brand, Kevin Kelly, John Perry Barlow, and the Wired crowd – as its tragic protagonists. Tragic because their high-minded, lofty ideals were undercut, in Morozov’s telling, by an accompanying naiveté that left Paradise unguarded against the corporate snakes. “These men,” according to Morozov, “emphasised the importance of community and shared experiences; they viewed humans as essentially good, influenced by rational deliberation, and tending towards co-operation. Anti-Hobbesian at heart, they viewed the state and its institutions as an obstacle to be overcome—and what better way to transcend them than via cyberspace?”

This anarchist/libertarian proclivity exposed the online community to dangers trivial and grave:

Perhaps the mismatch between digital ideals and reality can be ascribed to the naivety of the technology pundits. But the real problem was that the internet’s early visionaries never translated their aspirations for a shared cyberspace into a set of concrete principles on which online regulation could be constructed. It’s as if they wanted to build an exemplary city on the hill, but never bothered to spell out how to keep it exemplary once it started growing.

The law of entropy took over from there. The allusion to a city on a hill recalls the Puritan experiment in pious self-government that never quite managed to pass on its vision to the next generation. Pursuing the analogy, Morozov fills the role of the preachers who developed the jeremiad – a genre of sermon identified by historian Perry Miller that denounced the community’s departure from its founding principles and called for repentance and renewal. Morozov’s commentary fits the genre neatly, and that is not at all to detract from the value of his critique.

The connection to the Puritans is worth pursuing even further, but, for just a moment, we have to see beyond the typical tropes with which they are associated, those that lend color to the term puritanical. The Puritans loom large in most tellings of early American history and their influence has long been both celebrated and lamented. In one respect, though, they are out of step with the subsequent evolution of American culture. The rugged individualism that came to dominate the American psyche would have been an unwelcome anomaly within the deeply communitarian ethos of the early Puritan settlers. John Winthrop’s sermon aboard the Arbella in which we find the “city on a hill” imagery is fundamentally a communitarian tract urging self-restraint, self-sacrifice, mercy, justice, and generosity for the good of the community. Shared sacrifice, shared risk, shared resources, shared lives – out of such was community forged.

With this in mind we can begin to sound out a tension within Morozov’s account. In his view, personalization and the loss of privacy are among the chief temptations that ultimately lead to the fall of the digital community. So for example, he notes with displeasure: “The logical end of this ever-increasing personalisation is of each user having his or her own online experience. This is a far cry from the early vision of the internet as a communal space.” And further on we read that, “For many internet users, empowerment was an illusion. They may think they enjoy free access to cool services, but in reality, they are paying for that access with their privacy.”

Yet, certain constructions of privacy and an aversion to personalization sit uneasily alongside a communitarian ideal. Pressed to their extremes, privacy and depersonalization converge in anonymity, and it was, in fact, the anonymity of the early Internet that thrilled theorists with the possibilities for experimentation with identity and its construction. Unfortunately, the early digital communitarian ideal was tied to this vision of privacy/anonymity, and you are not likely to have anything like a community in any strong sense on those terms, at least not in a way that answers to the human social impulse. Social media has provided something like an experience of community precisely because it has been tied to personalization (all of its attendant problems notwithstanding).

Maybe the real problem with the early internet was its commitment to anonymity. Personalization, not regulation, may have curbed the sorts of behaviors and developments that Morozov laments. The unbridled pursuit of self-interest which is always the enemy of community, virtual or otherwise, is abetted by the lack of accountability engendered by anonymity. This is a lesson at least as old as the Ring of Gyges story told by Plato. Community, and the fully human life it enables, is on the other hand built upon the risk of self-disclosure. A point not lost on Hannah Arendt when she noted that, “To live an entirely private life means above all to be deprived of things essential to a truly human life: to be deprived of the reality that comes from being seen and heard by others.” And, she later adds, “Action without a name, a ‘who’ attached to it, is meaningless …”

Admittedly, community is an amorphous and abstract concept and the pursuit of its virtual analogue may be finally incoherent. What’s more, anonymity is in certain circumstances clearly desirable, and commodified personalization is undoubtedly problematic. Finally, there are clearly important and necessary forms of privacy which must be protected. With these qualifications noted, it remains the case that any kind of online community, if it is to serve as a mediating structure that allows for social interaction on a scale somewhere between the anonymity of total seclusion and that of mass society, needs to be built on some degree of measured self-disclosure and consistency of identity.

This entails all sorts of risks and even sacrifices, which means that what we may need is less of a jeremiad and more of a Winthrop-esque vision setting.

_________________________________________________________

Article first published as Virtual Communities, Anonymity, and the Risks of Self-Disclosure on Blogcritics.

The Triumph of the Social

“Social” is big, in case you missed it.

The most noticeable and  significant development in our media environment over the last decade or so has been the emergence of social elements across digital media platforms and the subsequent migration of social media into traditional media fields.  We might, to borrow a phrase, call this the triumph of the social.   Whatever we call it, this development marks a significant departure from the trend toward individualism that characterized the modern era.

Modernity, according to the standard storyline, was characterized by the individual’s liberation from the constraints of place, tradition, institutions, and, to some degree, biology.  The Renaissance, the Reformation, the Enlightenment — each is an episode in the rise of the individual.  Protestantism, democracy, capitalism — each features the individual and his soul, his rights, his property prominently.  From this angle, postmodernity is, in fact, hyper-modernity; it is not a break with the trajectory of modernity with respect to identity, it is its consummation.   The individual is liberated even from the notion of the persistent or essential self.  Identity, according to the usual suspect theorists, is constructed all the way down (except, of course, that it is not).

I can reasonably follow this story through to the early Internet age, but then something changes. Social media reasserts the social self.  This could be read as a further development of the individualist trajectory – the liberated individual is simply given a larger stage from which to pursue the project of creating their identity free of constraints.  If so, then what we might analyze is the new mode of identity construction.  For example, we might note that we now perform our identity by sharing.  The “Like” button becomes the instrument of identity construction.  The bands, television shows, websites, products, news stories, movies, clothes, cars, companies, causes, etc. that we “Like” signal to our social network who we “are”.

Whatever truth there may be to that, and there is some, there is one other way to read the rise of the social.  In a recent blog post, sociologist Peter Berger offered some reflections on the July 2011 issue of The Annals of the American Academy of Political and Social Science devoted to the topic of “Patrimonial Power in the Modern World.”  In Berger’s summary, patrimonial power “is power on the basis of kinship and other patron-client relationships. It is the most common form of political authority in traditional societies before the rise of centralized states and empires. Such authority is exercised by way of personal loyalties rather than formal rules. The tribal chief is the prototypical leader in patrimonial regimes.”  Patrimonial power functioned within traditional societies grounded in personal ties and relationships.

Modernity, on the other hand, is characterized by other forms of power and authority.  Berger continues:

The counter-type is the bureaucrat … In a patrimonial system one trusts the chief because he belongs to one’s tribe and embodies its tradition. In what Weber called a “legal-rational” system one trusts the bureaucrat because he occupies an office established by proper procedures; indeed one trusts these procedures rather than the particular individual they have placed in the office.

Bureaucracy, however, not only abstracts power, it abstracts the personal.  In a bureaucracy the person is reduced to a number or an algorithm.  So one of the ironies of modernity is that the rise of the individual was accompanied by the rise of institutions that de-faced the individual.

What’s more, the rise of the individual throughout the modern period was coupled with the simultaneous rise of modern notions of privacy.  The extreme end of the privacy spectrum is complete anonymity, and here too the individual is de-faced, left without personal connection.

Just to be clear, this move toward individualism and privacy was not all bad.  In many respects it was a healthy corrective.  But the pendulum may have swung too far, and Berger does a nice job of explaining where the problem lies:

Robert Musil, in his great novel The Man Without Qualities, recounts an incident when Ulrich, the central character, is arrested and processed in a police station. He experiences what he calls a “statistical disaggregation” of his person. He is reduced to the minimal characteristics relevant to the police investigation, while all the characteristics essential to his self-esteem are ignored. In one way or another, we experience something like this depersonalization in many situations. We are abstract objects of the juridical system in court, abstract patients in a hospital,  abstract consumers in the marketplace. Everything we cherish most about ourselves is strictly irrelevant—our intellectual achievements, our sense of humor or capacity for affection, not to mention the prerogatives of age. In such situations we instinctively reach out for “tribal” connections—for someone who knows who we are, with whom we share an ethnic or religious identity, or even someone who laughs at a joke we tell: in sum, someone who recognizes us in a personal way.

Berger goes on to recall the term he coined with Richard Neuhaus, “mediating structures,” to describe neighborhood, family, church and voluntary associations.  These mediating structures buffered the individual from the impersonal, bureaucratic power of the state, but these structures have themselves been severely compromised, leaving the individual isolated and disconnected.  This state of affairs, along with the presumption that we are indeed political, which is to say social, animals helps explain, in part at least, the triumph of the social.  Social media functions as a mediating structure, a realm in which we are addressed by name and find our individual self publicly acknowledged.

This is not the whole story, of course. Social media in its own way also reduces us to numbers or algorithms, and it cannot provide all that traditional mediating structures, at their best, are able to offer.  There are also temptations to narcissism and worse.  But, the risks notwithstanding, social media owes its success to the way it addresses a fundamental dimension of being human.

______________________________________________________

Related post:  The (Un)Naturalness of Privacy.

Place and Image, Death and Memory

The sociability of social networking sites such as Facebook is built upon an archive of memories.  Facebook trades in memory in at least two ways.  On the one hand, and perhaps especially for older users, Facebook is a platform that facilitates the search for memories.  Old friends and old flames can be found on Facebook.  Reconnecting with a high school buddy activates a surge of interconnected memories that lead to other long forgotten memories and so on.  On the other hand, and this perhaps especially for younger users, Facebook also renders present experience already a depository of potential memories.  The future past impinges upon the present.  Our experience is conducted as a search for memories yet to be formed which will be archived on Facebook.  In a sense, then, we hunt for memories past and, paradoxically, for memories future.

This post is the second in a series situating Facebook within the memory theater/arts of memory tradition.  In the first post I set the stage by describing how social networking sites and internet enabled smart phones have constituted experience as a field of potential memories.  I also suggested that how we store and access our memories makes a difference.  The cultural and personal significance of memory is not a static category of human nature. Memory and its significance evolve over time, often in response to changing technologies.  So the question, then, is something like this:  What difference does it make, personally and culturally, that Facebook has become such a prominent mode of memory? 

In order to explore that question, I’ll delve briefly into the history of the art of memory, a set of memory practices with which, I believe, Facebook shares interesting similarities.  But as promised at the end of the last post, we’ll start with a story.

Spatiality, images, and death have long been woven together in the complex history of remembering.  Each appears prominently in the founding myth of what Frances Yates has called the “art of memory” as recounted by Cicero in his De oratore. According to the story, the poet Simonides of Ceos was contracted by Scopas, a Thessalian nobleman, to compose a poem in his honor.  To the nobleman’s chagrin, Simonides devoted half of his oration to the praise of the gods Castor and Pollux.  Feeling himself cheated out of half of the honor, Scopas brusquely paid Simonides only half the agreed upon fee and told him to seek the rest from the twin gods.  Not long afterward that same evening, Simonides was summoned from the banqueting table by news that two young men were calling for him at the door.  Simonides sought the two callers, but found no one.  While he was out of the house, however, the roof caved in killing all of those gathered around the table including Scopas. As Yates puts it, “The invisible callers, Castor and Pollux, had handsomely paid for their share in the panegyric by drawing Simonides away from the banquet just before the crash.”

Cicero

The bodies of the victims were so disfigured by the manner of death that they were rendered unidentifiable even by family and friends.  Simonides, however, found that he was able to recall where each person was seated around the table and in this way he identified each body.  This led Simonides to the realization that place and image were the keys to memory, and in this case, also a means of preserving identity through the calamity of death.  In Cicero’s words,

[Simonides] inferred that persons desiring to train [their memory] must select places and form mental images of the things they wish to remember and store those images in the places, so that the order of the places will preserve the order of the things, and the images of the things will denote the things themselves, and we shall employ the places and images respectively as a wax writing-tablet and the letters written on it.

Cicero is one of three classical sources on the principles of artificial memory that evolved in the ancient world as a component of rhetorical training.  The other two sources are Quintilian’s Institutio oratoria and the anonymous Ad Herennium.  It is through the Ad Herennium, mistakenly attributed to Cicero, that the art of memory migrates into Medieval culture where it is eventually assimilated into the field of ethics.  Cicero’s allusion to the wax writing-tablet, however, reminds us that discussion of memory in the ancient world was not limited to the rhetorical schools.  Memory as a block of wax upon which we make impressions is a metaphor attributed to Socrates in Plato’s Theaetetus where it appears as a gift of Mnemosyne, the mother of the muses:

Imagine, then, for the sake of argument, that our minds contain a block of wax, which in this or that individual may be larger or smaller, and composed of wax that is comparatively pure or muddy, and harder in some, softer in others, and sometimes of just the right consistency.

Let us call it the gift of the Muses’ mother, Memory, and say that whenever we wish to remember something we see or hear or conceive in our own minds, we hold this wax under the perceptions or ideas and imprint them on it as we might stamp the impressions of a seal ring.  Whatever is so imprinted we remember and know so long as the image remains; whatever is rubbed out or has not succeeded in leaving an impression we have forgotten and do not know.

Plato and Aristotle in Raphael's "School of Athens"

The Platonic understanding of memory was grounded in a metaphysic and epistemology which located the ability to apprehend truth in an act of recollection.  Plato believed that the highest forms of knowledge were not derived from sense experience, but were first apprehended by the soul in a pre-existent state and remain imprinted deep in a person’s memory.  Truth consists in matching the sensible experience of physical reality to the imprint of eternal Forms or Ideas whose images or imprints reside in memory.  Consequently the chief aim of education is the remembering of these Ideas and this aim is principally attained through “dialectical enquiry,” a process, modeled by Plato’s dialogs, by which a student may arrive at a remembering of the Ideas.

At this point, we should notice that the anteriority, or “pastness,” of the knowledge in question is, strictly speaking, incidental.  What is important is the presence of the absent Idea or Form.  It is to evoke the presence of this absence that remembering is deployed.  It is the presence of eternal Ideas that secures the apprehension of truth, goodness, or beauty in the present.  Locating the memory within the span of time past does not bear upon its value which rests in its being possessed as a model against which to measure experience.

Paul Ricoeur, in Memory, History, Forgetting, begins his consideration of the heritage of Greek reflections on memory with the following observation:

Socratic philosophy bequeathed to us two rival and complementary topoi on this subject, one Platonic, the other Aristotelian.  The first, centered on the theme of the eikōn [image], speaks of the present representation of an absent thing; it argues implicitly for enclosing the problematic of memory within that of imagination.  The second, centered on the theme of the representation of a thing formerly perceived, acquired, or learned, argues for including the problematic of the image within that of remembering.

As he goes on to note, from these two framings of the problematic of memory “we can never completely extricate ourselves.”

Reflecting for just a moment on the nature of our own memories it is not difficult to see why this might be the case.  If we remember our mother, for example, we may do so either by contemplating some idealized image of her in our mind’s eye or else by recollecting a moment from our shared past.  In both cases we may be said to be remembering our mother, but the memories differ along the Platonic/Aristotelian divide suggested by Ricoeur.  In the former case I remember her in a way that seeks her presence without reference to time past; in the latter, I remember her in a way that situates her chronologically in the past.

At this point, I’m sure it seems that we’ve wandered a bit from the art of memory and farther still from social networking sites.  There is a method to this madness, however, but demonstrating that will have to wait for the next post.  Already, I am pushing the limits of acceptable blog post length.

Looking forward to the next post in this series, then, here are the tasks that remain:

  • Exploring memory as an index of desire.
  • Setting the art of memory tradition, and Facebook, within Ricoeur’s schema.
  • Asking what difference all of this makes.

Social Media and the Arts of Memory

UPDATE: A complete and updated form of the essay begun here is now available here.

Early in The Art of Memory, Frances Yates pauses to envision a “forgotten social habit” of the classical world.  She invites us to wonder, “Who is that man moving slowly in the lonely building, stopping at intervals with an intent face?”  That man, Yates tells us, is a “rhetoric student forming a set of memory loci.”  The rhetoric student would have been creating the architecture of a mental space into which he would then place vivid images imaginatively designed to recollect the themes or words of a prepared oration.  While delivering the oration, the rhetor would navigate the mental space, coming upon each carefully placed image, which triggered his memory accordingly.  This work of designing mental space and populating the space with striking images followed the prescriptions of the artificial memory techniques widely practiced in classical antiquity.

What if, however, we updated Yates’ scene by setting it in the present?  The scene would be instantly recognizable as long as we equipped our imagined person with a cell phone.  The stopping at intervals and the intent face would correspond to any of the multiple uses to which an Internet-enabled smart phone may be put:  reading or sending a text message, downloading songs, taking or sending pictures and video, updating social media profiles, or finding directions with GPS, to name but a few.  What is striking is how often these activities would, like that of the ancient rhetor, involve the work of memory.   Much of what cell phones are increasingly used for has very little to do with making a phone call, after all. In fact, one could argue that the calling feature of phones is becoming largely irrelevant.  Cell phones are more likely to be used to access the Internet, send a text message, take a picture, or film a video.  Given these capabilities, cell phones have become prosthetic memory devices; to lose a cell phone would be to induce a state of partial amnesia.  Or, it may be better to say it might induce a fear of future amnesia, since our ability to approach the present as a field of potential memories would be undermined.

Social networking sites (SNS) are of special interest because of the way in which they explicitly trade in memory.  As Joanne Garde-Hansen asks in “MyMemories?:  Personal Digital Archive Fever and Facebook,” “If memory and identity can be seen as coterminous and SNSs involve literally digitising ones’ self into being then what is at stake when memories and identities are practiced on Facebook?”

She goes on to add,

It goes without saying that the allure of the site is in its drawing together in one place memory practices: creating photograph albums, sharing photographs, messaging, joining groups and alumni memberships, making ‘friends’ and staying in touch with family.

It would be fair to acknowledge that SNSs such as Facebook traffic in more than the allure of memory practices. Nonetheless, the production, maintenance, and retrieval of memory are integral to the practices deployed on SNSs.

Following Jacques Derrida, Garde-Hansen considers Facebook as an instance of the archiving archive. Thus, she points out, the architecture of a SNS such as Facebook is not neutral with respect to the memories it archives.  As Derrida observed,

… the technical structure of the archiving archive also determines the structure of the archivable content even in its very coming into existence and in its relationship to the future.  The archivization produces as much as it records the event.

Garde-Hansen also draws SNSs into the conflict between database and narrative staged by Lev Manovich in The Language of New Media.  In her view, the most significant aspect of Manovich’s analysis of new media for SNSs is the comparison he draws between the visual organization of new media interfaces and spatial montage.  “Manovich’s emphasis upon spatial narrative is,” according to Garde-Hansen, “extremely pertinent to thinking through the emergence of SNSs and how these sites remediate personal and collective memory.” Framed in this way, memory as spatial montage challenges “the rise and dominance of history,” the “power of the written word” to order the past temporally, and the “twentieth century’s emphasis upon the single, unadulterated image (think cinema).”

Derrida’s insight suggests that given the way the architecture of an archive already determines what can in fact be archived, the future record of the past is already impinging upon the present.  Or, put otherwise, the sorts of memories we are able to generate with social media may already be directing our interactions in the present.  (For an insightful discussion of this point anchored on an analysis of faux-vintage photography see Nathan Jurgenson’s, “The Faux-Vintage Photo.”)  Drawing in Manovich’s database/narrative opposition suggests that the visual/spatial mode of ordering memories on SNSs potentially shifts how meaning is derived from memory and how we construct the self.  We’ll explore both of these considerations a little further in subsequent posts.

Returning to the scene suggested by Yates, however, we may also consider SNSs such as Facebook as instances of new artificial memory spaces constructed to supplement and augment the natural memory.  Already in the artificial memory tradition we have memory rendered spatially and visually in a manner that anticipates the representation and organization of memory in SNSs.  Situating SNSs within the long history of spatial and visual memory also affords us the opportunity to consider SNSs in the context of a complex and rich tradition of philosophical reflection.

What emerges is a history of memory practices that alternate between a Platonic focus on memory as the presence of an absence and an Aristotelian emphasis on memory as the record of the past.  There are several thematic threads that weave this story together including the opposition of internal memory to memory supported by inscription, the connection between memory and imagination, memory as the index of desire, the related tensions between space and time and databases and narratives, and the relationship of memory to identity.  Yet for all the complexity those themes introduce, we will begin our next post in this series with a story.

___________________________________________________

Joanne Garde-Hansen’s article is found in Save As… Digital Memories.

Studied Responses: Reactions to bin Laden’s Death

Image: CNN Belief Blog

In the moments, hours, and days following the announcement of Osama bin Laden’s death I was repeatedly struck by the amount of attention paid to the manner in which Americans were responding to his death.  Almost immediately I began to pick up notes of concerned introspection about the response (e.g., the jubilant crowds gathered at the White House and Ground Zero), and what ought to be the appropriate response.

This introspection appears to have been most pronounced within religious circles.  At Christianity Today, Sarah Pulliam Bailey gathered together Tweets from a number of evangelical Christian leaders and bloggers addressing the question, “How should Christians respond to Osama bin Laden’s death?”  A sizable comment thread formed below the post.  At the religion and media web site, Get Religion, in a post titled “Churches respond to Osama’s death,” we get another round of links to church leaders writing about the appropriate response to the killing of bin Laden.

The topic, however, was also prominent in the more mainstream media.  NPR, for example, ran a short piece titled “Is It Wrong to Celebrate Bin Laden’s Death” and another piece focused on bin Laden’s death titled “Is Celebrating Death Appropriate?”  In the former story we get the following odd piece of reflection:

Laura Cunningham, a 22-year-old Manhattan reveler — gripping a Budweiser in her hand and sitting atop the shoulders of a friend — was part of the crowd at ground zero in the wee hours Monday. As people around her chanted “U-S-A,” Cunningham was struck by the emotional response. She told New York Observer: “It’s weird to celebrate someone’s death. It’s not exactly what we’re here to celebrate, but it’s wonderful that people are happy.”

I say “odd,” because it is not clear that this young lady knew what or why she was celebrating.  “But it’s wonderful that people are happy”?  What?

The NY Times also ran a story titled, “Celebrating a Death: Ugly, Maybe, but Only Human.”  And, finally, in case you are interested, Noam Chomsky would also like you to know about his reaction to Osama’s death, although I imagine you can guess.  Additionally, at CNN’s Belief Blog, you can read “Survey:  Most Americans say it’s wrong to celebrate bin Laden’s death,” and Stephen Prothero’s reflections on the aforementioned survey.  You get the idea.

So all of this strikes me as rather interesting.  For one thing, I can’t really imagine this sort of self-awareness permeating the responses of previous generations to historical events of this sort.  Of course, this may be because this event is sui generis, although I doubt that is quite right.  It seems rather another instance of the self-reflexiveness and self-reference that has become a characteristic of our society.  I might push this further by noting that this post just adds another layer, another mirror, as I reflect on the reflections.  My usual explanation for this hypertrophied self-awareness is the collapse of taken-for-granted social structures and customs and the correlated rise of the liberated, spontaneous self.  The spontaneous self as it turns out is not that spontaneous; rather it is performed.  Performance is studied and aware of itself; conscious of its every response.  Naturally then, we are asking at the cultural level whether our “spontaneous” celebrations were appropriate.  Did we play this part right?

This posture seems to me to lack a certain degree of integrity, in the sense that our way of being in the world is not integrated; very little comes naturally, our actions all feel rather artificial.  Perhaps especially at those times when we most wish we could just be fully in the moment, we rather feel a certain anxiety about feeling the right way — are we feeling the way we are supposed to be feeling, etc.  However, the integrated self is also somewhat opaque to itself; it is capable of acting literally without thought, and thus perhaps thoughtlessly.

I’ll resist the temptation to provide a concluding paragraph that wraps things up neatly with a fresh insight.  More of an aspiration than a temptation, I suppose, if the insight just isn’t there.