The Surveilled Student

Here are but two paragraphs from one installment of Audrey Watters’s year-end review of the top education technology stories.

“Helicopter parenting” – or at least parental anxiety – might not be a new phenomenon, but it is now increasingly able to enlist new technologies to monitor children’s activities. A story this summer in The New York Magazine is, no doubt, an extreme example of this: “Armed with Nest Cams and 24/7 surveillance, one company promises to fix even the most dysfunctional child – for a price.” But many, many technology products boast features that allow parents to check up on what their kids are doing – what they’re reading, what they’re watching, what they’re clicking on, what they’re saying, who they’re talking to, how many steps they’re taking, where they’re at at any given moment, and so on. It’s all under the auspices, of course, of keeping kids safe.

This all dovetails quite neatly, as I noted in the article on education data, with the ways in which schools too are quite willing to surveil students. The New York Times family section cautioned in August about “The Downside of Checking Kids’ Grades Constantly.” But the expectation of many ed-tech products (and increasingly school policy) is that parents will do just this – participate in the constant monitoring of student data.

I pass it along to you for a couple of reasons. First, these paragraphs, and the links they contain, touch on an important dimension of what it means to be a parent in the digital age, a topic I’ve been thinking a lot about over the last few months.

I also pass it along to you in the event that you’ve not yet come across Audrey Watters’s work. She is a terrific guide to all things related to education and technology. If you are at all interested in how technology is brought to bear on the task of educating our children, and really all of us should be at least somewhat interested, then you would do well to keep up with Watters. You should definitely read “Education Technology and the New Behaviorism.”

Facebook Doesn’t Care About Your Children

Facebook is coming for your children.

Is that framing too stark? Maybe it’s not stark enough.

Facebook recently introduced Messenger Kids, a version of their Messenger app designed for six- to twelve-year-olds. Antigone Davis, Facebook’s Public Policy Director and Global Head of Safety, wrote a blog post introducing Messenger Kids and assuring parents the app is safe for kids.

“We created an advisory board of experts,” Davis informs us. “With them, we are considering important questions like: Is there a ‘right age’ to introduce kids to the digital world? Is technology good for kids, or is it having adverse effects on their social skills and health? And perhaps most pressing of all: do we know the long-term effects of screen time?”

The very next line of Davis’s post reads, “Today we’re rolling out our US preview of Messenger Kids.”

Translation: We hired a bunch of people to ask important questions. We have no idea what the answers may be, but we built this app anyway.

Davis doesn’t even attempt to fudge an answer to those questions. She raises them and never comes back to them again. In fact, she explicitly acknowledges “we know there are still a lot of unanswered questions about the impact of specific technologies on children’s development.” But you know, whatever.

Naturally, we’re presented with statistics about the rates at which children under 13 use the Internet, Internet-enabled devices, and social media. It’s a case of presumed inevitability. Kids are going to be online whether you like it or not, so they might as well use our product. More about this in a moment.

We’re also told that parents are anxious about their kids’ safety online. Chiefly, this amounts to concerns about privacy or online predators. Valid concerns, of course, and Facebook promises to give parents control over their kids’ online activity. However, safety, in this sense, is not the only concern we should have. A perfectly safe technology may nonetheless have detrimental consequences for our intellectual, moral, and emotional well-being and for the well-being of society when the technology’s effects are widely dispersed.

Finally, we’re given five principles Facebook and its advisory board developed in order to guide the development of their suite of products for children. These are largely meaningless sentences composed of platitudes and buzzwords.

Let’s not forget that this is the same company that “offered advertisers the opportunity to target 6.4 million younger users, some only 14 years old, during moments of psychological vulnerability, such as when they felt ‘worthless,’ ‘insecure,’ ‘stressed,’ ‘defeated,’ ‘anxious,’ and like a ‘failure.'”

Facebook doesn’t care about your children. Facebook cares about your children’s data. As Wired reported, “The company will collect the content of children’s messages, photos they send, what features they use on the app, and information about the device they use.”

There are no ads on Messenger Kids, the company is quick to point out. “For now,” I’m tempted to add. Barriers of this sort tend to erode over time. Moreover, even if the barrier holds, an end game remains.

“If they are weaned on Google and Facebook,” Jeffrey Chester, executive director of the Center for Digital Democracy, warns, “you have socialized them to use your service when they become an adult. On the one hand it’s diabolical and on the other hand it’s how corporations work.”

Facebook’s interest in producing an app for children appears to be a part of a larger trend. “Tech companies have made a much more aggressive push into targeting younger users,” the same Wired article noted, “a strategy that began in earnest in 2015 when Google launched YouTube Kids, which includes advertising.”

In truth, I think this is about more than just Facebook. It’s about thinking more carefully about how technology shapes our children and their experience. It is about refusing the rhetoric of inevitability and assuming responsibility.

Look, what if there is no safe way for seven-year-olds to use social media or even the Internet and Internet-enabled devices? I realize this may sound like head-in-the-sand overreaction, and maybe it is, but perhaps it’s worth contemplating the question.

I also realize I’m treading on sensitive ground here, and I want to proceed with care. The last thing over-worked, under-supported parents need is something more to feel guilty about. Let’s forget the guilt. We’re all trying to do our best. Let’s just think together about this stuff.

As adults, we’ve barely got a handle on the digital world. We know devices and apps and platforms are designed to capture and hold attention in a manner that is intellectually and emotionally unhealthy. We know that these design choices are not made with the user’s best interest in mind. We are only now beginning to recognize the personal and social costs of our uncritical embrace of constant connectivity and social media. How eager should we be to usher our children into this reality?

The reality is upon them whether we like it or not, someone might counter. Maybe, but I don’t quite buy it. Even if it is, the degree to which this is the case will certainly vary, in large part, with the choices parents make and their resolve.

Part of our problem is that we think too narrowly about technology, almost always in terms of functionality and safety. With regard to children, this amounts to safeguarding against offensive content, against exploitation, and against would-be predators. Again, these are valid concerns, but they do not exhaust the range of questions we should be asking about how children relate to digital media and devices.

To be clear, this is not only about preventing “bad things” from happening. It is also a question of the good we want to pursue.

Our disordered relationship with technology is often a product of treating technology as an end rather than a means. Our default setting is to uncritically adopt and ask questions later if at all. We need, instead, to clearly discern the ends we want to pursue and evaluate technology accordingly, especially when it comes to our children because in this, as in so much else, they depend on us.

Some time ago, I put together a list of 41 questions to guide our thinking about the ethical dimensions of technology. These questions are a useful way of examining not only the technology we use but also the technology to which we introduce our children.

What ideals inform the choices we make when we raise children? What sort of person do we hope they will become? What habits do we desire for them to cultivate? How do we want them to experience time and place? How do we hope they will perceive themselves? These are just a few of the questions we should be asking.

Your answers to these questions may not be mine or your neighbor’s, of course. The point is not that we should share these ideals, but that we recognize that the realization of these ideals, whatever they may be for you and for me, will depend, in greater measure than most of us realize, on the tools we put in our children’s hands. All that I’m advocating is that we think hard about this and proceed with great care and great courage. Great care because the stakes are high; great courage because merely by our determination to think critically about these matters we will be setting ourselves against powerful and pervasive forces.


The Paradoxes of Digitally Mediated Parenting

The novelist Karen Russell recently reflected on her experience as a new parent with a baby monitor, one that streams footage directly to a smartphone app and sends notifications whenever it registers certain kinds of movement in the room.

“I’ve become addicted to live-streaming plotless footage of our baby,” Russell admits, but her brief, poetic essay ends with a reflection on the limitations of such pervasive surveillance and documentation:

“Children vanish without dying,” Joy Williams wrote. Every time the app refreshes and shows an empty crib, I feel a stab of surprise. Children do endure in space and time, but they’re always changing, and no camera is sensitive enough to record the uncanny speed at which this transformation happens. Already the baby has doubled in size. “A slow-motion instant,” a friend and veteran parent told me, describing how the years would now pass. A camera is a tool that spools up time, but of course it cannot stop it.

This paragraph reveals a paradox at the heart of our obsessive documentation. When our documentation is motivated, as it so often is, by a rebellion against the unremitting passage of time, it will only accelerate the rate at which we experience time’s passing. If a camera spools up time without stopping it, then those same spools of time, the moments we have captured and hoarded, heighten our awareness of time’s ephemeral and fleeting nature.

It is striking, upon reflection, that we use the word capture to describe what we think we are doing when we visually document a moment. We seek to capture these moments, these experiences, or, even more straightforwardly, these memories, as if we wanted to bypass the experience altogether and pass directly to its recollection. That capturing is what we think we are doing discloses our apprehension of time as something wild and unruly. We seek to master time, but it refuses to be domesticated.

Upon further reflection, it is also striking that it is a moment and not, say, a scene that we think we are capturing. That time and not space is the default category by which we understand what an image records suggests the true nature of the desires driving so much of our documentation. This is why we can never satisfactorily recreate a photograph. It was never the external and physical facts that we sought to document in the first place. It was more like a river of commingled sensation and emotion, one into which we can most certainly never step twice.

Concluding her essay, Russell writes, “When I’m unable to sleep, I can watch our baby. I am watching right now. I can see the bottoms of his feet and count his ten toes, virtually ruffle the pale cap of his hair. He is breathing, I am almost certain. Go in and check, says the Dark Voice of Unreason. Go in and touch him.”

And with these words, a second paradox emerges: we monitor in order to relieve anxiety, but our anxiety is heightened by our monitoring. Put another way, we will grow anxious about whatever we are able to monitor. We will monitor all the more insistently to relieve this anxiety, and our anxiety will intensify in turn.

There is nothing new about the anxieties parents feel when it comes to the security of their children, of course. And I suspect most parents have always felt a bittersweet joy in watching their children grow: saddened by the loss of all their child has been but gladdened by who they are becoming. But I do wonder whether or not these experiences are heightened and amplified by the very tools we deploy to overcome them.

Since Bacon’s day at least, we have turned to technology for “the relief of man’s estate.” At what point, however, does the reasonable desire to alleviate human suffering morph into a misguided quest to escape the human condition altogether? Finding whatever joy and contentment we may reasonably aspire to in this life seems to depend on our answer to this question (at least, of course, in affluent and prosperous societies).

It is, to be sure, a line that is difficult to perceive and wisely navigate. But when our techniques yield only a heightened experience of the very disorders we seek to ameliorate, we may justly wonder whether we have crossed it.


This post is part of a series on being a parent in the digital age.

Growing Up with AI

In an excerpt from her forthcoming book, Who Can You Trust? How Technology Brought Us Together and Why It Might Drive Us Apart, Rachel Botsman reflects on her three-year-old’s interactions with Amazon’s AI assistant, Alexa.

Botsman found that her daughter took quite readily to Alexa and was soon asking her all manner of questions and even asking Alexa to make choices for her, about what to wear, for instance, or what she should do that day. “Grace’s easy embrace of Alexa was slightly amusing but also alarming,” Botsman admits. “Today,” she adds, “we’re no longer trusting machines just to do something, but to decide what to do and when to do it.” She then goes on to observe that the next generation will grow up surrounded by AI agents, so that the question will not be “Should we trust robots?” but rather “Do we trust them too much?”

Along with issues of privacy and data gathering, Botsman was especially concerned with the intersection of AI technology and commercial interests: “Alexa, after all, is not ‘Alexa.’ She’s a corporate algorithm in a black box.”

To these concerns, philosopher Mark White, elaborating on Botsman’s reflections, adds the following:

Again, this would not be as much of a problem if the choices we cede to algorithms only dealt with songs and TV shows. But as Botsman’s story shows, the next generation may develop a degree of faith in the “wisdom” of technology that leads them to give up even more autonomy to machines, resulting in a decline in individual identity and authenticity as more and more decisions are left to other parties to make in interests that are not the person’s own—but may be very much in the interests of those programming and controlling the algorithms.

These concerns are worth taking into consideration. I’m ambivalent about framing a critique of technology in terms of authenticity, or even individual identity, but I’m not opposed to a conversation along these lines. Such a conversation at least encourages us to think a bit more deeply about the role of technology in shaping the sorts of people we are always in the process of becoming. This is, of course, especially true of children.

Our identity, however, does not emerge in pristine isolation from other human beings or independently from the fabric of our material culture, technologies included. That is not the ideal to which we should aim. Technology will unavoidably be part of our children’s lives and ours. But which technologies? Under what circumstances? For what purposes? With what consequences? These are some of the questions we should be asking.

Of an AI assistant that becomes part of a child’s taken-for-granted environment, other more specific questions also come to mind.

What conversations or interactions will the AI assistant displace?

How will it affect the development of a child’s imagination?

How will it direct a child’s attention?

How will a child’s language acquisition be affected?

What expectations will it create regarding the solicitude they can expect from the world?

How will their curiosity be shaped by what the AI assistant can and cannot answer?

Will AI assistants undermine the development of critical cognitive skills through their ability to respond immediately to simple questions?

Will their communication and imaginative life shrink to the narrow parameters within which they can interact with AI?

Will parents be tempted to offload their care and attentiveness to the AI assistant, and with what consequences?

Of AI assistants generally, we might conclude that what they do well–answer simple direct questions, for example–may, in fact, prove harmful to a child’s development, and what they do poorly–provide for rich, complex engagement with the world–is what children need most.

We tend to bend ourselves to fit the shape of our tools. Even as tech-savvy adults we do this. It seems just as likely that children will do likewise. For this reason, we do well to think long and hard about the devices that we bring to bear upon their lives.

We make all sorts of judgments as a society about when it is appropriate for children to experience certain realities, and this care for children is one of the marks of a healthy society. We do this through laws, policy, and cultural norms. With regard to the norms that govern the technology we introduce into our children’s lifeworld, we would do well, it seems to me, to adopt a more cautious stance. Sometimes this means shielding children from certain technologies when it is not altogether obvious that their impact will be beneficial. We should, in other words, shift the burden of proof so that a technology must earn its place in our children’s lives.

Botsman finally concluded that her child was not ready for Alexa to be a part of her life and that it was possibly usurping her own role as parent:

Our kids are going to need to know where and when it is appropriate to put their trust in computer code alone. I watched Grace hand over her trust to Alexa quickly. There are few checks and balances to deter children from doing just that, not to mention very few tools to help them make informed decisions about A.I. advice. And isn’t helping Gracie learn how to make decisions about what to wear — and many more even important things in life — my job? I decided to retire Alexa to the closet.

It is even better when companies recognize some of these problems and decide (from mixed motives, I’m sure) to pull a device whose place in a child’s life is at best ambiguous.


This post is part of a series on being a parent in the digital age.

What Do I See When I See My Child?

An entry in a series on the experience of being a parent in the digital age. 

At first glance, this may seem like a question with an obvious and straightforward answer, but it isn’t. Vision plays a trick on us all. It offers its findings to us as a plain representation of “what is there.” But things are not so simple. Most of us know this because at some point our eyes have deceived us. The thing we thought we saw was not at all what was, in fact, there. Even this cliche about our eyes deceiving us reveals something about the implicit trust we ordinarily place in what our eyes show to us. When it turns out that our trust has been betrayed we do not simply say that we were mistaken–we speak as if we have been wronged, as if our eyes have behaved immorally. We are not in the habit, I don’t think, of claiming that our ears deceived us or our nose.

What we ordinarily fail to take into account is that seeing is an act of perception and perception is a form of interpretation.

Seeing is selective. Upon glancing at a scene, I’m tempted to think that I’ve taken it all in. But, of course, nothing could be further from the truth. If I were to look again and look for a very long time, I would continue to see more and more details that I did not see at first, second, or third glance. Whatever it was that I perceived when I first looked is not what I will necessarily see if I continue to look; at the very least, it will not be all that I will see. So why did I see what I saw when first I looked?

Sometimes we see what we think we ought to see, what we expect to see. Sometimes we see what we want to see or that for which we are looking. Seeing is thus an act of both remembering and desiring. And this is not yet to say anything of the meaning of what we see, which is also intertwined with perception.

It is also the case that perception is often subject to mediation and this mediation is ordinarily technological in nature. Indeed, one of the most important consequences of any given technology is, in my view, how it shapes our perception of the world. But we are as tempted to assume that technology is neutral in its mediations and representations as we are to believe that vision simply shows us “what is there.” So when our vision is technologically mediated it is as if we were subject to a double spell.

The philosopher Peter-Paul Verbeek, building on the work of Don Ihde, has written at length about what he has called the ethics of technological mediation. Technologies bring about “specific relations between human beings and reality.” They do this by virtue of their role in mediating both our perception of the world and our action in the world.

According to Ihde, the mediating work of technology comes in the form of two relations of mediation: embodiment relations and hermeneutic relations. In the first, tools are incorporated by the user and the world is experienced through the tool. The blind man’s stick is a classic example of an embodiment relation: the stick is incorporated into the man’s body schema.

Verbeek explains hermeneutic relations in this way: “technologies provide access to reality not because they are ‘incorporated,’ but because they provide a representation of reality, which requires interpretation.” Moreover, “technologies, when mediating our sensory relationship with reality, transform what we perceive. According to Ihde, the transformation of perception always has the structure of amplification and reduction.”

We might also speak of how technological mediation focuses our perception. Perhaps this is implied in Ihde’s two categories, amplification and reduction, or the two together amount to a technology’s focusing effect. We might also speak of this focusing effect as a directing of our attention.

So, once again, what do I see when I see my child?

There are many technologies that mediate how I perceive my child. When my child is in another room, I perceive her through a video monitor. When my child is ill, I perceive her through a digital thermometer, some models of which now continuously monitor body temperature and visualize the data on an app. Before she was born, I perceived her through ultrasound technology. When I am away from home, I perceive her through FaceTime. More examples, I’m sure, may come readily to your mind. Each of these merits some attention, but I set them aside to briefly consider what may be the most ubiquitous form of technological mediation through which I perceive my child–the digital camera.

Interestingly, it strikes me that the digital camera, in particular the camera with which our phones are equipped, effects both an embodiment relation and a hermeneutic relation. I fear that I may be stretching the former category to make this claim, but I am thinking of the smartphone as a device which, in many respects, functions as a prosthesis. I mean by this that it is ready-to-hand to such a degree that it is experienced as an appendage of the body and that, even when it is not in hand, the ubiquitous capacity to document has worked its way into our psyche as a frame of mind through which we experience the world. It is not only the case that we see a child represented in a digital image; our ordinary act of seeing itself becomes a seeing-in-search-of-an-image.

What does the mediation of the digital smartphone camera amplify? What does it reduce? How does it bring my child into focus? What does it encourage me to notice and what does it encourage me to ignore? What can it not account for?

What does it condition me to look for when I look at my child and, thus, how does it condition my perception of my child?

Is it my child that I see or a moment to be documented? Am I perceiving my child in herself or am I perceiving my child as a component of an image, a piece of the visual furniture?

What becomes of the integrity of the moment when seeing is mediated through an always-present digital camera?

How does the representation of my child in images that capture discrete moments impact my experience of time with my child? Do these images sustain or discourage the formation of a narrative within which the meaning of my relationship with my child emerges?

It is worth noting, as well, that the smartphone camera ordinarily exists as one component within a network of tools that includes the internet and social media platforms. In other words, the image is not merely a record of a moment or an externalized memory. It is also always potentially an act of communication. An audience–on Facebook, Twitter, Instagram, YouTube, Snapchat, etc.–is everywhere with me as an ambient potentiality that conditions my perception of all that enters into my experience. Consequently, I may perceive my child not only as a potential image but as a potential image for an audience.

What is the nature of this audience? What images do I believe they care to see? What images do I want them to see? From where does my idea of the images they care to see arise? Does it arise from the images I see displayed for me as part of another’s audience? Or from professional media or commercial marketing campaigns? Are these the visual patterns I remember, half-consciously perhaps, when my perceiving takes on the aspect of seeing-as-expectation? Do they form my perception-as-desire? For whom is my child under these circumstances?

I have raised many questions and left them unanswered, chiefly because whatever my answers may be, they are not likely to be your answers. The value of these questions lies in the asking rather than in any particular answers I might give; whatever answers we arrive at, the questions are worth asking for what they may reveal as we contemplate them.

