Between a third and a half of American schoolchildren have some form of mental health monitoring software on their devices.
The police visited Maddie Cholka's house because her message about planning to take her life was flagged by GoGuardian Beacon, monitoring software used by her school. The alert was reviewed by a human and escalated to the school's head of counseling, who called the police.
Concerns include false positives, privacy issues, compromised trust between students and faculty, and the potential for outing queer or trans kids through their searches. Additionally, there are worries about the software being used to detect violence, which could lead to unnecessary police involvement.
Schools typically handle alerts by first reviewing them through human oversight. If the alert is deemed serious, such as an 'active planning alert,' the student is pulled out of class and administered a suicide screening test. In cases where alerts occur outside school hours, law enforcement may be contacted to conduct well-child visits.
After the police intervention and hospitalization, Maddie Cholka returned home and school but continued to struggle with suicidal thoughts. She became more cautious about what she typed on her Chromebook to avoid detection by the monitoring software. Tragically, she died by suicide the following year.
The Columbia Suicide Screening Test is a six-question tool designed to assess the severity of someone's suicidal thoughts and determine if they need hospitalization. Schools use it when a student is flagged by monitoring software for active planning of self-harm.
There is a lack of independent data because the companies providing these tools are private and compete for contracts, making them reluctant to share proprietary information. Additionally, schools and districts have not demanded robust data on sensitivity, positive predictive value, or other metrics to evaluate the software's effectiveness.
In Lawrence Public Schools, students argued that the monitoring software undermined trust between them and faculty because their communications were being intercepted and filtered. They believed that allowing private conversations would foster greater trust.
Hey, everyone. It's Lizzie here. Just a heads up that this episode discusses self-harm and suicide. So please take care when listening. By 2020, 16-year-old Maddie Cholka had been struggling for a while with her mental health.
She and her family, who lived in Neosho, Missouri, had had difficult conversations about her suicidal ideation and how to keep her safe. And one night, Angel, the mom, was woken up by police flashlights through her window, she told me. And she went to the door and there were two police officers. That's Ellen Barry, who writes about mental health for The New York Times.
And the officer said, does someone named Maddie Cholka live here? And Angel was terrified because her first thought was that Maddie had run away. But she had no idea where this was coming from. And so she went to Maddie's room and the police came with her. And it turned out that what had happened was that Maddie was typing on her Chromebook, the Chromebook that she got from the Neosho Public Schools.
And she had sent a message to one of her friends, a text message, saying that she was planning to take her life, that she was planning to take an overdose of... It was an anxiety medication. And this text had gone to...
A software company, an ed tech company called GoGuardian Beacon, which filters what students in that district are typing on their devices and looks for indications of self-harm. Maddie's message was flagged by the software. Then it went to a human review, then to her school's head of counseling, who read it and called the police.
The police went to Maddie's house and they got there a few minutes after she had taken a handful of pills. It probably wouldn't have been a lethal dose, her mom said, but in any case, she was very drowsy. And the mom and the police, I think there was an ambulance there already, took her to the hospital.
In recent years, more and more schools have used software like GoGuardian Beacon to monitor students' computers for signs that they might hurt themselves. Sometimes it works, but other times it's more complicated. Ellen wrote about another teenage girl. This one lived in Fairfield, Connecticut. Her school also used monitoring software, which flagged her for imminent risk of self-harm. And the police...
showed up at the house in the middle of the night. I think it was one o'clock or something. And the way they described it, this did not ring true to the parents. They didn't think that this was the case, but the police said that they just had to see the girl. So they woke up their daughter and they brought her downstairs. And she had a brief conversation with the police in which it became clear quite quickly that this was a false positive.
The software had been triggered by an old creative writing assignment. A guidance counselor had seen some of the language and was worried. The police eventually left, but not without leaving damage behind. What the parents said is that this was a deeply upsetting experience for their daughter, that it was traumatic on many different levels. Of course.
but also that they didn't really feel that they could complain to the school for a couple reasons. One is, you know, their daughter was incredibly sensitive about this, the sort of supposition that somehow she was self-harming and that the school had sent police to her house. And they thought that if they talked about it or complained about it in a public forum, that it would just draw attention to it.
But the mom also said, like, you know, that she was hesitant to complain because the guidance counselor said, well, maybe, you know, this is a way of saving lives. And there are false positives. But, you know, if you can save a life, you can probably tolerate the false positives. Today on the show, the spyware that school districts across the country are using to screen students for self-harm.
Is it a revolutionary tool to keep kids safe or a high-tech band-aid with questionable efficacy? I'm Lizzie O'Leary, and you're listening to What Next TBD, a show about technology, power, and how the future will be determined. Stick around.
It's perhaps not surprising that the use of school-issued electronics skyrocketed during the pandemic. And with that, schools and districts added various kinds of monitoring software. Initially, the idea was to keep kids from inappropriate content. You had schools that were essentially giving
all their students the internet. And there was a sense of responsibility. And there's also a law that requires the schools to then create some guardrails. Do we have a sense of how many kids across the country or how many schools are using some type of mental health monitoring software on these devices? It's really hard to find reliable data on this because you have
half a dozen companies that are marketing these kinds of tools to schools. I was able to count enough to have the sense that it is somewhere between a third and a half of U.S. schoolchildren who have some kind of mental health monitoring on their devices.
Most kids and most parents who receive these devices, you know, they have some kind of agreement that spells out, yep, this is how it's going to be used. This is what's in there. When you talked to parents and children, how aware were they of what the programs could see? Well, I spoke to maybe a dozen, between a dozen and 20 parents who had been contacted by the school about something that their child
had typed. And I think across the board, they were not expecting to get that call. They weren't aware that this was something that could happen. And I think the reason is because
The schools do inform families about the existence of this software, but it's part of a tech agreement that parents sign at the beginning of the year. And if they're like me, they sort of skim past it straight to the checkbox. So the answer is I don't think parents are really aware that their children's typing on their devices is necessarily being
scanned. Do these kids and parents have a choice or is it, you know, you take the Chromebook home and this is what's on it, the end? So I can't speak for all the school districts in the country, but I think in many cases, the only choice available would be to say no to the Chromebook, which is not feasible for a lot of families. And I think it is the case that
poorer families are more likely to rely on the school devices because the kids may not have their own devices. And that means that the kids may do all of their searches or all of their text messages or sort of whatever they type, they're more likely to be doing it on the school devices, which means that it is more likely to be under school surveillance. Each program runs a little differently, but they're all set up to search for certain triggers that might indicate a kid in distress.
I think most of them use keywords, key phrases. So there's something kind of elementary about this. Obviously, that's going to just pick up a massive amount of data. And then it is sorted by an algorithm, by a machine-learning-powered algorithm, which attempts to minimize the number of false positives. And then some of the companies use human reviewers, which presumably is an added layer, you know, to improve accuracy.
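(A rough sketch, in Python, of the kind of keyword-flag, score, and triage pipeline described here. Every phrase, weight, threshold, and tier name is a hypothetical stand-in; GoGuardian Beacon and its competitors do not publish their actual rules or models.)

```python
# Illustrative sketch of a keyword-flag -> score -> triage pipeline.
# All phrases, weights, thresholds, and tier names are hypothetical.

KEY_PHRASES = {
    "kill myself": 0.9,
    "end my life": 0.9,
    "how to overdose": 0.8,
    "self harm": 0.6,
    "suicide": 0.5,
    "hopeless": 0.2,  # weak signal on its own
}

ACTIVE_PLANNING = 0.8  # highest urgency: reviewed and escalated immediately
GENERAL_CONCERN = 0.4  # routed to a counselor's queue for follow-up


def score(text: str) -> float:
    """Crude first pass: sum the weights of matched phrases, capped at 1.0."""
    lowered = text.lower()
    total = sum(w for phrase, w in KEY_PHRASES.items() if phrase in lowered)
    return min(total, 1.0)


def triage(text: str) -> str:
    """Map a raw score to an escalation tier. In the systems described in
    the episode, a human reviewer sits between this step and any call to
    a school counselor or the police."""
    s = score(text)
    if s >= ACTIVE_PLANNING:
        return "active_planning_alert"
    if s >= GENERAL_CONCERN:
        return "counselor_review"
    return "no_action"


print(triage("i am planning to end my life tonight"))    # active_planning_alert
print(triage("health class research paper on suicide"))  # counselor_review -- a false positive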
You had an anecdote in your story about Talmadge Clubs, the Neosho School District's Director of Counseling, and this moment where a student typed how to die into a Chromebook in a classroom. I wonder if you could tell me about what happened next. So Talmadge Clubs is a real believer in this system. And I spent some time with him in his community of Neosho, Missouri, where
This is a town that had had a cluster, a sort of devastating cluster of student suicides. And, you know, following that, it had introduced a bunch of changes in the school system. And one of them was using the GoGuardian Beacon system.
Anyway, that morning, it was, I guess, the beginning of October, and Talmadge Clubs had come in and he had a whole cluster of alerts, which he concluded were from a health class that was doing research on suicide. It turned out they were all false positives. But then he got something that was much more concerning, and that is called an active planning alert.
So that is, you know, the highest level of urgency. Presumably it's gone through human review. And what he said is, you know, when we get one of these active planning alerts, you know, we drop everything. They will go to the classroom, pull the
child out of the classroom. And I think what they do in that school is they administer the Columbia Suicide Screening Test, which is a six-question screening tool where they ask very specific kind of questions, you know, designed to detect how serious someone's suicidal thoughts may be and whether they need, for example, hospitalization in the most extreme cases. Anyway, so in this case, they had an active planning alert and
And they pulled that child out of their class and spoke to them. What would have to happen for it to reach the level of the cops come to my house in the middle of the night? The reason that cops may make home visits is because when schools start to use this type of software, obviously during the day you can talk to the kid, call the parent. But if there is an alert in the middle of the night, school counselors...
basically aren't on duty in the middle of the night. And also they don't have access to the child. So a lot of communities forward alerts, if they're considered to be very serious, to law enforcement to conduct well-child visits. I think that's just because police do well-child visits. That's part of their portfolio and they're at work in the middle of the night, which is why people turn to police.
When we come back, is there any way to tell if this software actually works?
If close to one half of American school kids are subject to some kind of monitoring software, you'd want to know how well it performs, right? Except there's a dearth of independent data that can tell us either way. I basically just asked parents and teachers to tell me, you know, what they had observed. And I did hear a number of stories and I retell some of those stories. But these are anecdotes. They're not data.
And in the absence of data, I think there's like a legitimate question, you know, when we don't know, does that mean we should just go ahead and implement it? Or does that mean we should pause before implementing it? And I think the problem of self-harm is such an urgent, it feels like such an urgent one. Schools really don't know what to do about it. And so I think in some ways, a tech solution feels like a clean solution, you know,
And, you know, during the years after the pandemic, the schools were under a lot of pressure to add sort of mental health protections and services. Well, that was where I was going with this: are these technological protections layered in with, say, more counselors, more conversations about living through a pandemic or loss or grief or what it feels like to be a teen online? Is that stuff there? Or are we talking about a district where they said, well, we got to do something, and so we're going to install GoGuardian Beacon?
I can't answer that question because we're talking about, you know, hundreds and hundreds of districts, all of which approach this differently. But ideally, you know, a tool like this only works if you have, for example, on-site therapists or you have enough bandwidth that your counselors really build relationships. I mean, I think one hopeful
idea is that, let's say, the school starts picking up on weaker signals, that is, signals of distress that are not imminent harm or imminent emergencies.
One way that I can see this being positive is that it brings kids into conversation with counselors, you know, before there's a point of 911 calls or ambulances or anything like that. Just sort of bring down the stakes and get them into conversation with counselors,
an adult who is just asking them how they are. And I had a conversation with one 17-year-old girl who said, you know, she had been called in to the counselor. She walked in and the counselor held up a piece of paper, and it was a printout of a note that she had written. It was an email she had written to a friend of hers, I think a few days earlier, and the email said that she was thinking of harming herself.
And she said, you know, that counselor has become a mom figure. Like ever since that moment, like I've been talking to that counselor and it really made things easier for me. So that's an example where, you know, maybe it wasn't an imminent emergency, but it did have a good outcome. I know this wasn't the focus of your reporting, but there's also kind of an interesting privacy question here, which goes back to the young woman in Connecticut, right?
The idea that, you know, this is also intrusive and can feel embarrassing, or even in the case of the Lawrence Public Schools in Kansas, student journalists there fought with their school district to have the software removed from their devices, saying it was violating privacy and free speech rights. Did privacy concerns come up at all when you were doing reporting for this piece?
Yeah, privacy concerns are really the elephant in the room. And there are a lot of communities that have decided against using these tools for that reason. And, you know, a couple of things that have come up regularly, there is a lot of concern that if you are conveying to parents what
a child may be searching on their device or websites they may be visiting, I think there's a specific problem for queer or trans kids who may be somehow outed through this kind of technology. So that's a serious concern that's come up, and a number of data privacy or tech privacy advocacy groups have issued real warnings about this. The other is that
Let's say you're searching for a benign reason. You're concerned about mental health. But what if you pick up language about a gun or a knife? You know, if this becomes a kind of a tool for detecting violence. Which it has been used as. Some schools say they're using this, you know, to prevent school shootings. You are essentially putting your students into contact
with police in a way that may be unnecessary and may, in fact, have very bad consequences for them. So that is another really very serious concern about what these do. So these are both different kinds of privacy concerns, but for sure, the students in Lawrence, Kansas,
They made a number of different arguments. They said, first of all, this is making it difficult for us to do our work because so much of, like in an art class, you know, they were being flagged for nudity or, you know. But also, like, just trying to do normal research for school, you know, this had proven to be an obstacle. They also said something interesting, which is that they thought, as far as mental health, that this...
served to undermine trust between students and faculty because their communications were being intercepted or they were being filtered or they were being sent to administrators. And that if you really want students to trust faculty, you wouldn't do it this way. You would allow them to have private conversations. There's also this weird layer of separating online life from real life.
Like there might be things that you're curious about or you want an answer to, or even if you're using, you know, internet shorthand of KMS, kill myself, as a joke, that might not be reflective of where you are in reality. And I wonder if administrators have struggled with that. Teenagers are...
ironic. Yeah, and hyperbolic. They're hyperbolic and ironic, and they speak in all kinds of sophisticated ways. Also, you know, kids know about algorithms. They know how to get around algorithms. So there's like a million reasons why training an algorithm
to distinguish between signals and noise is very difficult. It's very challenging. I mean, that said, these companies have been doing it now at high volume for a few years, and I assume that they're getting better at it. If this were entirely false positives, I think the schools would already be turning away. And that may be happening. People may be sort of
dialing down their use of this. Again, it's really hard to know because there's no single source of transparent data on this. But I spoke to a lot of school administrators and school counselors who had really positive things to say about it.
Do you think there will ever be the kind of data that you're talking about? Because on the one hand, it would be great to be able to say, here is a robust data set. You know, the efficacy of our program is confirmed by the work of these independent researchers.
On the other hand, these are private companies competing against one another for contracts. And so I wonder if there is some, you know, trade secret aspect of this where they're like, we don't want to open up our algorithm to you. Well, I really think that school districts should demand this because maybe it does work, but we haven't seen any evidence yet. And if you are reaching into family life or you're reaching into the child's home in some cases,
I think it's fair to ask for data about sensitivity, positive predictive value, all the things that make an algorithm effective or ineffective. So GoGuardian Beacon, when I asked them about this, they said that there is a peer-reviewed paper on its way. It will be published, I think, in the next few months. Yeah.
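(A small worked example of the two metrics mentioned here. The counts are invented purely to show the arithmetic; no vendor has published real numbers of this kind.)

```python
# Hypothetical counts for one school year of alerts -- invented numbers,
# used only to show what "sensitivity" and "positive predictive value" mean.
true_positives = 40    # alerts that flagged a student who really was at risk
false_positives = 360  # alerts like the old creative-writing assignment
false_negatives = 10   # at-risk students the software never flagged

# Sensitivity: of the students actually at risk, what share did the tool catch?
sensitivity = true_positives / (true_positives + false_negatives)  # 40 / 50 = 0.80

# Positive predictive value: of the alerts that fired, what share were real?
ppv = true_positives / (true_positives + false_positives)          # 40 / 400 = 0.10

print(f"sensitivity = {sensitivity:.2f}, PPV = {ppv:.2f}")
# A tool can look good on sensitivity while most of its alerts are still
# false positives -- and each false positive can mean a police visit at 1 a.m.
```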
But the fact remains that there's nothing out there that can tell you reliably, you know, this is the number of false positives versus good signals and lives saved. We started this story by talking about Maddie Cholka. How did her story end? Yeah, so this is a really very memorable conversation with Maddie's mom, in part because it was so clear from...
talking to her just what it costs parents to take care of and try to keep safe kids who are really struggling with impulses to hurt themselves. And you could just see on her face how much it had cost her. So what happened with Maddie is that,
after the anecdote that we described, after the police visit to her house, after she was hospitalized, she returned home and she went back to school. But she still had those thoughts. Like those impulses didn't go away. And hospitalization, you know, doesn't make those feelings go away. And she became, I think, more
conscious about what she was typing on her Chromebook so that she wasn't picked up by the filter anymore. And she did ultimately die by suicide the following year. Ellen Barry, thank you for your incredible reporting and for coming on to talk with me. Thank you.
And that is it for our show today. What Next TBD is produced by Evan Campbell, Patrick Fort, and Shaina Roth. Our show is edited by Paige Osborne.
Alicia Montgomery is vice president of audio for Slate, and TBD is part of the larger What Next family. And if you want to support our journalism, the number one best way to do it is to join Slate Plus. You get all your podcast episodes without ads, and there is no paywall on any Slate.com content. Just go to Slate.com slash What Next Plus to sign up. All right, we will be back next week. I'm Lizzie O'Leary. Thanks so much for listening.