Police visited Maddie Cholka's house after a message in which she described planning to take her own life was flagged by GoGuardian Beacon, monitoring software used by her school. A human reviewer escalated the alert to the school's head of counseling, who called the police.
Concerns include false positives, privacy issues, compromised trust between students and faculty, and the potential to out queer or trans kids through their searches. There are also worries about expanding the software to detect potential violence, which could lead to unnecessary police involvement.
Schools typically route alerts through human review first. If an alert is deemed serious, such as an "active planning" alert, the student is pulled out of class and given a suicide screening. When alerts come in outside school hours, law enforcement may be contacted to conduct welfare checks.
After the police intervention and a hospitalization, Maddie Cholka returned home and to school but continued to struggle with suicidal thoughts. She became more careful about what she typed on her Chromebook to avoid tripping the monitoring software. Tragically, she died by suicide the following year.
The Columbia Suicide Severity Rating Scale screener is a six-question tool designed to assess the severity of a person's suicidal thoughts and help determine whether hospitalization is needed. Schools administer it when monitoring software flags a student for actively planning self-harm.
There is a lack of independent data because the companies providing these tools are private and compete for contracts, making them reluctant to share proprietary information. Additionally, schools and districts have not demanded robust data on sensitivity, positive predictive value, or other metrics to evaluate the software's effectiveness.
In Lawrence Public Schools, students argued that the monitoring software undermined trust between them and faculty because their communications were being intercepted and filtered. They believed that allowing private conversations would foster greater trust.
Between a third and half of American schoolchildren have a form of “mental health monitoring” software on their school devices, which scans for and flags certain keywords.
Intuitively appealing as it is, is the software worth the false positives, privacy issues, and compromised trust?
Guest: Ellen Barry, mental health reporter for the New York Times.
Want more What Next TBD? Subscribe to Slate Plus to access ad-free listening to the whole What Next family and all your favorite Slate podcasts. Subscribe today on Apple Podcasts by clicking "Try Free" at the top of our show page. Sign up now at slate.com/whatnextplus to get access wherever you listen.
Learn more about your ad choices. Visit megaphone.fm/adchoices