Worst experience
Schools are using dubious AI-based software to accuse teens of planning to harm themselves, then sending police to their homes – often with chaotic and traumatic consequences.
As the New York Times reports, software installed on high school students’ school-issued devices tracks every word they type. An algorithm then analyzes the language for clues that teens want to harm themselves.
Unsurprisingly, the software can get it wrong, woefully misinterpreting what students are actually trying to say. For example, a 17-year-old girl from Neosho, Missouri, was woken up by police in the middle of the night.
It turned out that a poem she had written years earlier had set off the alarms of a software program called GoGuardian Beacon, which its creator describes as a way to “protect students from physical harm.”
“It was one of the worst experiences of her life,” the teen’s mother told the NYT.
Welfare check
Internet safety software sold by educational technology companies took off during the COVID-19 shutdowns, leading to widespread surveillance of students in their own homes.
Many of these systems are designed to flag keywords or phrases that may indicate a teen is planning to hurt themselves.
But as the NYT reports, we have no idea whether they are effective or accurate at all, as the companies have not released any data.
Alongside the false alarms, some schools have reported that the systems allowed them to intervene before a student faced imminent risk, at least some of the time.
Still, the software remains highly invasive and can represent a huge invasion of privacy. Civil rights groups have criticized the technology, arguing that law enforcement should not be involved in most cases, according to the NYT.
In short, is this really the best weapon against teen suicide, the second leading cause of death in the US among people aged five to 24?
“There are a lot of false alerts,” Ryan West, chief of the police department that covers the 17-year-old’s school, told the NYT. “But if we can save one child, that’s worth a lot of false alerts.”
Others, however, disagree with that assessment.
“Given the complete lack of information about the outcomes, it’s not really possible for me to evaluate the use of the system,” Baltimore City Councilman Ryan Dorsey, who has criticized these systems in the past, told the newspaper. “I think sending police — especially knowing what I know and believe about school police in general — into children’s homes is terribly misguided.”