The AI behind the Citizen app listens to police radios, churns out a summary, and publishes an alert on the spot. Human analysts only step in afterward, to correct mistakes and repair the damage. Which means users see the raw version first … errors included, as reported by 404 Media.
An anxiety-inducing app that piles it on
The application has around 10 million users in the United States. Anyone can post photos and videos of local events, on top of which come the alerts generated by the company. For a long time, those alerts were written by humans, but management has gradually handed the task over to algorithms.
In the United States, these "crime awareness" apps have multiplied in recent years as security has become a major public concern. Citizen is not the only one of its kind: other local and national platforms also offer real-time crime maps. Their success rests on a simple promise – feeling better informed and therefore better protected – but they are regularly criticized for amplifying anxiety and creating a distorted picture of crime.
And inevitably, the failures pile up. A "vehicle collision" becomes a "buried vehicle accident". Misinterpreted addresses create phantom incidents. During police chases, the AI has sometimes published five to ten different alerts for a single event. It has also blurted out graphic details ("face …") and leaked private information.
In some cases, the AI even downplayed the facts: an armed robbery became a simple theft. The app can end up saturated with red dots that make a city look under siege, when it is sometimes just the same incident repeated over and over.
This is not Citizen's first misstep. In 2021, its founder offered $30,000 to track down a suspected arsonist … who turned out to be innocent. Yet despite these blunders, the app keeps winning people over. New York City has even just launched its own official account, "NYC Public Safety", touted by Mayor Eric Adams as a tool to "inform New Yorkers and protect their loved ones".
The promises sound great, but in practice the system remains shaky. The AI sometimes gets things wrong in baffling ways; it hallucinates. And Citizen is not making things easier: the platform has just laid off 13 employees, some of whom were precisely in charge of quality control. Meanwhile, the company is handing more and more tasks to contractors in Nepal and Kenya, paid $1.50 to $2 an hour to listen to the radio frequencies, with all the interpretation errors that implies.