Taser maker and police contractor Axon has announced a new product called “Draft One,” an AI that can generate police reports from body cam audio.
As Forbes reports, it’s a brazen and worrying use of the tech that could easily entrench institutional ills like racial bias in the hands of police departments. That’s to say nothing of the propensity of AI models to “hallucinate” facts, which could result in chaos and baseless accusations.
“It’s kind of a nightmare,” Electronic Frontier Foundation surveillance technologies investigations director Dave Maass told Forbes. “Police, who aren’t specialists in AI, and aren’t going to be specialists in recognizing the problems with AI, are going to use these systems to generate language that could affect millions of people in their involvement with the criminal justice system.”
“What could go wrong?” he pondered.
Axon claims its new AI, which is based on OpenAI’s GPT-4 large language model, can help cops spend less time writing up reports.
“If an officer spends half their day reporting, and […]
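Axon hasn’t published technical details, but based on the reporting above, a system like this presumably transcribes the body cam audio and then prompts GPT-4 to turn the transcript into a narrative report. The sketch below is purely illustrative and is not Axon’s implementation: the Whisper transcription step, the prompt wording, and the draft_report helper are all assumptions about how such a pipeline could be wired up with OpenAI’s public API.

```python
# Illustrative sketch only -- NOT Axon's Draft One. It assumes a generic
# "transcribe, then prompt GPT-4" pipeline built on OpenAI's public API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def draft_report(audio_path: str) -> str:
    """Hypothetical helper: turn body cam audio into a draft incident report."""
    # Step 1 (assumption): speech-to-text via Whisper.
    with open(audio_path, "rb") as audio_file:
        transcript = client.audio.transcriptions.create(
            model="whisper-1",
            file=audio_file,
        )

    # Step 2 (assumption): ask GPT-4 to draft a report from the transcript.
    # Any real deployment would need human review, since LLMs can hallucinate
    # details that were never in the audio.
    completion = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": "Draft a factual incident report using only "
                           "information present in the transcript.",
            },
            {"role": "user", "content": transcript.text},
        ],
    )
    return completion.choices[0].message.content


if __name__ == "__main__":
    print(draft_report("bodycam_clip.wav"))
```

Even in this simplified form, the sketch makes the critics’ point concrete: nothing in the pipeline itself verifies that the generated report matches what actually happened in the audio.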
The use of AI across a variety of industries is part of a nonvirtuous trend of devaluing real human beings. The captains of capitalism are happy to replace people with machines because it increases profit. These machines will operate with increasing autonomy, and as they proliferate they will generate such a high volume of data and work product that policing them for quality control will be impossible. Only after the disaster is discovered will we have to clean up the mess, after real people have been harmed and the profits have already been pocketed. This society is so profit-driven that its nature is to react to error rather than act proactively to prevent it.
So now those who have been arrested, once they realize the information is wrong or biased, will also need an attorney to overturn the false AI report. Hopefully, after a few egregious episodes involving AI reports in arrest situations, the practice will be abandoned until AI can be made bias-free and incapable of lying, which it has repeatedly been shown not to be.