AI and Police Reports Slammed
December 18, 2024 posted by Steve Brownstein
ACLU Slams AI Police Reports, and Axon in Particular
Axon, best known for its Tasers, is trying to sell its relatively new Draft One software to police. A new ACLU report advises police to avoid AI for crafting reports — and an Axon competitor weighs in.
The practice of using AI to write police reports has come under attack from the American Civil Liberties Union (ACLU) — and its new paper on the issue could end up influencing competition among suppliers of technology to law enforcement.
The report is a fusillade against Axon, which makes body cameras and other products, along with software for AI-powered police reports. Axon has faced controversy before about its products. In 2022, most of its AI Ethics Board resigned over the company’s plan to put Tasers on drones.
But the century-old civil rights organization goes beyond criticism of that one company and says that police departments should avoid using artificial intelligence in most instances when crafting police reports.
Many officers consider that task among the most time-consuming chores they have, providing an opening to sellers of government technology. The need to be legally and factually precise — to create writing that will withstand scrutiny in court — adds pressure to the process. Not only that, but writing doesn’t come naturally to all police officers.
That’s why gov tech companies such as Axon have developed products designed to make that part of policing more efficient, using generative AI as a prompt and automation tool for police reports.
The ACLU report, “Police Departments Shouldn’t Allow Officers to Use AI to Draft Police Reports,” takes direct aim at Axon to make the case that AI can be sloppy and biased when used to guide the report-writing process.
The ACLU bases its argument on several criticisms of AI common across industries: potential unreliability and bias, a lack of transparency around AI learning models and inputs, and the absence of privacy protections.
In its paper, the ACLU anchored those concerns in the group’s analysis of Axon’s Draft One.
The product takes audio files from body cameras, runs them through OpenAI’s GPT-4 large language model and produces what the ACLU report calls a “first-person narrative for the officer in the typical format of a police report.”
Described by Axon as a “force multiplier” because, the company says, officers can spend up to 40 percent of their time on reports, Draft One prompts officers to insert specific details such as speed limits and driver’s license status to complete the famously dry narratives.
As the ACLU analysis notes, Draft One also includes what amounts to a safety valve: the random insertion of “silly sentences” — say, a mention of a flying squirrel — into drafts, which officers must catch and remove. The idea is to make sure police officers are really reading and checking the veracity of these AI-generated reports.
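The mechanics of that safeguard can be illustrated with a short sketch. This is not Axon’s code — the function names, canary text and approval logic here are all hypothetical — but it shows the general idea the ACLU describes: a nonsense sentence is planted in the AI draft, and the report cannot be signed off until the sentence has been edited out, which serves as evidence the officer actually read the draft.

```python
import random

# Hypothetical canary phrases (the ACLU's example mentions a flying squirrel).
CANARY_SENTENCES = [
    "A flying squirrel observed the scene from a nearby lamppost.",
    "The suspect was accompanied by a tap-dancing walrus.",
]

def insert_canary(draft: str) -> tuple[str, str]:
    """Insert one randomly chosen canary sentence at a random paragraph break."""
    canary = random.choice(CANARY_SENTENCES)
    paragraphs = draft.split("\n\n")
    pos = random.randrange(len(paragraphs) + 1)
    paragraphs.insert(pos, canary)
    return "\n\n".join(paragraphs), canary

def can_approve(edited_draft: str, canary: str) -> bool:
    """The report may be approved only once the canary has been removed."""
    return canary not in edited_draft

draft, canary = insert_canary(
    "Officer responded to a noise complaint.\n\nNo further action was taken."
)
# An unedited draft still contains the canary, so approval is blocked.
assert not can_approve(draft, canary)
# Once the officer deletes the sentence, the report can be signed off.
assert can_approve(draft.replace(canary, ""), canary)
```

The design choice is the interesting part: rather than trusting officers to review drafts, the system makes an unreviewed draft visibly, embarrassingly wrong — though, as the ACLU notes, the safeguard only works if departments leave it switched on.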
The ACLU, citing a comment from Axon leadership, is skeptical that all police departments would use that safeguard. The group also worries that AI could simply make things up and “absorb the racism, sexism and other biases” picked up by AI as it takes in vast amounts of data from the digital world.
Audio captured by body cameras can also be incomplete, depending on the sensitivity and placement of the microphone and other factors, the ACLU says. And because corporate AI training models are not subject to Freedom of Information requests, the civil rights group worries about the ability of defendants to fully challenge one of the tools involved in their prosecutions.
The ACLU did not respond to a request for comment about the report.
Earlier this year, Draft One underwent trials in Colorado and Louisiana and is now widely available in the U.S., an Axon spokesperson tells Government Technology via email. Police in Maine and California also are interested in buying the product, a type of purchase that often requires a long lead time and substantial civic debate.
For police in Fort Collins, Colo., the tool resulted in a 67 percent reduction in time spent on report writing, which in turn freed up officers for more street duty, the spokesperson said, adding that “success stories” about the product “mostly center on how much quicker officers can complete their paperwork while maintaining quality.”
The Axon spokesperson did not directly dispute any facts from the ACLU report but painted a picture of a solid, safe and supervised product.
“Critical safeguards require every report to be edited, reviewed and approved by a human officer, ensuring accuracy and accountability of the information,” the spokesperson said, adding that the company asks ethical and other experts to provide feedback and testing.
The spokesperson emphasized that Draft One requires officers to “review, edit and sign off on [the] accuracy” of their reports, and that use of the software leaves a digital audit trail.
For now, use of the tool is “restricted” to minor incidents that do not include felonies and arrests, so that client agencies can get “comfortable with the tool before expanding to more complex cases.”
The company’s double-blind study of the tool showed that Draft One produced reports equal to or better than those written entirely by officers, at least when judged by such factors as completeness, neutrality, objectivity, terminology and coherence. The study included 24 experts from law enforcement and court systems.
Axon’s own research also found “no statistically significant racial bias toward a particular race,” the spokesperson said.
“As we look to the future, we believe that the responsible innovation of real-time operations, drones and robotics and artificial intelligence will enable public safety to observe, communicate and act like they have never done before, ultimately protecting more lives in more places,” the spokesperson said.
The ACLU report comes at a sensitive, trying time for public safety. Law enforcement still faces staffing challenges and distrust from citizens over racism, brutality and associated issues in the aftermath of the George Floyd murder and other incidents.
Meanwhile, Nashville recently rejected a police-backed video surveillance plan using Axon tech, the latest example of regulatory and political pushback against relatively advanced but controversial public safety technology.
No matter what happens, AI in policing — and report writing — seems unlikely to fade. One example of that comes from Missoula, Mont., where officials recently moved forward with a request to buy Draft One.
“I’m pretty cautious about AI and how that might look, but we will consider it,” said Missoula Police Chief Mike Colyer, according to the report.
The ACLU report has the potential to shape how gov tech suppliers try to stand out in the crowded public safety tech space, where competition is fierce and AI is making its way to even small departments.
For instance, Truleo, an Axon competitor, all but welcomed the ACLU’s findings, even though Truleo uses AI for reports, too.
CEO and co-founder Anthony Tassone says his company’s voice assistant product offers more safeguards than Draft One.
Officers use Truleo to dictate a narrative of an incident — something they can do while driving — and the company’s technology then uses AI to “enhance” that information and come back with “suggestions.” Officers then make edits and finish the report on their own.
Truleo had already positioned itself as the “ethical” alternative to Axon, and Tassone repeated that point during an interview with Government Technology conducted after the release of the ACLU report.
In his view, a “weapons manufacturer” — Axon sells Tasers — should not be in the business of AI-generated reports, as that can lead to conflicts of interest in the case of mishaps or fatalities. The AI could be used to help an officer or department sanitize a report in law enforcement’s favor.
Another pitch used by Truleo is that Axon’s AI goes too far, and that body-cam transcripts can be severely unreliable.
“They are asking AI to make determinations,” he said, adding that Truleo has opened its AI to “random studies” and privacy checks. “You can’t ask AI to properly attribute criminality to people. That’s an officer’s job.”
The ACLU report recommends that no police department use AI to “replace the creation of a record of the officer’s subjective experience.”
But the report did leave wiggle room that would seem to offer a company such as Truleo a boost during client sales meetings and city council debates about law enforcement purchases.
The ACLU said that “safer and more limited” uses of AI could help with the “dull chore” of writing police reports.
“For example, officers could make an audio recorded verbal narrative of what took place … and computers could transcribe those accounts and perhaps perform some light cleanup and formatting to create an editable first draft,” the report states, adding that “like most people, [police] probably find it faster and easier to speak than to write.”
Story by Thad Rueter