Police departments across the country are beginning to use artificial intelligence to help with paperwork.
According to the Associated Press, AI can help police officers save time writing reports and make their jobs more efficient.
But experts warn that the new technology could introduce serious, sometimes life-changing, errors into reporting, reinforce bias and racism, and expose personal information, Politico reports.
A new AI product from technology company Axon called “DraftOne” is designed to help police officers file reports in less time than traditional methods, according to the company’s website.
“Currently, police officers spend two-thirds of their day on paperwork,” Axon says on its website. “Our AI research team is committed to reducing the time spent on paperwork by improving the efficiency and accuracy of report writing and intelligence analysis in law enforcement.”
For example, Axon says its product allows officers to automatically edit body camera footage, “allowing officers to share footage with the public more quickly while protecting the privacy of individuals captured on footage.”
The company’s AI software allows supervisors to review footage and reports “to better understand compliance and provide feedback to improve training and police-community relations.”
Axon founder and CEO Rick Smith told The Associated Press that DraftOne has had “the most positive response” of any product the company has brought to market so far, but he added that “there are certainly some concerns.”
Smith told The Associated Press that district attorneys want to know officers aren’t relying solely on AI chatbots to write their reports because they may have to testify in court about alleged crimes.
“The last thing they want is to have an officer stand and say, ‘An AI wrote that, I didn’t write that,'” Smith told the outlet.
An AP investigation of another crime-fighting AI program, ShotSpotter, found “significant flaws” in the technology. ShotSpotter is a gunshot-detection tool that uses “sensors, algorithms and artificial intelligence” to classify 14 million sounds in its database as gunshots or other types of sounds, according to its website.
In one case in which ShotSpotter evidence was presented in court, Illinois grandfather Michael Williams was jailed for more than a year in 2021, accused of shooting and killing a man, based on audio evidence that prosecutors claimed came from the AI tool, the Associated Press reported in 2022.
According to the Associated Press, the AI technology detected a loud noise as Williams’ vehicle passed through an intersection, and prosecutors argued that he had fired shots at the passenger inside his vehicle.
But Williams told authorities that someone in another vehicle pulled up and fired the shots at the passenger, according to the Associated Press. A judge ultimately dismissed the case after prosecutors said there wasn’t enough evidence to move forward.
“I’ve always wondered why they could use that technology against me with impunity,” Williams told The Associated Press.
In a variety of other applications used by lawyers, “AI creates ‘hallucinatory’ or fake cases, citations and legal arguments that appear correct but do not actually exist,” experts at the law firm Freeman Mathis & Gary wrote in an article on the firm’s website about the risks of generative AI in the legal profession.
“A Stanford University study found that 75% of answers generated by an AI chatbot on sample court decisions were incorrect,” the experts wrote. “This includes the precision of chatbots currently used by prosecutors and lawyers to sift through documents to find relevant evidence, draft legal memoranda and devise complex litigation strategies.”
“Furthermore, AI cannot adequately address legal questions that touch on multiple practice areas. For example, a legal question that touches both immigration and criminal law may produce an accurate answer for immigration law purposes but may ignore criminal law issues and implications.”
Jonathan Parham, a former police chief in Rahway, New Jersey, told Politico that AI used by law enforcement could be useful, but precautions and safeguards must be taken.
“AI should never replace police officers, but it should augment their operational capabilities,” Parham told Politico.