Thursday, July 31, 2025

Technology (gone wild?): Police usage of AI to write crime reports: An important collaboration between The Milwaukee Independent and The Associated Press warns us of the dangers posed by this rapidly developing application of AI to policing, in a story headed "'Inaccurate automation': Legal concerns surround police usage of AI chatbots to write crime reports," noting that: "AI technology is not new to police agencies, which have adopted algorithmic tools to read license plates, recognize suspects’ faces, detect gunshot sounds, and predict where crimes might occur. Many of those applications have come with privacy and civil rights concerns and attempts by legislators to set safeguards. But the introduction of AI-generated police reports is so new that there are few, if any, guardrails guiding their use."


SEASONAL BREAK: (July 26, 2025):

Dear Readers: It's time for me to take a seasonal break and recharge my batteries, so to speak, as it has been a very busy year, with over 450 posts to date focused on fascinating developments, cases, and issues in the Charles Smith Blog arena from around the world. Here's the plan: I have pre-scheduled posts for publication during the break, and then, batteries fully charged, the rejuvenated me (having given you a break!) will rejoin you once again with fresh material. One of my most important tasks from the outset will be to use the Blog to unite with others around the globe who are fighting to stop the execution of Robert Roberson, set for October 16 - an innocent man convicted by junk science. Another will be to continue our efforts to help undo (as much as possible) the damage caused by the notorious former South Australian chief pathologist Colin Manock to so many innocent individuals over the years - including Derek Bromley, whose case cries out for exoneration.

Until my return, keep an eye on Jimmie Duncan's bail hearing, which is being reported by Richard A. Webster of Verite News. (The Innocence Project: "A judge has already declared Jimmie Duncan 'factually innocent', but prosecutors are still going after him. Now, the victim's mother is calling for his freedom at his July 22 bail hearing.") His most recent story, as of July 25, 2025, can be read at the link below. (A verdict could come as early as Monday, July 28.) As he has reported: "In April, Judge Alvin Sharp set aside Duncan’s 1998 conviction for murdering 23-month-old Haley Oliveaux, his then-girlfriend’s daughter. Sharp agreed with Duncan’s attorneys that an analysis performed on the girl that purported to match marks on her body to Duncan’s teeth — a key piece of evidence in the trial — was based on discredited science. As ProPublica and Verite News reported in March, the effort to secure Duncan’s freedom has become more urgent in recent months due to a renewed push by Gov. Jeff Landry to restart executions in the state following a decade-plus pause. Attorney Scott Greene, who is part of Duncan’s legal team, said during the hearing that Duncan has “enormous support” from his family, the community and Haley’s mother, Allison Layton Statham. Now that his conviction has been vacated, Greene argued, Duncan should be freed on bail pending another trial."


There is indeed so much grist for the Charles Smith Blog mill! In the meantime, please keep sending me your suggestions for future posts at hlevy15@gmail.com. (Many of our posts have been triggered by our readers.) Enjoy the summer. Cheers.

Harold Levy: Publisher: The Charles Smith Blog.

—————————————————————

QUOTE OF THE DAY: “They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate,” said Axon’s founder and CEO Rick Smith, describing the new AI product — called Draft One — as having the “most positive reaction” of any product the company has introduced. “Now, there’s certainly concerns,” Smith added. In particular, he said district attorneys prosecuting a criminal case want to be sure that police officers — not solely an AI chatbot — are responsible for authoring their reports because they may have to testify in court about what they witnessed. “They never want to get an officer on the stand who says, well, ‘The AI wrote that, I didn’t,'” Smith said.

--------------------------------------------------

PASSAGE OF THE DAY: "Before trying out the tool in Oklahoma City, police officials showed it to local prosecutors who advised some caution before using it on high-stakes criminal cases. For now, it’s only used for minor incident reports that don’t lead to someone getting arrested. “So no arrests, no felonies, no violent crimes,” said Oklahoma City police Captain Jason Bussert, who handles information technology for the 1,170-officer department. That is not the case in another city, Lafayette, Indiana, where Police Chief Scott Galloway said that all of his officers can use Draft One on any kind of case and it’s been “incredibly popular” since the pilot began earlier this year. Or in Fort Collins, Colorado, where police Sgt. Robert Younger said officers are free to use it on any type of report, though they discovered it doesn’t work well on patrols of the city’s downtown bar district because of an “overwhelming amount of noise.”


-------------------------------------------------


STORY: "'Inaccurate automation': Legal concerns surround police usage of AI chatbots to write crime reports," published by The Milwaukee Independent in collaboration with The Associated Press on July 19, 2025. (This published content was produced by Milwaukee Independent under license and in cooperation with The Associated Press (AP), the Pulitzer Prize-winning independent news-gathering source founded in 1846.)


GIST: "A body camera captured every word and bark uttered as police Sgt. Matt Gilmore and his K-9 dog, Gunner, searched for a group of suspects for nearly an hour.


Normally, the Oklahoma City police sergeant would grab his laptop and spend another 30 to 45 minutes writing up a report about the search. But this time he had artificial intelligence write the first draft.

Pulling from all the sounds and radio chatter picked up by the microphone attached to Gilmore’s body camera, the AI tool churned out a report in eight seconds.

“It was a better report than I could have ever written, and it was 100% accurate. It flowed better,” Gilmore said. It even documented a fact he did not remember hearing — another officer’s mention of the color of the car the suspects ran from.

Oklahoma City’s police department is one of a handful to experiment with AI chatbots to produce the first drafts of incident reports. Police officers who have tried it are enthused about the time-saving technology, while some prosecutors, police watchdogs, and legal scholars have concerns about how it could alter a fundamental document in the criminal justice system that plays a role in who gets prosecuted or imprisoned.

Built with the same technology as ChatGPT and sold by Axon, best known for developing the Taser and as the dominant U.S. supplier of body cameras, it could become what Gilmore describes as another “game changer” for police work.

“They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate,” said Axon’s founder and CEO Rick Smith, describing the new AI product — called Draft One — as having the “most positive reaction” of any product the company has introduced.

“Now, there’s certainly concerns,” Smith added. In particular, he said district attorneys prosecuting a criminal case want to be sure that police officers — not solely an AI chatbot — are responsible for authoring their reports because they may have to testify in court about what they witnessed.

“They never want to get an officer on the stand who says, well, ‘The AI wrote that, I didn’t,'” Smith said.

AI technology is not new to police agencies, which have adopted algorithmic tools to read license plates, recognize suspects’ faces, detect gunshot sounds, and predict where crimes might occur. Many of those applications have come with privacy and civil rights concerns and attempts by legislators to set safeguards. But the introduction of AI-generated police reports is so new that there are few, if any, guardrails guiding their use.

Concerns about society’s racial biases and prejudices getting built into AI technology are just part of what Oklahoma City community activist Aurelius Francisco finds “deeply troubling” about the new tool.

“The fact that the technology is being used by the same company that provides Tasers to the department is alarming enough,” said Francisco, a co-founder of the Foundation for Liberating Minds in Oklahoma City.

He said automating those reports will “ease the police’s ability to harass, surveil and inflict violence on community members. While making the cop’s job easier, it makes Black and brown people’s lives harder.”

Before trying out the tool in Oklahoma City, police officials showed it to local prosecutors who advised some caution before using it on high-stakes criminal cases. For now, it’s only used for minor incident reports that don’t lead to someone getting arrested.

“So no arrests, no felonies, no violent crimes,” said Oklahoma City police Captain Jason Bussert, who handles information technology for the 1,170-officer department.

That is not the case in another city, Lafayette, Indiana, where Police Chief Scott Galloway said that all of his officers can use Draft One on any kind of case and it’s been “incredibly popular” since the pilot began earlier this year.

Or in Fort Collins, Colorado, where police Sgt. Robert Younger said officers are free to use it on any type of report, though they discovered it doesn’t work well on patrols of the city’s downtown bar district because of an “overwhelming amount of noise.”

Along with using AI to analyze and summarize the audio recording, Axon experimented with computer vision to summarize what’s “seen” in the video footage, before quickly realizing that the technology was not ready.

“Given all the sensitivities around policing, around race and other identities of people involved, that’s an area where I think we’re going to have to do some real work before we would introduce it,” said Smith, the Axon CEO, describing some of the tested responses as not “overtly racist” but insensitive in other ways.

Those experiments led Axon to focus squarely on audio in the product unveiled in April during its annual company conference for police officials.

The technology relies on the same generative AI model that powers ChatGPT, made by San Francisco-based OpenAI. OpenAI is a close business partner with Microsoft, which is Axon’s cloud computing provider.

“We use the same underlying technology as ChatGPT, but we have access to more knobs and dials than an actual ChatGPT user would have,” said Noah Spitzer-Williams, who manages Axon’s AI products. Turning down the “creativity dial” helps the model stick to facts so that it “doesn’t embellish or hallucinate in the same ways that you would find if you were just using ChatGPT on its own,” he said.
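Axon has not published how its "knobs and dials" work, but the "creativity dial" Spitzer-Williams describes corresponds to what large-language-model APIs commonly expose as a temperature parameter. A minimal, self-contained sketch (using standard softmax-with-temperature math and made-up token scores, not Axon's actual system) shows why lowering it makes output more deterministic:

```python
import math

def sample_distribution(logits, temperature):
    """Convert raw model scores (logits) into token probabilities.
    Dividing logits by the temperature before the softmax sharpens
    the distribution as temperature drops, so the model sticks to
    its most likely wording instead of 'creative' alternatives."""
    if temperature <= 0:
        # Temperature 0 is conventionally greedy decoding: always pick the argmax.
        probs = [0.0] * len(logits)
        probs[logits.index(max(logits))] = 1.0
        return probs
    scaled = [score / temperature for score in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Three hypothetical candidate next tokens with raw scores.
logits = [2.0, 1.0, 0.5]

creative = sample_distribution(logits, temperature=1.5)  # flatter: more variety
factual = sample_distribution(logits, temperature=0.2)   # sharper: top choice dominates
greedy = sample_distribution(logits, temperature=0.0)    # fully deterministic

print([round(p, 3) for p in creative])
print([round(p, 3) for p in factual])
print(greedy)
```

At low temperature, nearly all probability mass lands on the single most likely token, which is why vendors turn it down to curb embellishment. Note that this only reduces randomness; it does not, by itself, guarantee the report is factually accurate.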

Axon refused to say how many police departments are using the technology. It is not the only vendor, with startups like Policereports.ai and Truleo pitching similar products. But given Axon’s deep relationship with police departments that buy its Tasers and body cameras, experts and police officials expect AI-generated reports to become more ubiquitous in the coming months and years.

Before that happens, legal scholar Andrew Ferguson would like to see more of a public discussion about the benefits and potential harms. For one thing, the large language models behind AI chatbots are prone to making up false information, a problem known as hallucination that could add convincing and hard-to-notice falsehoods into a police report.

“I am concerned that automation and the ease of the technology would cause police officers to be sort of less careful with their writing,” said Ferguson, a law professor at American University working on what’s expected to be the first law review article on the emerging technology.

Ferguson said a police report is important in determining whether an officer’s suspicion “justifies someone’s loss of liberty.” It is sometimes the only testimony a judge sees, especially for misdemeanor crimes.

Human-generated police reports also have flaws, Ferguson said, but it’s an open question as to which is more reliable.

For some officers who’ve tried it, it is already changing how they respond to a reported crime. They’re narrating what’s happening so the camera better captures what they’d want to put in writing.

As the technology catches on, Bussert expects officers will become “more and more verbal” in describing what’s in front of them.

After Bussert loaded the video of a traffic stop into the system and pressed a button, the program produced a narrative-style report in conversational language that included dates and times, just like an officer would have typed from his notes, all based on audio from the body camera.

“It was literally seconds,” Gilmore said, “and it was done to the point where I was like, ‘I don’t have anything to change.'”

At the end of the report, the officer must click a box that indicates it was generated with the use of AI."


The entire story can be read at:


https://www.milwaukeeindependent.com/newswire/inaccurate-automation-legal-concerns-surround-police-usage-ai-chatbots-write-crime-reports/



PUBLISHER'S NOTE:  I am monitoring this case/issue/resource. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog.

SEE BREAKDOWN OF SOME OF THE ONGOING INTERNATIONAL CASES (OUTSIDE OF THE CONTINENTAL USA) THAT I AM FOLLOWING ON THIS BLOG, AT THE LINK BELOW: HL:


https://www.blogger.com/blog/post/edit/120008354894645705/4704913685758792985


———————————————————————————————

FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."

Lawyer Radha Natarajan:

Executive Director: New England Innocence Project;


—————————————————————————————————


FINAL, FINAL WORD: "Since its inception, the Innocence Project has pushed the criminal legal system to confront and correct the laws and policies that cause and contribute to wrongful convictions.   They never shied away from the hard cases — the ones involving eyewitness identifications, confessions, and bite marks. Instead, in the course of presenting scientific evidence of innocence, they've exposed the unreliability of evidence that was, for centuries, deemed untouchable." So true!


Christina Swarns: Executive Director: The Innocence Project;


-------------------------------------------------------------------