Saturday, July 30, 2022

Artificial Intelligence: Predictor of crime? U.S. research yields seductive headlines - like this one in "New Scientist": "AI predicts crime a week in advance with 90% accuracy" - but the story at least acknowledges that there are concerns about how systems like this can perpetuate bias: "Lawrence Sherman at the Cambridge Centre for Evidence-Based Policing, UK, says he is concerned about the inclusion of reactive and proactive policing data in the study - that is, crimes that tend to be recorded because people report them and crimes that tend to be recorded because police go out looking for them. The latter type of data is very susceptible to bias, he says. “It could be reflecting intentional discrimination by police in certain areas,” he says."


PASSAGE OF THE DAY: "Previous efforts to use AIs to predict crime have been controversial because they can perpetuate racial bias. In recent years, Chicago Police Department has trialled an algorithm that created a list of people deemed most at risk of being involved in a shooting, either as a victim or as a perpetrator. Details of the algorithm and the list were initially kept secret, but when the list was finally released, it turned out that 56 per cent of Black men in the city aged between 20 and 29 featured on it. Chattopadhyay concedes that the data used by his model will also be biased, but says that efforts have been taken to reduce the effect of bias and the AI doesn’t identify suspects, only potential sites of crime. “It’s not Minority Report,” he says."

----------------------------------------------------------

STORY: "AI predicts crime a week in advance with 90 per cent accuracy," by reporter Matthew Sparkes, published by "New Scientist" on June 30, 2022.

SUB-HEADING: "An artificial intelligence that scours crime data can predict the location of crimes in the coming week with up to 90 per cent accuracy, but there are concerns about how systems like this can perpetuate bias."

GIST: "An artificial intelligence can now predict the location and rate of crime across a city a week in advance with up to 90 per cent accuracy. Similar systems have been shown to perpetuate racist bias in policing, and the same could be true in this case, but the researchers who created this AI claim that it can also be used to expose those biases.

Ishanu Chattopadhyay at the University of Chicago and his colleagues created an AI model that analysed historical crime data from Chicago, Illinois, from 2014 to the end of 2016, then predicted crime levels for the weeks that followed this training period.

The model predicted the likelihood of certain crimes occurring across the city, which was divided into squares about 300 metres across, a week in advance with up to 90 per cent accuracy. It was also trained and tested on data for seven other major US cities, with a similar level of performance.
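
To picture the setup the story describes, here is a minimal, hypothetical Python sketch: a city partitioned into squares roughly 300 metres across, one weekly event-count series per square, and a held-out final week used to score a next-week prediction. Everything in it - the toy data, the naive baseline predictor, and all names - is invented for illustration; the study's actual model is far more sophisticated.

import numpy as np

rng = np.random.default_rng(0)

CELL_METRES = 300   # approximate side of each grid square, per the article
N_CELLS = 100       # toy grid of 10 x 10 squares (a real city has thousands)
N_WEEKS = 156       # roughly 2014 through 2016, the training span described

# Toy history: weekly crime counts per grid cell (rows = weeks, cols = cells).
base_rates = rng.uniform(0.1, 3.0, N_CELLS)
history = rng.poisson(lam=base_rates, size=(N_WEEKS, N_CELLS))

def predict_next_week(history):
    """Naive stand-in predictor: flag a cell as 'high risk' if its trailing
    4-week mean count exceeds the city-wide median of those means. The
    study's model learns temporal patterns per cell; this only shows the
    shape of a next-week evaluation."""
    recent = history[-4:].mean(axis=0)
    return recent > np.median(recent)

# Hold out the final week and score the binary prediction against it.
train, test_week = history[:-1], history[-1]
predicted_hot = predict_next_week(train)
actual_hot = test_week > np.median(test_week)
accuracy = (predicted_hot == actual_hot).mean()
print(f"toy next-week accuracy over {N_CELLS} cells: {accuracy:.0%}")

The point of the sketch is only that "up to 90 per cent accuracy" is a claim about held-out future weeks, scored square by square, not about explaining past crime.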

Previous efforts to use AIs to predict crime have been controversial because they can perpetuate racial bias. In recent years, Chicago Police Department has trialled an algorithm that created a list of people deemed most at risk of being involved in a shooting, either as a victim or as a perpetrator. Details of the algorithm and the list were initially kept secret, but when the list was finally released, it turned out that 56 per cent of Black men in the city aged between 20 and 29 featured on it.

Chattopadhyay concedes that the data used by his model will also be biased, but says that efforts have been taken to reduce the effect of bias and the AI doesn’t identify suspects, only potential sites of crime. “It’s not Minority Report,” he says.

“Law enforcement resources are not infinite. So you do want to use that optimally. It would be great if you could know where homicides are going to happen,” he says.

Chattopadhyay says the AI’s predictions could be more safely used to inform policy at a high level, rather than being used directly to allocate police resources. He has released the data and algorithm used in the study publicly so that other researchers can investigate the results.

The researchers also used the data to look for areas where human bias is affecting policing. They analysed the number of arrests following crimes in neighbourhoods in Chicago with different socioeconomic levels. This showed that crimes in wealthier areas resulted in more arrests than they did in poorer areas, suggesting bias in the police response.
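
To make that comparison concrete, here is a small hypothetical Python sketch of the kind of disparity check described: arrests per recorded crime, tallied by neighbourhood income tier. The figures are invented placeholders, not the study's data.

# Invented placeholder figures, not the study's data.
crimes_and_arrests = {
    # tier: (recorded crimes, crimes that led to an arrest)
    "higher-income": (1000, 220),
    "middle-income": (1000, 150),
    "lower-income": (1000, 90),
}

for tier, (crimes, arrests) in crimes_and_arrests.items():
    rate = arrests / crimes
    print(f"{tier:>13}: {rate:.1%} of recorded crimes led to an arrest")

A markedly higher arrest rate in wealthier tiers, with comparable counts of recorded crimes, is the pattern the article reads as bias in the police response.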

Lawrence Sherman at the Cambridge Centre for Evidence-Based Policing, UK, says he is concerned about the inclusion of reactive and proactive policing data in the study - that is, crimes that tend to be recorded because people report them and crimes that tend to be recorded because police go out looking for them. The latter type of data is very susceptible to bias, he says. “It could be reflecting intentional discrimination by police in certain areas,” he says."

The entire story can be read at:

https://www.newscientist.com/article/2326297-ai-predicts-crime-a-week-in-advance-with-90-per-cent-accuracy/

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy: Publisher: The Charles Smith Blog;



SEE BREAKDOWN OF SOME OF THE ONGOING INTERNATIONAL CASES (OUTSIDE OF THE CONTINENTAL USA) THAT I AM FOLLOWING ON THIS BLOG, AT THE LINK BELOW: HL:




FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;

—————————————————————————————————

FINAL, FINAL WORD: "Since its inception, the Innocence Project has pushed the criminal legal system to confront and correct the laws and policies that cause and contribute to wrongful convictions. They never shied away from the hard cases — the ones involving eyewitness identifications, confessions, and bite marks. Instead, in the course of presenting scientific evidence of innocence, they've exposed the unreliability of evidence that was, for centuries, deemed untouchable." So true!
Christina Swarns: Executive Director: The Innocence Project;