Thursday, August 23, 2018
Technology series: Part 16: Significant Development: LAPD commits to review of controversial tech-based policing methods, Courthouse News reports. (Reporter Martin Macias Jr.)..."Goldsmith said the audit will review the effectiveness of the programs and determine what impact they’ve had specifically on communities of color in Los Angeles. Some of the largest law enforcement agencies in the country use so-called predictive policing programs to forecast where and when crime will occur in their communities. Departments also use crime data, gathered from algorithm-based and artificial intelligence-driven technologies, to determine which individuals are most likely to commit or recommit crimes. Those technologies, while seen by police as objective tools, have come under scrutiny by advocates. At the commissioners meeting Tuesday, Stop LAPD Spying Coalition member Hamid Khan said the programs give police a “license” to profile based on race and anything short of dismantling the programs would be “smoke and mirrors.” The coalition’s May 8 report on the programs, “Before the Bullet Hits the Body,” found predictive policing makes it easier for police to justify stopping and searching people in the community."
PUBLISHER'S NOTE: Artificial intelligence, once the stuff of science fiction, has become all too real in our modern society, especially in the American criminal justice system. As the ACLU's Lee Rowland puts it: "Today, artificial intelligence. It's everywhere — in our homes, in our cars, our offices, and of course online. So maybe it should come as no surprise that government decisions are also being outsourced to computer code. In one Pennsylvania county, for example, child and family services uses digital tools to assess the likelihood that a child is at risk of abuse. Los Angeles contracts with the data giant Palantir to engage in predictive policing, in which algorithms identify residents who might commit future crimes. Local police departments are buying Amazon's facial recognition tool, which can automatically identify people as they go about their lives in public." The algorithm is reaching deeper and deeper into the nation's courtrooms, informing decisions that used to belong exclusively to judges, such as bail and even the sentence to be imposed. I am pleased to see that a dialogue has begun on the effect that the increasing use of these algorithms in our criminal justice systems is having on our society and on the quality of decision-making inside courtrooms. As Lee Rowland asks about this brave new world, "What does all this mean for our civil liberties and how do we exercise oversight of an algorithm?" In view of the importance of these issues, and of the increasing use of artificial intelligence by countries for surveillance of their citizens, it's time for yet another technology series on The Charles Smith Blog focusing on the impact of science on society and criminal justice. Up to now I have been identifying the appearance of these technologies. Now, at last, I can report on the growing realization that some of them may be double-edged swords, and on the pushback that is building.
Harold Levy: Publisher; The Charles Smith Blog:
------------------------------------------------------------
PASSAGE OF THE DAY: "PredPol uses historical data from both property and violent crime reports to identify which city blocks are most likely to be the site of crimes. LASER plugs data from field interview cards collected by officers and information from arrest reports into a program called Palantir. The program then identifies “hotspots” where crime is likely to occur. Palantir mines government and private company databases to build extensive profiles of individuals, then creates a list of people police have identified as “persons of interest.” The list is not available to the public."
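The passage above describes a data flow rather than a formula: historical incident reports go in, and a ranked list of "hotspots" comes out. Purely as an illustration of that idea, the short Python sketch below scores city blocks by a recency-weighted count of past incidents. It is not PredPol's, LASER's or Palantir's actual method - those systems are proprietary - and every data field, weight and parameter in it is an assumption made for the example.

# Illustrative sketch only: a toy recency-weighted "hotspot" score per city block.
# This is not how PredPol, LASER or Palantir actually work; those systems are
# proprietary, and every field name, weight and parameter here is an assumption.

from collections import defaultdict
from datetime import date

# Hypothetical historical incident reports: (block identifier, incident date).
incidents = [
    ("block-014", date(2018, 6, 1)),
    ("block-014", date(2018, 8, 10)),
    ("block-027", date(2018, 7, 15)),
    ("block-101", date(2017, 12, 3)),
]

def hotspot_scores(reports, today, half_life_days=90):
    """Sum an exponentially decaying weight for each past incident on a block.

    Recent incidents count more than old ones; the half-life controls how
    quickly an incident's influence on the score fades.
    """
    scores = defaultdict(float)
    for block_id, when in reports:
        age_in_days = (today - when).days
        scores[block_id] += 0.5 ** (age_in_days / half_life_days)
    # Highest-scoring blocks first.
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# A department might flag the top few blocks in this ranking as "hotspots".
for block, score in hotspot_scores(incidents, today=date(2018, 8, 23)):
    print(block, round(score, 3))

The toy makes one point plain: the output is only as complete, and only as neutral, as the historical reports fed into it - which is central to the concern that critics such as the Stop LAPD Spying Coalition raise about how these programs are used to justify stops and searches.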