Monday, May 13, 2019

Technology Series (Part One): Use of futuristic technologies in policing: Do they work better in the movies? Steven Greenhut argues (convincingly) in Reason that "Futuristic Tech-Driven Policing Will Only Be as Good as the Cops Doing It": "Apparently, these technologies work better in the movies. It reminds me of the state attorney general's APPS program (Armed Prohibited Persons System), which sends agents to the homes of people who are no longer deemed eligible to own firearms. It sounds like a great way to remove guns from "bad guys," until one realizes that the complex computer-generated lists are woefully inaccurate, according to some reports. Our government cannot get its current databases right, so how could we expect it to predict the future? The fundamental problem with these gee-whiz policing policies is not solely the technology, but "the lack of transparency and public accountability in deploying crime-targeting tools that could so easily be misused to oppress rather than protect neighborhoods," as the Times opined."


PASSAGE OF THE DAY: "The fundamental problem with these gee-whiz policing policies is not solely the technology, but "the lack of transparency and public accountability in deploying crime-targeting tools that could so easily be misused to oppress rather than protect neighborhoods," as the Times opined. I'd take it further. California's law-enforcement agencies are such bastions of secrecy that it is hard for the public to trust them as they head off in some innovative directions. New technologies that pinpoint crime problems and create gang profiles could potentially be useful. They might be better than policies such as gang injunctions that basically cast a wide net over entire neighborhoods and restrict the rights of the innocent along with the guilty. But can we trust them?"

-------------------------------------------------------------

PUBLISHER'S NOTE: "In recent years, I have found myself publishing more and more posts on the application of artificial intelligence technology to policing, public safety, and the criminal justice process, not just in North America, but in countries all over the world, including China. Although I accept that properly applied science can play a positive role in our society, I have learned over the years that technologies introduced for the so-called public good can eventually be used against the people they were supposed to benefit. As reporter Sieeka Khan writes in Science Times: "In 2017, researchers sent a letter to the secretary of the US Department of Homeland Security. The researchers expressed their concerns about a proposal to use AI to determine whether someone seeking refuge in the US would become a positive, contributing member of society or would be likely to become a threat or a terrorist. Other government uses of AI are also being questioned, such as attempts at setting bail amounts and criminal sentences, predictive policing, and hiring government workers. All of these attempts have been shown to be prone to technical issues, and limited data can bias their decisions on the basis of gender, race or cultural background. Other AI technologies like automated surveillance, facial recognition and mass data collection are raising concerns about privacy, security, accuracy and fairness in a democratic society. As Trump's executive order demonstrates, there is a massive interest in harnessing AI for its full, positive potential. But the dangers of misuse, bias and abuse, whether intentional or not, can work against the principles of international democracies. As the use of artificial intelligence grows, the potential for misuse, bias and abuse grows as well." The purpose of this 'technology' series is to highlight the dangers of artificial intelligence in the criminal justice system - and to help readers make their own assessments as to whether these innovations will do more harm than good."

Harold Levy: Publisher: The Charles Smith Blog.

----------------------------------------------------------

POST: "Futuristic Tech-Driven Policing Will Only Be as Good as the Cops Doing It," by Steven Greenhut, published by Reason on March 29, 2019. (This column was first published in the Orange County Register. Steven Greenhut is Western region director for the R Street Institute. He was a Register editorial writer from 1998-2009.)

SUB-HEADING: "One doesn't need a predictive-policing program to realize that police officers who have been convicted of serious crimes ought not to be trusted with a badge."

GIST: "In Washington, D.C. in 2054, a Department of PreCrime determines who is going to commit a crime before it happens. The government uses three mutants, known as "precogs," who have precise visions of future events. Police are sent in advance to arrest the not-quite-criminals and, voila, the crime rate drops to zero. That is the backdrop of the movie, "Minority Report," based on a story from the late Orange County sci-fi writer Philip Dick. Dystopian stories take real-life trends and extrapolate them far into the future, as a way to explore the moral conundrums of current policies. Flash forward 17 years from the movie's release (or back 35 years from the future!), and we find the Los Angeles Police Department wanting to impose its own version of what is known as "predictive policing." Instead of mutants, LAPD uses computers and human analysts. The department pinpointed high-crime LASER zones—Los Angeles Strategic Extraction and Restoration—and tried to determine where to deploy a greater police presence. That sounded OK, but the computer system also created a profile of actual people who might have a propensity to commit crimes based on data about gang membership and arrest records. That's startling. The inspector general found that "44 percent of chronic offenders had either zero or one arrest for violent crimes" and "about half had no arrest for gun-related crimes," according to a Los Angeles Times report, which noted that LAPD ultimately suspended the tool. Apparently, these technologies work better in the movies. It reminds me of the state attorney general's APPS program (Armed Prohibited Persons System), which sends agents to the homes of people who are no longer are deemed eligible to own firearms. It sounds like a great way to remove guns from "bad guys," until one realizes that the complex computer-generated lists are woefully inaccurate, according to some reports. Our government cannot get its current databases right, so how could we expect it to predict the future? The fundamental problem with these gee-whiz policing policies is not solely the technology, but "the lack of transparency and public accountability in deploying crime-targeting tools that could so easily be misused to oppress rather than protect neighborhoods," as the Times opined. I'd take it further. California's law-enforcement agencies are such bastions of secrecy that it makes it hard for the public to trust them as they head off in some innovative directions. New technologies that pinpoint crime problems and create gang profiles could potentially be useful. They might be better than policies such as gang injunctions that basically case a wide net over entire neighborhoods and restrict the rights of the innocent along with the guilty. But can we trust them? The fundamental problem with these gee-whiz policing policies is not solely the technology, but "the lack of transparency and public accountability in deploying crime-targeting tools that could so easily be misused to oppress rather than protect neighborhoods," as the Times opined. I'd take it further. California's law-enforcement agencies are such bastions of secrecy that it makes it hard for the public to trust them as they head off in some innovative directions. New technologies that pinpoint crime problems and create gang profiles could potentially be useful. They might be better than policies such as gang injunctions that basically case a wide net over entire neighborhoods and restrict the rights of the innocent along with the guilty. 
But can we trust them? As a Wired article from last year explained, most departments nationwide use a wide range of Automatic License Plate readers on buildings, signs and patrol cars, thus collecting massive amounts of surveillance data about citizens' every public move. That has helped solve crimes, but police agencies are amazingly secretive about how many cameras are out there, how they uThe fundamental problem with these gee-whiz policing policies is not solely the technology, but "the lack of transparency and public accountability in deploying crime-targeting tools that could so easily be misused to oppress rather than protect neighborhoods," as the Times opined. I'd take it further. California's law-enforcement agencies are such bastions of secrecy that it makes it hard for the public to trust them as they head off in some innovative directions. New technologies that pinpoint crime problems and create gang profiles could potentially be useful. They might be better than policies such as gang injunctions that basically case a wide net over entire neighborhoods and restrict the rights of the innocent along with the guilty. But can we trust them?  se the data and whether any particular tracking might violate a person's constitutional rights. Wired noted that "Officers misusing law enforcement databases for their own purposes is a perennial problem at the LAPD and elsewhere." That touches on the "trust" problem that is at the heart of any of these controversial systems. Police departments employ thousands of people. Most officers behave honorably, but it's hard to feel confident that police and sheriff's departments, and district attorneys for that matter, do much about those who misuse their vast powers. As always, "sunshine" is the key to accountability, and law enforcement tends to cloak its activities in secrecy. For instance, this newspaper group is now part of a broad media coalition that will work collaboratively to publish stories about police misconduct by filing a large number of public-information requests under a new law that makes it easier to access such records. Police unions, and some government agencies, have fought the legal release of these important public records under that law. The public would have less reason to be concerned about new surveillance technologies if police agencies were less resistant about conforming to open-government laws. Consider that California's police-union-friendly Attorney General Xavier Becerra is, as this newspaper explained recently, "threatening legal action against reporters with UC Berkeley's Investigative Reporting Program after they properly obtained spreadsheets with names of police officers, former officers and applicants for policing jobs who have been found guilty of misdeeds including child molestation, bribery and drug trafficking." One doesn't need a predictive-policing program—or employ a group of mutant savants—to realize that police officers who have been convicted of serious crimes ought not to be trusted with a badge. If California's police agencies want broader public support as they develop futuristic policing techniques, they ought to do a better job earning our trust with the basics."
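A note on the mechanics: to see how a list like LAPD's could end up flagging people with no violent or gun arrests, consider a minimal sketch of a point-based "chronic offender" score. Everything in it is a hypothetical illustration for this blog: the fields, the point values and the cutoff are assumptions, not the department's actual formula, which press accounts describe only in broad strokes.

```python
# Hypothetical sketch of a point-based "chronic offender" score of the kind
# attributed to LAPD's LASER program. Point values and threshold are invented
# for illustration; they are NOT the department's real formula.

from dataclasses import dataclass

@dataclass
class PersonRecord:
    violent_arrests: int          # arrests for violent crimes
    gun_arrests: int              # gun-related arrests
    gang_member: bool             # flagged in a gang database
    on_parole_or_probation: bool  # current supervision status
    police_contacts: int          # stops/field interviews, regardless of outcome

def chronic_offender_score(r: PersonRecord) -> int:
    """Toy scoring: fixed points for status flags plus one point per contact."""
    score = 0
    if r.violent_arrests > 0:
        score += 5
    if r.gun_arrests > 0:
        score += 5
    if r.gang_member:
        score += 5
    if r.on_parole_or_probation:
        score += 5
    score += r.police_contacts  # contacts accumulate even with no conviction
    return score

THRESHOLD = 12  # hypothetical cutoff for landing on the "chronic offender" list

# A person with zero violent or gun arrests can still clear the cutoff purely
# through a gang-database flag, probation status and repeated stops.
example = PersonRecord(violent_arrests=0, gun_arrests=0, gang_member=True,
                       on_parole_or_probation=True, police_contacts=4)
print(chronic_offender_score(example))               # 14
print(chronic_offender_score(example) >= THRESHOLD)  # True
```

The design choice worth noticing is that stop-driven points accumulate without any conviction, so someone can be listed despite a clean record of violence, which is consistent with the inspector general's finding, quoted above, that 44 percent of "chronic offenders" had zero or one violent arrest.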

The entire story can be read at:
https://reason.com/2019/03/29/futuristic-tech-driven-policing-will-onl/