Saturday, June 27, 2020

Robert Julian-Borchak Williams: Michigan: Aftermath (2): Topic of the day: "When the police treat software like magic," by NYT scribe Shira Ovide... "The arrest of a man for a crime he did not commit shows the dangers of facial recognition technology."


BACKGROUND: TECHNOLOGY: In the last several years I have been spending considerably more time than usual on applications of rapidly developing technology in the criminal justice process that could affect the quality of the administration of justice - for better or, most often, for worse. First, of course, predictive policing (typified by the software known as PredPol) made its appearance, at its most extreme promising the ability to identify a criminal act before it occurred. At its most modest level, it offered police a better sense of where certain crimes were occurring in the community being policed - knowledge that the seasoned beat officer had intuited through everyday police work years earlier. PredPol has lost some of its lustre as police departments discovered that the expense of acquiring and using the technology was not justified. Then we entered a period in which algorithms became popular with judges for use at bail hearings and at sentencing. In my eyes, these judges were just passing the buck to the machine when they could have, and should have, made their decisions based on information they received in open court - not from algorithms notorious for their secrecy, because the manufacturers did not want to reveal their trade secrets, even in a courtroom where an accused person's liberty and reputation were on the line. The use of these algorithms at bail and sentencing has come under attack in many jurisdictions for discriminating against minorities, and they are, hopefully, on the way out. Lastly, facial recognition technology has become a concern to this Blog because of its proven ability to sweep up huge numbers of people, lead to wrongful arrests and prosecutions, and discriminate racially. May we never forget that a huge, extremely well-funded, powerful and often politically connected industry is pushing, for profit, the use of all these technologies in criminal justice systems - and, hopefully, in the post-George Floyd aftermath, it will be more concerned with the welfare of the community than with its bottom line.

---------------------------------

PASSAGE OF THE DAY: "Shira: What a mess up. How did this happen?
Kash: The police are supposed to use facial recognition identification only as an investigative lead. But instead, people treat facial recognition as a kind of magic. And that’s why you get a case where someone was arrested based on flawed software combined with inadequate police work.

But humans, not just computers, misidentify people in criminal cases.
Absolutely. Witness testimony is also very troubling. That has been a selling point for many facial recognition technologies."

----------------------------------

STORY: "When the police treat software like magic," by reporter Shira Ovide, published by The New York Times on June 25, 2020. (Shira Ovide writes the very interesting 'On Tech' newsletter, 'a guide to how technology is reshaping our lives and world,' in The New York Times.)

SUB-HEADING: "The arrest of a man for a crime he did not commit shows the dangers of facial recognition technology."

GIST: "A lot of technology is pretty dumb, but we think it’s smart. My colleague Kashmir Hill showed the human toll of this mistake. Her article detailed how Robert Julian-Borchak Williams, a black man in Michigan, was accused of shoplifting on the basis of flawed police work that relied on faulty facial recognition technology. The software showed Williams’s driver’s license photo among possible matches with the man in the surveillance images, leading to Williams’s arrest for a crime he didn’t commit. (In response to Kash’s article, prosecutors apologized for what happened to Williams and said he could have his case expunged.) Kash talked to me about how this happened, and what the arrest showed about the limits and accuracy of facial recognition technology.

Shira: What a mess up. How did this happen?
Kash: The police are supposed to use facial recognition identification only as an investigative lead. But instead, people treat facial recognition as a kind of magic. And that’s why you get a case where someone was arrested based on flawed software combined with inadequate police work.

But humans, not just computers, misidentify people in criminal cases.
Absolutely. Witness testimony is also very troubling. That has been a selling point for many facial recognition technologies.

Is the problem that the facial recognition technology is inaccurate?
That’s one problem. A federal study of facial recognition algorithms found them to be biased and to wrongly identify people of color at higher rates than white people. The study included the two algorithms used in the image search that led to Williams’s arrest.
Sometimes the algorithm is good and sometimes it’s bad, and there’s not always a great way to tell the difference. And there’s usually no requirement for vetting the technology from policymakers, the government or law enforcement.
What’s the broader problem?
Companies that sell facial recognition software say it doesn’t give a perfect “match.” It gives a score of how likely the facial images in databases match the one you search. The technology companies say none of this is probable cause for arrest. (At least, that’s how they talk about it with a reporter for The New York Times.) But on the ground, officers see an image of a suspect next to a photo of the likeliest match, and it seems like the correct answer. I have seen facial recognition work well with some high-quality close-up images. But usually, police officers have grainy videos or a sketch, and computers don’t work well in those cases.
It feels as if we know computers are flawed, but we still believe the answers they spit out?
I wrote about the owner of a Kansas farm who was harassed by law enforcement and random visitors because of a glitch in software that maps people’s locations from their internet addresses. People incorrectly thought the mapping software was flawless. Facial recognition has the same problem. People don’t drill down into the technology, and they don’t read the fine print about the inaccuracies..."
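For readers curious about what Hill means, a few paragraphs above, when she says the software "gives a score" rather than a perfect match, here is a minimal sketch in Python of how score-based face matching typically works. Everything in it - the embedding vectors, the names, and the confidence threshold - is hypothetical and for illustration only; it is not the software used in the Williams case, nor any vendor's actual product.

```python
# A minimal sketch (not any vendor's actual API) of why a facial
# recognition "match" is really a ranked similarity score.
# All names, embeddings and thresholds below are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(seed=0)

# Hypothetical gallery: licence photos reduced to 128-number embeddings.
gallery = {name: rng.normal(size=128)
           for name in ("person_a", "person_b", "person_c")}

# A grainy surveillance image yields a very noisy probe embedding.
probe = gallery["person_b"] + rng.normal(scale=2.0, size=128)

# The system returns EVERY candidate with a score - never a yes/no answer.
scores = sorted(((cosine_similarity(probe, emb), name)
                 for name, emb in gallery.items()), reverse=True)
for score, name in scores:
    print(f"{name}: similarity {score:.2f}")

# Taking the top-ranked face as "the answer" treats the tool as magic;
# a confidence threshold at least makes the uncertainty explicit.
THRESHOLD = 0.80  # hypothetical cut-off; real systems are tuned per use
best_score, best_name = scores[0]
if best_score < THRESHOLD:
    print("No candidate is reliable enough to be more "
          "than an investigative lead.")
```

The point of the sketch: with a poor-quality probe image, even the best-scoring candidate can fall far short of any reasonable confidence threshold - which is exactly why, as Hill notes, the vendors themselves describe results as investigative leads rather than identifications.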

The entire story can be read at:
https://www.nytimes.com/2020/06/25/technology/facial-recognition-software-dangers.html

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog;
-----------------------------------------------------------------
FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;
------------------------------------------------------------------