Tuesday, February 27, 2024

Technology: Artificial intelligence: The Innocence Project (Alyxaundria Sanford) warns that artificial intelligence is putting innocent people at risk of being incarcerated, noting that there are at least seven confirmed cases of misidentification due to facial recognition technology, six of which involve Black people who have been wrongfully accused in the U.S. … "The technology that was just supposed to be for investigation is now being proffered at trial as direct evidence of guilt." (Chris Fabricant);


QUOTE OF THE DAY: "Corporations are making claims about the abilities of these techniques that are only supported by self-funded literature," said Mr. Fabricant. "Politicians and law enforcement that spend [a lot of money] acquiring them, then are encouraged to tout their efficacy and the use of this technology."

Chris Fabricant: the Innocence Project's director of strategic litigation and author of Junk Science and the American Criminal Justice System.

-------------------------------------------------------

PASSAGE OF THE DAY: "What is particularly worrying is that the adoption and use of AI, such as FRT, by law enforcement echoes previous examples of the misapplication of forensic science including bite mark analysis, hair comparisons, and arson investigation that have led to numerous wrongful convictions."

POST: "AI and The Risk of Wrongful Convictions in the U.S.," by Alyxaundria Sanford, February 14, 2024.

SUB-HEADING: "Artificial Intelligence Is Putting Innocent People at Risk of Being Incarcerated: There are at least seven confirmed cases of misidentification due to facial recognition technology, six of which involve Black people who have been wrongfully accused."


———————————————————————————


GIST: "Robert Williams thought the call his wife received, saying he needed to turn himself in to the police, was a prank. But when the Michigan resident pulled into his driveway, the Detroit police officer who had been waiting outside Mr. Williams' home pulled up behind him, got out of the car and placed him under arrest. He was detained for 30 hours.


Mr. Williams’ encounter with the police that day in January 2020 was the first documented case of wrongful arrest due to the use of facial recognition technology (FRT). He was accused of stealing thousands of dollars worth of Shinola watches. Grainy surveillance footage provided to law enforcement was run through facial recognition software and matched to an expired driver’s license photo of Mr. Williams.

There are at least seven confirmed cases of misidentification due to the use of facial recognition technology, six of which involve Black people who have been wrongfully accused: Nijeer Parks, Porcha Woodruff, Michael Oliver, Randall Reid, Alonzo Sawyer, and Robert Williams.

There has been concern that FRT and other artificial intelligence (AI) technologies will exacerbate racial inequities in policing and the criminal legal system. Research shows that facial recognition software is significantly less reliable for people of color, especially Black and Asian people, as algorithms struggle to distinguish facial features and darker skin tones. Another study concluded that disproportionate arrests of Black people by law enforcement agencies using FRT may be the result of "the lack of Black faces in the algorithms' training data sets, a belief that these programs are infallible and a tendency of officers' own biases to magnify these issues."




What is particularly worrying is that the adoption and use of AI, such as FRT, by law enforcement echoes previous examples of the misapplication of forensic science including bite mark analysis, hair comparisons, and arson investigation that have led to numerous wrongful convictions.

“The technology that was just supposed to be for investigation is now being proffered at trial as direct evidence of guilt. Often without ever having been subject to any kind of scrutiny,” said Chris Fabricant, Innocence Project’s director of strategic litigation and author of Junk Science and the American Criminal Justice System.

“Corporations are making claims about the abilities of these techniques that are only supported by self-funded literature,” said Mr. Fabricant. “Politicians and law enforcement that spend [a lot of money] acquiring them, then are encouraged to tout their efficacy and the use of this technology.” 


For decades, DNA has been essential in proving the innocence of people who were wrongfully convicted as a result of faulty forensic methods. Indeed, half of all DNA exonerations were the result of false or misleading forensic evidence. And of the 375 DNA exonerations in the United States between 1989 and 2020, 60% of the people freed were Black, according to Innocence Project data.

Still, not all cases lend themselves to DNA exonerations, especially those that involve police use of AI. For this reason, the Innocence Project is proactively pursuing pretrial litigation and policy advocacy to prevent the use of unreliable AI technology, and the misuse of even potentially reliable AI technology, before the damage is done.


"Many of these cases … are just as susceptible to the same risks and factors that we've seen produce wrongful convictions in the past," said Mitha Nandagopalan, a staff attorney in the Innocence Project's strategic litigation department.

Mx. Nandagopalan is leading the Innocence Project’s latest endeavor to counter the potentially damaging effects of AI in policing, particularly in communities of color. 

“What is often seen in poorer neighborhoods or primarily [communities of color] is surveillance that is imposed by state and municipal actors, sometimes in line with the wishes of that community and sometimes not. It’s the people who live there that are the target in their own neighborhoods,” said Mx. Nandagopalan.

“In wealthier neighborhoods, whiter neighborhoods, I think you often see surveillance that is being bought and invited by residents.” These technologies include Ring doorbell cameras and homeowners’ associations contracting license plate reader companies such as FLOCK Safety.

The Neighborhood Project is a multidisciplinary effort to understand how surveillance technologies, like FRT, impact a community and may contribute to wrongful convictions. The project will focus on a particular location and partner with community members and local organizations to challenge and prevent the use of unreliable and untested technologies.

"Ultimately, what we want is for the people who would be most impacted by the surveillance technology to have a say in whether, and how, it's getting used in their communities," said Mx. Nandagopalan.

Last year, the Biden administration issued an executive order to set standards and manage the risk of AI including a standard to develop “tools, and tests to help ensure that AI systems are safe, secure, and trustworthy.” However, there are no federal policies currently in place to regulate the use of AI in policing.

In the meantime, there are ways for concerned community members to influence and encourage local leaders to regulate the use of these technologies by local law enforcement and other agencies. 

“These are great reasons to go to your local city council or town council meetings. That’s where these [tech presentations] happen. On the very local level, those representatives are the people who are voting whether to use tax dollars or public money to fund this stuff,” said Amanda Wallwin, one of the Innocence Project’s state policy advocates.

“If you can be there, if you’re in the room, you can make such a difference.”

Watch “Digital Dilemmas: Exploring the Intersection of Technology, Race, and Wrongful Conviction” from Just Data, the Innocence Project’s annual virtual gathering dedicated to promoting practical research to advance the innocence movement."

—————————————————

The entire story can be read at:

https://innocenceproject.org/artificial-intelligence-is-putting-innocent-people-at-risk-of-being-incarcerated


PUBLISHER'S NOTE:  I am monitoring this case/issue/resource. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog;


SEE BREAKDOWN OF  SOME OF THE ON-GOING INTERNATIONAL CASES (OUTSIDE OF THE CONTINENTAL USA) THAT I AM FOLLOWING ON THIS BLOG,  AT THE LINK BELOW:  HL:


https://www.blogger.com/blog/post/edit/120008354894645705/4704913685758792985


FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."

Lawyer Radha Natarajan:

Executive Director: New England Innocence Project;


—————————————————————————————————


FINAL, FINAL WORD: "Since its inception, the Innocence Project has pushed the criminal legal system to confront and correct the laws and policies that cause and contribute to wrongful convictions. They never shied away from the hard cases — the ones involving eyewitness identifications, confessions, and bite marks. Instead, in the course of presenting scientific evidence of innocence, they've exposed the unreliability of evidence that was, for centuries, deemed untouchable." So true!

Christina Swarns: Executive Director: The Innocence Project;

---------------------------------------------------------


YET ANOTHER FINAL WORD:


David Hammond, one of Broadwater's attorneys who sought his exoneration, told the Syracuse Post-Standard, "Sprinkle some junk science onto a faulty identification, and it's the perfect recipe for a wrongful conviction."


https://deadline.com/2021/11/alice-sebold-lucky-rape-conviction-overturned-anthony-broadwater-12348801

————————————————————————————————