Tuesday, September 13, 2022

Eyewitness Identification: Machine ID versus human ID; Law professor Valena Beety demonstrates how both crime-fighting tools can be flawed and lead to wrongful convictions - particularly when they are employed together..."The two techniques are equally susceptible to producing wrongful convictions when police use either “without precautions,” according to Beety. “Contextual information is vital to whether a factfinder correctly interprets either type of evidence,” Beety wrote in an article published in the Duquesne University Law Review. But a growing tendency to combine machine and human identification methods without careful precautions further increases the possibility of error, Beety argued in her essay, entitled “Considering ‘Machine Testimony’: The Impact of Facial Recognition Software on Eyewitness Identifications.”

PUBLISHER'S NOTE: This Blog is interested in false eyewitness identification issues because wrongful identifications are at the heart of so many DNA-related exonerations in the USA and elsewhere - and because so much scientific research is being conducted with the goal of making the identification process more transparent and reliable - and less subject to deliberate manipulation. I have also reported far too many cases over the years - mainly cases lacking DNA evidence (or other forensic evidence pointing to the suspect) - where the identification is erroneous, in spite of the witness’s certainty that it is true, or where the police pressure the witness, or rig the identification process, in order to make a desired identification inevitable.
-----------------------------------------------------------------

Harold Levy: Publisher: The Charles Smith Blog.

PASSAGE OF THE DAY: "Eyewitness identification, in fact, remains a relatively unreliable identification method. Prosecutions may depend on eyewitness identification to secure a conviction, but the method lacks effective standards — the result of a spate of court cases that culminated in an uncritical legal embrace of eyewitness identification. Facial recognition software is also faulty, in ways comparable to eyewitness identification and unique in its software-specific danger. The technology — which compares two images and determines whether the same person is present in each image — relies on photo-matching software with “fundamental accuracy problems.” Additionally, “the use of facial recognition software is not always disclosed to the person ultimately charged with the offense,” Beety wrote. “This failure to disclose can be problematic, given the known inaccuracy of facial recognition software when used to identify people of color.”


-----------------------------------------------------------


STORY: "Flaws of Eyewitness ID Magnified by Facial Recognition Software: Researcher,"  Reported by  Crime report contributor  

Gist: "As courts and cops have turned to facial recognition software to identify criminals, identification by machines is matching eyewitness identification by humans as a criminal-legal tactic.

But both crime-fighting tools are flawed, particularly when they are employed together, asserts Valena E. Beety, a law professor at Arizona State University’s Sandra Day O’Connor College of Law.

The two techniques are equally susceptible to producing wrongful convictions when police use either “without precautions,” according to Beety.

“Contextual information is vital to whether a factfinder correctly interprets either type of evidence,” Beety wrote in an article published in the Duquesne University Law Review.

But a growing tendency to combine machine and human identification methods without careful precautions further increases the possibility of error, Beety argued in her essay, entitled “Considering ‘Machine Testimony’: The Impact of Facial Recognition Software on Eyewitness Identifications.”

Recognizing that facial recognition software has a “cascading influence” on eyewitness identification, Beety suggested that professional associations, such as the Organization of Scientific Area Committees, include eyewitness identification in their reviews of facial recognition software.

Such foresight, Beety maintained, may produce a “more robust examination and consideration of [the] software and its usage,” because the flaws of both methods are intertwined.

Eyewitness identification, in fact, remains a relatively unreliable identification method.

Prosecutions may depend on eyewitness identification to secure a conviction, but the method lacks effective standards — the result of a spate of court cases that culminated in an uncritical legal embrace of eyewitness identification.

Facial recognition software is also faulty, in ways comparable to eyewitness identification and unique in its software-specific danger. The technology — which compares two images and determines whether the same person is present in each image — relies on photo-matching software with “fundamental accuracy problems.”

Additionally, “the use of facial recognition software is not always disclosed to the person ultimately charged with the offense,” Beety wrote.

“This failure to disclose can be problematic, given the known inaccuracy of facial recognition software when used to identify people of color.”

Research has consistently demonstrated that racial bias is embedded in certain machine-based algorithms, leading to wrongful convictions; eyewitness identification is similarly marred by “cross-racial misidentification,” in which eyewitnesses struggle to identify people of a different race than their own.

Such flaws can have life-altering implications for people of color, because “white people have greater difficulty identifying people of color than vice versa,” Beety wrote.

“Police use of facial recognition software disproportionately affects Black Americans, Asian Americans, and Native Americans,” the article reads.

“While advocates of technology may claim these systems ‘do not see race,’ research now shows the incorrect identifications of people of color by these programs. Indeed, facial recognition is least accurate for Black women, even misidentifying their gender.”

Cascading Influence

The cascading influence that facial recognition technology has on eyewitnesses — the placing of facial recognition photos in a traditional photo lineup, for example — calls for interconnected solutions.

Beety suggested, for example, that police departments implement “neutralizing procedures” for show-ups or line-ups that include facial recognition software findings.

“[The] National Academy of Sciences, [in its report] ‘Identifying the Culprit: Assessing Eyewitness Identification,’ recommended that law enforcement agencies implement protocols such as using double-blind lineup and photo array procedures, developing and using standardized witness instructions, documenting witness statements, and recording the witness identification,” Beety continued.

Ultimately, advocates of a more just criminal-legal system should remain attuned to the intersections between identification by machines and identification by humans; such awareness may precipitate badly-needed checks on both.

“By recognizing the connections between machine and human identifications, we can work to enhance the reliability of both,” Beety concludes.""

To read the full article, click here.

-------------------------------------------

https://thecrimereport.org/2022/09/07/can-facial-recognition-be-fixed/

PUBLISHER'S NOTE: I am monitoring this case/issue/resource. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy: Publisher: The Charles Smith Blog;



SEE BREAKDOWN OF SOME OF THE ONGOING INTERNATIONAL CASES (OUTSIDE OF THE CONTINENTAL USA) THAT I AM FOLLOWING ON THIS BLOG, AT THE LINK BELOW: HL:




FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;

—————————————————————————————————

FINAL, FINAL WORD: "Since its inception, the Innocence Project has pushed the criminal legal system to confront and correct the laws and policies that cause and contribute to wrongful convictions. They never shied away from the hard cases — the ones involving eyewitness identifications, confessions, and bite marks. Instead, in the course of presenting scientific evidence of innocence, they've exposed the unreliability of evidence that was, for centuries, deemed untouchable." So true!
Christina Swarns: Executive Director: The Innocence Project;