Monday, April 10, 2023

FACIAL RECOGNITION (WRONGFUL IDENTIFICATION): The Marshall Project sounds the alarm: "Facial recognition software is increasingly ubiquitous in modern life, and odds are good that you have at least one social media account or mobile device that uses a version of this technology. Just yesterday, my phone’s photos app sent me a slideshow with pictures of me and a close friend over the years, all selected by artificial intelligence and set to upbeat music. But the technology is rapidly expanding beyond novelty and into public life. Retailers are increasingly using it to monitor shoplifting. Madison Square Garden, New York City’s famous venue, has also recently come under scrutiny for using facial recognition to keep out lawyers involved in lawsuits against the arena. Police are using it too. Hoan Ton-That, CEO of facial recognition firm Clearview AI, recently told the BBC that U.S. police have completed more than 1 million photo searches on the company’s platform. Clearview’s technology pairs facial recognition algorithms (which many companies offer) with a database of over 30 billion photos scraped from the internet — mostly from social media — without the consent of those photographed. The firm primarily markets the tool to local law enforcement."


QUOTE OF THE DAY: "Last week, the New York Times published the story of Randal Quran Reid, who was pulled over in Georgia in November 2022, and arrested for stealing designer handbags in Louisiana. Reid had never been to Louisiana and was the apparent victim of a mistaken ID by Clearview’s technology. Nevertheless, he spent six days in jail, and thousands of dollars on legal fees before it was sorted out. “Imagine you’re living your life and somewhere far away says you committed a crime,” Reid told the Times. “And you know you’ve never been there.” Perhaps the most troubling detail in Reid’s case is that nowhere in the documents used to arrest him — including the warrant signed by a judge — does it state that facial recognition was used."

-------------------------------------------------------------

PASSAGE ONE OF THE DAY: "Historically, these algorithms have performed worse on darker-skinned people than on White people. Others have argued that the technology is rapidly improving, and that racial bias concerns may be overblown. Last year, Wired Magazine, which has closely followed the rise of policing technology, told the stories of three other men — all Black — who were wrongfully arrested after false IDs. For all three, the arrests caused serious financial and emotional burdens. “Once I got arrested and I lost my job, it was like everything fell, like everything went down the drain,” Michael Oliver told Wired. Two of the cases took place in Detroit, where police have subsequently raised the standards for using facial recognition technology in criminal investigations. Police officials, like then-Detroit Police Chief James Craig, typically defend its use by noting that officers are only supposed to rely on the software to generate leads in an investigation — and not to determine who they arrest. But as a recent report from the Georgetown Law Center on Privacy and Technology noted in reference to this technology: “In the absence of case law or other guidance, it has in some cases been the primary, if not the only, piece of evidence linking an individual to the crime.” False identification in the justice system is a problem that long predates facial recognition technology. Just last month, Florida man Sidney Holmes — who spent more than 30 years behind bars — was exonerated after prosecutors determined that he was likely misidentified by an eyewitness in a lineup. According to the legal aid organization Innocence Project, eyewitness misidentification is the “leading factor” in wrongful convictions. As a 2022 study published in the Duquesne Law Review noted: “Wrongful convictions can occur when police use either of these identification methods without precautions.”"

-----------------------------------------------------------

PASSAGE TWO OF THE DAY: "In at least one case, the technology has helped prove the innocence of someone accused of a crime. That case, chronicled in detail by The New York Times, led to Clearview making its product available to public defenders, in order to “balance the scales of justice,” as the company’s Ton-That put it."

---------------------------------------------------------


POST: "Facial recognition software is increasingly ubiquitous in modern life, and odds are good that you have at least one social media account or mobile device that uses a version of this technology. Just yesterday, my phone’s photos app sent me a slideshow with pictures of me and a close friend over the years, all selected by artificial intelligence and set to upbeat music.

But the technology is rapidly expanding beyond novelty and into public life. Retailers are increasingly using it to monitor shoplifting. Madison Square Garden, New York City’s famous venue, has also recently come under scrutiny for using facial recognition to keep out lawyers involved in lawsuits against the arena.

Police are using it too. Hoan Ton-That, CEO of facial recognition firm Clearview AI, recently told the BBC that U.S. police have completed more than 1 million photo searches on the company’s platform.

Clearview’s technology pairs facial recognition algorithms (which many companies offer) with a database of over 30 billion photos scraped from the internet — mostly from social media — without the consent of those photographed. The firm primarily markets the tool to local law enforcement.

Last week, the New York Times published the story of Randal Quran Reid, who was pulled over in Georgia in November 2022, and arrested for stealing designer handbags in Louisiana. Reid had never been to Louisiana and was the apparent victim of a mistaken ID by Clearview’s technology. Nevertheless, he spent six days in jail, and thousands of dollars on legal fees before it was sorted out.

“Imagine you’re living your life and somewhere far away says you committed a crime,” Reid told the Times. “And you know you’ve never been there.”

Perhaps the most troubling detail in Reid’s case is that nowhere in the documents used to arrest him — including the warrant signed by a judge — does it state that facial recognition was used.

Historically, these algorithms have performed worse on darker-skinned people than on White people. Others have argued that the technology is rapidly improving, and that racial bias concerns may be overblown.

Last year, Wired Magazine, which has closely followed the rise of policing technology, told the stories of three other men — all Black — who were wrongfully arrested after false IDs. For all three, the arrests caused serious financial and emotional burdens. “Once I got arrested and I lost my job, it was like everything fell, like everything went down the drain,” Michael Oliver told Wired.

Two of the cases took place in Detroit, where police have subsequently raised the standards for using facial recognition technology in criminal investigations. Police officials, like then-Detroit Police Chief James Craig, typically defend its use by noting that officers are only supposed to rely on the software to generate leads in an investigation — and not to determine who they arrest. 

But as a recent report from the Georgetown Law Center on Privacy and Technology noted in reference to this technology: “In the absence of case law or other guidance, it has in some cases been the primary, if not the only, piece of evidence linking an individual to the crime.”

False identification in the justice system is a problem that long predates facial recognition technology. Just last month, Florida man Sidney Holmes — who spent more than 30 years behind bars — was exonerated after prosecutors determined that he was likely misidentified by an eyewitness in a lineup. According to the legal aid organization Innocence Project, eyewitness misidentification is the “leading factor” in wrongful convictions.

As a 2022 study published in the Duquesne Law Review noted: “Wrongful convictions can occur when police use either of these identification methods without precautions.”

In at least one case, the technology has helped prove the innocence of someone accused of a crime. That case, chronicled in detail by The New York Times, led to Clearview making its product available to public defenders, in order to “balance the scales of justice,” as the company’s Ton-That put it.

But critics said they are highly skeptical that the technology will ever have as much power to undo bad arrests as it does to generate them.

Internationally, the European Union, Australia, and Canada have all determined that Clearview’s technology violates their privacy laws. The company also had to agree not to sell its database to other companies under the terms of a 2022 settlement with Illinois for violating state privacy laws.

Meanwhile, in Russia and China, facial recognition has increasingly become a key instrument of state control. U.S. government agencies, including the FBI and Immigration and Customs Enforcement, have also been conducting their own research into the technology for law enforcement purposes. That includes a new app that migrants seeking asylum are now required to use to track their requests.

A 2021 government watchdog report found that the “use of facial recognition technology is widespread throughout the federal government, and many agencies do not even know which systems they are using,” according to The Washington Post. There have been congressional hearings and proposed legislation on the question since that time, but no concrete changes."

The entire post can be read at:

https://mail.google.com/mail/u/0/#inbox/FMfcgzGsltQNGGFMCxmNZlnjjWhzJVjt

PUBLISHER'S NOTE: I am monitoring this case/issue/resource. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy: Publisher: The Charles Smith Blog;

SEE BREAKDOWN OF SOME OF THE ON-GOING INTERNATIONAL CASES (OUTSIDE OF THE CONTINENTAL USA) THAT I AM FOLLOWING ON THIS BLOG, AT THE LINK BELOW: HL:


https://www.blogger.com/blog/post/edit/120008354894645705/4704913685758792985


FINAL WORD: (Applicable to all of our wrongful conviction cases): "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."


Lawyer Radha Natarajan:


Executive Director: New England Innocence Project;

—————————————————————————————————


FINAL, FINAL WORD: "Since its inception, the Innocence Project has pushed the criminal legal system to confront and correct the laws and policies that cause and contribute to wrongful convictions. They never shied away from the hard cases — the ones involving eyewitness identifications, confessions, and bite marks. Instead, in the course of presenting scientific evidence of innocence, they've exposed the unreliability of evidence that was, for centuries, deemed untouchable." So true!


Christina Swarns: Executive Director: The Innocence Project;


------------------------------------------------------------------


YET ANOTHER FINAL WORD:


David Hammond, one of the attorneys who sought Anthony Broadwater’s exoneration, told the Syracuse Post-Standard: “Sprinkle some junk science onto a faulty identification, and it’s the perfect recipe for a wrongful conviction.”


https://deadline.com/2021/11/alice-sebold-lucky-rape-conviction-overturned-anthony-broadwater-1234880143/


-------------------------------------------------------------------