Tuesday, August 8, 2023

Porcha Woodruff: Detroit, Michigan: Eight months pregnant and arrested after a false facial recognition match. New York Times reporter Kashmir Hill describes how Ms. Woodruff thought the police who showed up at her door to arrest her for carjacking were joking, and how she became the first woman known to be wrongfully accused as a result of facial recognition technology: "The ordeal started with an automated facial recognition search, according to an investigator’s report from the Detroit Police Department. Ms. Woodruff is the sixth person to report being falsely accused of a crime as a result of facial recognition technology used by police to match an unknown offender’s face to a photo in a database. All six people have been Black; Ms. Woodruff is the first woman to report it happening to her. It is the third case involving the Detroit Police Department, which runs, on average, 125 facial recognition searches a year, almost entirely on Black men, according to weekly reports about the technology’s use provided by the police to Detroit’s Board of Police Commissioners, a civilian oversight group. Critics of the technology say the cases expose its weaknesses and the dangers posed to innocent people. The Detroit Police Department “is an agency that has every reason to know of the risks that using face recognition carries,” said Clare Garvie, an expert on the technology at the National Association of Criminal Defense Lawyers. “And it’s happening anyway.” On Thursday, Ms. Woodruff filed a lawsuit for wrongful arrest against the city of Detroit in U.S. District Court for the Eastern District of Michigan."

PUBLISHER'S NOTE: This Blog is interested in false eyewitness identification issues because wrongful identifications are at the heart of so many DNA-related exonerations in the USA and elsewhere - and because so much scientific research is being conducted with the goal of making the identification process more transparent and reliable - and less subject to deliberate manipulation. I have also reported far too many cases over the years - mainly cases lacking DNA evidence (or other forensic evidence pointing to the suspect) - where the identification is erroneous, in spite of the witness's certainty that it is true, or where the police pressure the witness, or rig the identification process, in order to make a desired identification inevitable.

Harold Levy: Publisher: The Charles Smith Blog.

------------------------------------------------------------------

PASSAGE OF THE DAY: "Gary Wells, a psychology professor who has studied the reliability of eyewitness identifications, said pairing facial recognition technology with an eyewitness identification should not be the basis for charging someone with a crime. Even if that similar-looking person is innocent, an eyewitness who is asked to make the same comparison is likely to repeat the mistake made by the computer. “It is circular and dangerous,” Dr. Wells said. “You’ve got a very powerful tool that, if it searches enough faces, will always yield people who look like the person on the surveillance image.” Dr. Wells said the technology compounds an existing problem with eyewitnesses. “They assume when you show them a six-pack, the real person is there,” he said."

————————————————————————————————

PASSAGE TWO OF THE DAY: "The city of Detroit faces three lawsuits for wrongful arrests based on the use of the technology. “Shoddy technology makes shoddy investigations, and police assurances that they will conduct serious investigations do not ring true,” said Phil Mayor, a senior staff attorney at the American Civil Liberties Union of Michigan. Mr. Mayor represents Robert Williams, a Detroit man who was arrested in January 2020 for shoplifting based on a faulty facial recognition match, for which the prosecutor’s office later apologized. In his lawsuit, Mr. Williams is trying to get the city to agree to collect more evidence in cases involving automated face searches and to end what Mr. Mayor called the “facial recognition to line-up pipeline.” “This is an extremely dangerous practice that has led to multiple false arrests that we know of,” Mr. Mayor said."


———————————————————————————


PASSAGE THREE OF THE DAY: "Ms. Woodruff said she was stressed for the rest of her pregnancy. She had to go to the police station the next day to retrieve her phone, and appeared for court hearings twice by Zoom before the case was dismissed because of insufficient evidence. “It’s scary. I’m worried. Someone always looks like someone else,” said her attorney, Ivan L. Land. “Facial recognition is just an investigative tool. If you get a hit, do your job and go further. Knock on her door.” Ms. Woodruff said that she was embarrassed to be arrested in front of her neighbors and that her daughters were traumatized. They now tease her infant son that he was “in jail before he was even born.” The experience was all the more difficult because she was so far along in her pregnancy, but Ms. Woodruff said she feels lucky that she was. She thinks it convinced authorities that she did not commit the crime. The woman involved in the carjacking had not been visibly pregnant."


-------------------------------------------------------


STORY: "Eight Months Pregnant and Arrested After False Facial Recognition Match," by reporter Kashmir Hill, published by The New York Times on August 6, 2023. (Kashmir Hill is a tech reporter and the author of “Your Face Belongs To Us: A Secretive Startup’s Quest To End Privacy As We Know It.” She writes about the unexpected and sometimes ominous ways technology is changing our lives.)

SUB-HEADING: "Porcha Woodruff thought the police who showed up at her door to arrest her for carjacking were joking. She is the first woman known to be wrongfully accused as a result of facial recognition technology."


PHOTO CAPTION: “Porcha Woodruff, 32, of Detroit, said she gestured at her stomach when the police arrived at her house to indicate how ill-equipped she was to commit a robbery and carjacking.”


GIST: "Porcha Woodruff was getting her two daughters ready for school when six police officers showed up at her door in Detroit. They asked her to step outside because she was under arrest for robbery and carjacking.


“Are you kidding?” she recalled saying to the officers. Ms. Woodruff, 32, said she gestured at her stomach to indicate how ill-equipped she was to commit such a crime: She was eight months pregnant.


Handcuffed in front of her home on a Thursday morning last February, leaving her crying children with her fiancé, Ms. Woodruff was taken to the Detroit Detention Center.


 She said she was held for 11 hours, questioned about a crime she said she had no knowledge of, and had her iPhone seized to be searched for evidence.


“I was having contractions in the holding cell. My back was sending me sharp pains. I was having spasms. I think I was probably having a panic attack,” said Ms. Woodruff, a licensed aesthetician and nursing school student. “I was hurting, sitting on those concrete benches.”


After being charged in court with robbery and carjacking, Ms. Woodruff was released that evening on a $100,000 personal bond. 


In an interview, she said she went straight to the hospital where she was diagnosed with dehydration and given two bags of intravenous fluids. A month later, the Wayne County prosecutor dismissed the case against her.


The ordeal started with an automated facial recognition search, according to an investigator’s report from the Detroit Police Department. 


Ms. Woodruff is the sixth person to report being falsely accused of a crime as a result of facial recognition technology used by police to match an unknown offender’s face to a photo in a database. 


All six people have been Black; Ms. Woodruff is the first woman to report it happening to her.


It is the third case involving the Detroit Police Department, which runs, on average, 125 facial recognition searches a year, almost entirely on Black men, according to weekly reports about the technology’s use provided by the police to Detroit’s Board of Police Commissioners, a civilian oversight group. 


Critics of the technology say the cases expose its weaknesses and the dangers posed to innocent people.


The Detroit Police Department “is an agency that has every reason to know of the risks that using face recognition carries,” said Clare Garvie, an expert on the technology at the National Association of Criminal Defense Lawyers. “And it’s happening anyway.”


On Thursday, Ms. Woodruff filed a lawsuit for wrongful arrest against the city of Detroit in U.S. District Court for the Eastern District of Michigan.


“I have reviewed the allegations contained in the lawsuit. They are very concerning,” Detroit’s police chief, James E. White, said in a statement in response to questions from The New York Times. “We are taking this matter very seriously, but we cannot comment further at this time due to the need for additional investigation.”


The Wayne County prosecutor, Kym Worthy, considers the arrest warrant in Ms. Woodruff’s case to be “appropriate based upon the facts,” according to a statement issued by her office.


The Investigation

On a Sunday night two and a half weeks before police showed up at Ms. Woodruff’s door, a 25-year-old man called the Detroit police from a liquor store to report that he had been robbed at gunpoint, according to a police report included in Ms. Woodruff’s lawsuit.


The robbery victim told the police that he had picked up a woman on the street earlier in the day. 


He said that they had been drinking together in his car, first in a liquor store parking lot, where they engaged in sexual intercourse, and then at a BP gas station. 


When he dropped her off at a spot 10 minutes away, a man there to meet her produced a handgun, took the victim’s wallet and phone, and fled in the victim’s Chevy Malibu, according to the police report.





Days later, the police arrested a man driving the stolen vehicle. A woman who matched the description given by the victim dropped off his phone at the same BP gas station, the police report said.


A detective with the police department’s commercial auto theft unit got the surveillance video from the BP gas station, the police report said, and asked a crime analyst at the department to run a facial recognition search on the woman.


According to city documents, the department uses a facial recognition vendor called DataWorks Plus to run unknown faces against a database of criminal mug shots; the system returns matches ranked by their likelihood of being the same person.


 A human analyst is ultimately responsible for deciding if any of the matches are a potential suspect. 
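The match-and-rank process the article describes can be sketched generically. The snippet below is a minimal illustration of embedding-based face search, assuming cosine similarity over invented random vectors; it is not a description of DataWorks Plus's actual software, and every name and number in it is hypothetical.

```python
import math
import random

# Hypothetical 128-dimensional face embeddings. A real system would
# compute these with a trained face-recognition model; the random
# vectors here are stand-ins for illustration only.
random.seed(0)
def random_embedding():
    return [random.gauss(0, 1) for _ in range(128)]

database = {f"mugshot_{i}": random_embedding() for i in range(1000)}
probe = random_embedding()  # the unknown face from the surveillance video

def cosine(a, b):
    # Cosine similarity: dot product divided by the vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Score every mug shot against the probe and rank by similarity.
ranked = sorted(database, key=lambda name: cosine(probe, database[name]),
                reverse=True)

# The system returns only the top-ranked candidates; a human analyst
# still has to judge whether any of them is a plausible lead.
top_five = [(name, round(cosine(probe, database[name]), 3))
            for name in ranked[:5]]
print(top_five)
```

Note that such a search always returns the most similar faces it has, whether or not the person in the probe image is in the database at all.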


The police report said the crime analyst gave the investigator Ms. Woodruff’s name based on a match to a 2015 mug shot. Ms. Woodruff said in an interview that she had been arrested in 2015 after being pulled over while driving with an expired license.




Five days after the carjacking, the police report said, the detective assigned to the case asked the victim to look at the mug shots of six Black women, commonly called a “six-pack photo lineup.” Ms. Woodruff’s photo was among them. He identified Ms. Woodruff as the woman he had been with. That was the basis for her arrest, according to the police report. (The police did not say whether another woman has since been charged in the case.)


Gary Wells, a psychology professor who has studied the reliability of eyewitness identifications, said pairing facial recognition technology with an eyewitness identification should not be the basis for charging someone with a crime. 


Even if that similar-looking person is innocent, an eyewitness who is asked to make the same comparison is likely to repeat the mistake made by the computer.


“It is circular and dangerous,” Dr. Wells said. “You’ve got a very powerful tool that, if it searches enough faces, will always yield people who look like the person on the surveillance image.”


Dr. Wells said the technology compounds an existing problem with eyewitnesses. “They assume when you show them a six-pack, the real person is there,” he said.
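Dr. Wells's warning that a large enough search will "always yield people who look like the person" is, at bottom, a statistical point, and a rough calculation makes it concrete. The per-face false-match rate used below is an invented figure for illustration, not a measured property of any deployed system.

```python
# If each unrelated mug shot independently has some small probability p
# of scoring as a "match" to the probe face, then the probability that a
# database of n faces produces at least one false match is 1 - (1 - p)**n.
def prob_at_least_one_false_match(p, n):
    return 1 - (1 - p) ** n

# With a hypothetical per-face false-match rate of 1 in 100,000:
p = 1e-5
for n in (10_000, 100_000, 1_000_000):
    print(f"{n:>9} faces searched -> {prob_at_least_one_false_match(p, n):.3f}")
```

Under these illustrative assumptions, searching a million faces makes at least one false match a near-certainty, which is exactly why a hit alone should not be treated as evidence.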


Serious Consequences

The city of Detroit faces three lawsuits for wrongful arrests based on the use of the technology.

“Shoddy technology makes shoddy investigations, and police assurances that they will conduct serious investigations do not ring true,” said Phil Mayor, a senior staff attorney at the American Civil Liberties Union of Michigan.


Mr. Mayor represents Robert Williams, a Detroit man who was arrested in January 2020 for shoplifting based on a faulty facial recognition match, for which the prosecutor’s office later apologized.


In his lawsuit, Mr. Williams is trying to get the city to agree to collect more evidence in cases involving automated face searches and to end what Mr. Mayor called the “facial recognition to line-up pipeline.”


“This is an extremely dangerous practice that has led to multiple false arrests that we know of,” Mr. Mayor said.


The Toll

Ms. Woodruff said she was stressed for the rest of her pregnancy. She had to go to the police station the next day to retrieve her phone, and appeared for court hearings twice by Zoom before the case was dismissed because of insufficient evidence.


“It’s scary. I’m worried. Someone always looks like someone else,” said her attorney, Ivan L. Land. “Facial recognition is just an investigative tool. If you get a hit, do your job and go further. Knock on her door.”


Ms. Woodruff said that she was embarrassed to be arrested in front of her neighbors and that her daughters were traumatized. They now tease her infant son that he was “in jail before he was even born.”


The experience was all the more difficult because she was so far along in her pregnancy, but Ms. Woodruff said she feels lucky that she was. She thinks it convinced authorities that she did not commit the crime. The woman involved in the carjacking had not been visibly pregnant."


The entire story can be read at:

https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html

PUBLISHER'S NOTE: I am monitoring this case/issue/resource. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy: Publisher: The Charles Smith Blog;

SEE BREAKDOWN OF SOME OF THE ON-GOING INTERNATIONAL CASES (OUTSIDE OF THE CONTINENTAL USA) THAT I AM FOLLOWING ON THIS BLOG, AT THE LINK BELOW: HL

https://www.blogger.com/blog/post/edit/120008354894645705/47049136857587929

FINAL WORD: (Applicable to all of our wrongful conviction cases): "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."

Lawyer Radha Natarajan;

Executive Director: New England Innocence Project;

—————————————————————————————————


FINAL, FINAL WORD: "Since its inception, the Innocence Project has pushed the criminal legal system to confront and correct the laws and policies that cause and contribute to wrongful convictions. They never shied away from the hard cases — the ones involving eyewitness identifications, confessions, and bite marks. Instead, in the course of presenting scientific evidence of innocence, they've exposed the unreliability of evidence that was, for centuries, deemed untouchable." So true!


Christina Swarns: Executive Director: The Innocence Project;


------------------------------------------------------------------


YET ANOTHER FINAL WORD:


David Hammond, one of Broadwater’s attorneys who sought his exoneration, told the Syracuse Post-Standard, “Sprinkle some junk science onto a faulty identification, and it’s the perfect recipe for a wrongful conviction.”


https://deadline.com/2021/11/alice-sebold-lucky-rape-conviction-overturned-anthony-broadwater-1234880143/

------------------------------------------------------------