Tuesday, June 30, 2020

Technology: Who is the activist who has set out to dismantle racist police algorithms? The MIT Technology Review has the answer - and the interview with him by reporters Tate Ryan-Mosley and Jennifer Strong makes a very interesting read. HL.



BACKGROUND: TECHNOLOGY: In the last several years I have been spending considerably more time than usual on applications of rapidly developing technology in the criminal justice process that could affect the quality of the administration of justice - for better, or, most often, for worse. First, of course, predictive policing (AKA PredPol) made its entrance, at its most extreme promising the ability to identify a criminal act before it occurred. At its most modest level, it offered police a better sense of where certain crimes were occurring in the community being policed - knowledge that the seasoned beat officer had intuited through everyday police work years earlier. PredPol has lost some of its lustre as police departments discovered that the expense of acquiring and using the technology was not justified. Then we entered a period in which algorithms became popular with judges for use at bail hearings and at sentencing. In my eyes, these judges were just passing the buck to the machine when they could have, and should have, made their decisions based on information they received in open court - not from algorithms that were noxious for their secrecy, because the manufacturers did not want to reveal their trade secrets - even in a courtroom where an accused person’s liberty and reputation were on the line. These bail and sentencing algorithms have come under attack in many jurisdictions for discriminating against minorities and are hopefully on the way out. Lastly, facial recognition technology has become a concern to this Blog because of its proven ability to sweep up huge numbers of people, lead to wrongful arrests and prosecutions, and discriminate racially. May we never forget that a huge, extremely well-funded, powerful, often politically connected industry is pushing the for-profit use of all these technologies in the criminal justice systems - and, hopefully, in the post-George Floyd aftermath, will be more concerned with the welfare of the community than its bottom line. HL.

---------------------------------

INTERVIEW: "The activist dismantling racist police algorithms," by Reporters Tate Ryan-Mosley and Jennifer Strong, published by The MIT Technology Review on  June 5, 2020.

PREFACE: Hamid Khan has been a community organizer in Los Angeles for over 35 years, with a consistent focus on police violence and human rights. He talked to us on April 3, 2020, for a forthcoming podcast episode about artificial intelligence and policing. As the world turns its attention to police brutality and institutional racism, we thought our conversation with him about how he believes technology enables racism in policing should be published now.  
Khan is the founder of the Stop LAPD Spying Coalition, which has won many landmark court cases on behalf of the minority communities it fights for. Its work is perhaps best known for advocacy against predictive policing. On April 21, a few weeks after this interview, the LAPD announced an end to all predictive policing programs.
Khan is a controversial figure who has turned down partnerships with groups like the Electronic Frontier Foundation (EFF) because of its emphasis on reform. He doesn’t believe reform will work. The interview has been edited for length and clarity. 

GIST: "Tell us about your work. Why do you care about police surveillance?

The work that we do, particularly looking at the Los Angeles Police Department, looks at how surveillance, information gathering, storing, and sharing has historically been used to really cause harm, to trace, track, monitor, stalk particular communities: communities who are poor, who are black and brown, communities who would be considered suspect, and queer trans bodies. So on various levels, surveillance is a process of social control. 

Do you believe there is a role for technology in policing?

The Stop LAPD Spying Coalition has a few guiding values. The first one is that what we are looking at is not a moment in time but a continuation of history. Surveillance has been used for hundreds of years. Some of the earliest surveillance processes go back to lantern laws in New York City in the early 1700s. If you were an enslaved person, a black or an indigenous person, and if you were walking out into the public area without your master, you had to walk with an actual literal lantern, with the candle wick and everything, to basically self-identify yourself as a suspect, as the “other.”
Another guiding value is that there’s always an “other.” Historically speaking, there’s always a “threat to the system.” There's always a body, an individual, or groups of people that are deemed dangerous. They are deemed suspect. 
The third value is that we are always looking to de-sensationalize the rhetoric of national security. To keep it very simple and straightforward, [we try to show] how the information-gathering and information-sharing environment moves and how it’s a process of keeping an eye on everybody.


“Algorithms have no place in policing.”

And one of our last guiding values is that our fight is rooted in human rights. We are fiercely an abolitionist group, so our goal is to dismantle the system. We don’t engage in reformist work. We also consider any policy development around transparency, accountability, and oversight a template for mission creep. Any time surveillance gets legitimized, then it is open to be expanded over time. Right now, we are fighting to keep the drones grounded in Los Angeles, and we were able to keep them grounded for a few years. And in late March, the Chula Vista Police Department in San Diego announced that they are going to equip their drones with loudspeakers to monitor the movement of unhoused people.

Can you explain the work the Stop LAPD Spying Coalition has been doing on predictive policing? What are the issues with it from your perspective?

PredPol was location-based predictive policing in which a 500-by-500-foot location was identified as a hot spot. The other companion program, Operation Laser, was person-based predictive policing.
In 2010, we looked at the various ways that these [LAPD surveillance] programs were being instituted. Predictive policing was a key program. We formally launched a campaign in 2016 to understand the impact of predictive policing in Los Angeles with the goal to dismantle the program, to bring this information to the community and to fight back.
Person-based predictive policing claimed that for individuals who are called “persons of interest” or “habitual offenders,” who may have had some history in the past, we could use a risk assessment tool to establish that they were going to recidivate. So it was a numbers game. If they had any gun possession in the past, they were assigned five points. If they were on parole or probation, they were assigned five points. If they were gang-affiliated, they were assigned five points. If they’d had interactions with the police like a stop-and-frisk, they would be assigned one point. And this became where individuals who were on parole or probation or minding their own business and rebuilding their lives were then placed in what became known as a Chronic Offender Program, unbeknownst to many people.
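The point system described above is simple enough to express in a few lines. The sketch below is a hypothetical illustration of that kind of additive risk score; the field names and structure are assumptions made for clarity, and only the point values come from Khan's description - the actual Operation Laser scoring formula is not set out in the interview.

def chronic_offender_score(record):
    """Hypothetical additive point score of the kind Khan describes."""
    score = 0
    if record.get("prior_gun_possession"):
        score += 5
    if record.get("on_parole_or_probation"):
        score += 5
    if record.get("gang_affiliated"):
        score += 5
    # Each documented police contact, such as a stop-and-frisk, adds one point.
    score += record.get("police_stops", 0)
    return score

# Example: a person on probation with two prior stops scores 7 points -
# high enough to climb such a list without any new criminal conduct.
print(chronic_offender_score({"on_parole_or_probation": True, "police_stops": 2}))

Nothing in a score like this distinguishes a new offence from routine police contact, which is precisely the "numbers game" Khan objects to.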





“So location gets criminalized, people get criminalized, and it’s only a few seconds away before the gun comes out and somebody gets shot and killed.”

Then, based on this risk assessment, where Palantir is processing all the data, the LAPD created a list. They  started releasing bulletins, which were like a Most Wanted poster with these individuals’ photos, addresses, and history as well, and put them in patrol cars. [They] started deploying license plate readers, the stingray, the IMSI-Catcher, CCTV, and various other tech to track their movements, and then creating conditions on the ground to stop and to harass and intimidate them. We built a lot of grassroots power, and in April 2019 Operation Laser was formally dismantled. It was discontinued.
And right now we are going after PredPol and demanding that PredPol be dismantled as well. [LAPD announced an end to PredPol on April 21, 2020.] Our goal for the abolition and dismantlement of this program is not just rooted in garbage in, garbage out; racist data in and racist data out. Our work is really rooted in that it ultimately serves the whole ideological framework of patriarchy and capitalism and white supremacy and settler colonialism.
We released a report, “Before the Bullet Hits the Body,” in May 2018 on predictive policing in Los Angeles, which led to the city of Los Angeles holding public hearings on data-driven policing, which were the first of their kind in the country. We demanded a forensic audit of PredPol by the inspector general. In March 2019, the inspector general released the audit and it said that we cannot even audit PredPol because it’s just not possible. It’s so, so complicated.
Algorithms have no place in policing. I think it’s crucial that we understand that there are lives at stake. This language of location-based policing is by itself a proxy for racism. They’re not there to police potholes and trees. They are there to police people in the location. So location gets criminalized, people get criminalized, and it’s only a few seconds away before the gun comes out and somebody gets shot and killed.


How do you ensure that the public understands these kinds of policing tactics? 

Public records are a really good tool to get information. What is the origin of this program? We want to know: What was the vision? How was it being articulated? What is the purpose for the funding? What is the vocabulary that they’re using? What are the outcomes that they’re presenting to the funder? 



“I’m a human, and I am not here that you just unpack me and just start experimenting on me and then package me.”

They [the LAPD] would deem an area, an apartment building, as hot spots and zones. And people were being stopped at a much faster pace [there]. Every time you stop somebody, that information goes into a database. It became a major data collection program. 
We demanded that they release the secret list that they had of these individuals. LAPD fought back, and we did win that public records lawsuit. So now we have a secret list of 679 individuals, which we’re now looking to reach out to. And these are all young individuals, predominantly about 90% to 95% black and brown. 
Redlining the area creates conditions on the ground for more development, more gentrification, more eviction, more displacement of people. So the police became protectors of private property and protectors of privilege.

What do you say to people who believe technology can help mitigate some of these issues in policing, such as biases, because technology can be objective? 

First of all, technology is not operating by itself. From the design to the production to the deployment to the outcome, there is constantly bias built in. It’s not just the biases of the people themselves; it’s the inherent bias within the system.
There’s so many points of influence that, quite frankly, our fight is not for cleaning up the data. Our fight is not for an unbiased algorithm, because we don’t believe that even mathematically, there could be an unbiased algorithm for policing at all.

What are the human rights considerations when it comes to police technology and surveillance?

The first human right would be to stop being experimented on. I’m a human, and I am not here that you just unpack me and just start experimenting on me and then package me. There’s so much datafication of our lives that has happened. From plantation capitalism to racialized capitalism to now surveillance capitalism as well, we are subject to being bought and sold. Our minds and our thoughts have been commodified. It has a dumbing-down effect as well on our creativity as human beings, as a part of a natural universe. Consent is being manufactured out of us.

With something like coronavirus, we certainly are seeing that some people are willing to give up some of their data and some of their privacy. What do you think about the choice or trade-off between utility and privacy? 

We have to really look at it through a much broader lens.  Going back to one of our guiding values: not a moment in time but a continuation of history. So we have to look at crises in the past, both real and concocted. 
Let's look at the 1984 Olympics in Los Angeles. That led to the most massive expansion of police powers and militarization of the Los Angeles Police Department and the sheriff’s department under the guise of public safety. The thing was “Well, we want to keep everything safe.” But not only [did] it become a permanent feature and the new normal, but tactics were developed as well. Because streets had to be cleaned up, suspect bodies, unhoused folks, were forcibly removed. Gang sweeps supposedly started happening. So young black and brown youth were being arrested en masse. This is like 1983, leading to 1984.
By 1986-1987 in Los Angeles, gang injunctions became a permanent feature. This resulted in massive gang databases, and children as young as nine months old going into these gang databases. That became Operation Hammer, where they had gotten tanks and armored vehicles, used by SWAT, for serving warrants on low-level drug offenses, and going down and breaking down people’s homes.
Now we are again at a moment. It’s not just the structural expansion of police powers; we have to look at police now increasingly taking on roles as social workers.  It’s been building over the last 10 years. There’s a lot of health and human services dollars attached to that too. For example, in Los Angeles, the city controller came out with an audit about five years ago, and they looked at $100 million for homeless services that the city provides. Well, guess what? Out of that, $87 million was going to LAPD.  

Can you provide a specific example of how police use of technology is impacting community members?

Intelligence-led policing is a concept that comes out of England, out of the Kent Constabulary, and started about 30 years ago in the US. The central theme of intelligence-led policing is behavioral surveillance.  People’s behavior needs to be monitored, and then be processed, and that information needs to be shared. People need to be traced and tracked.  




“There is no such thing as kinder, gentler racism, and these programs have to be dismantled.”

One program called Suspicious Activity Reporting came out of 9/11, in which several activities which are completely constitutionally protected are listed as potentially suspicious. For example, taking photographs in public, using video cameras in public, walking into infrastructure and asking about hours of operations. It’s observed behavior reasonably indicative of preoperational planning of criminal and/or terrorist activity. So you’re observing somebody’s behavior, which reasonably indicates there is no probable cause. It creates not a fact, but a concern. That speculative and hunch-based policing is real.  
We were able to get numbers from LAPD’s See Something, Say Something program. And what we found was that there was a 3:1 disparate impact on the black community. About 70% of these See Something, Say Something reports came from predominantly white communities in Los Angeles. So now a program is being weaponized and becomes a license to racially profile.
The goal is always to be building power toward abolition of these programs, because you can’t reform them. There is no such thing as kinder, gentler racism, and these programs have to be dismantled.

So, you really think that reform won’t allow for use of these technologies in policing?

I can only speak about my own history of 35 years of organizing in LA. It’s not a matter of getting better, it’s a matter of getting worse. And I think technology is furthering that. When you look at the history of reform, we keep on hitting our head against the wall, and it just keeps on coming back to the same old thing. We can’t really operate under the assumption that hearts and minds can change, particularly when somebody has a license to kill.
I’m not a technologist. Our caution is for the technologists: you know, stay in your lane. Follow the community and follow their guidance."

The entire story can be read at: 

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog;
-----------------------------------------------------------------
FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;
------------------------------------------------------------------

Monday, June 29, 2020

Billy Joe Wardlow: Texas: Change.org petition sends message to Governor Greg Abbott as July 8 rapidly approaches..."We, the undersigned, believe that Billy Wardlow's death warrant should be rescinded, that he be granted a new sentencing trial where the mitigating factors are openly looked at with unbiased eyes, and that the execution of Billy Wardlow not go forward under any circumstances."


PUBLISHER'S NOTE: I just signed. Over to you.

Harold Levy: Publisher: The Charles Smith Blog.

-----------------------------------------------------------

BACKGROUND:  He received his death sentence in 1993 for killing an 82-year-old man during a robbery when he was 18 years old. Here's what the jury was told about whether Wardlow would constitute a future danger to others in prison: The most chilling testimony for the state came from Royce Smithey, an investigator for a group that prosecutes felony crimes committed in Texas prisons. If the jury sentenced Wardlow to death, the investigator said, he would be “segregated” and “severely restricted” until he was executed. He would have limited access to prison employees whom he might harm. Solitary confinement on death row would punish Wardlow and protect prison employees from the continuing danger he represented, Smithey testified. But if the jury gave him a life sentence, he asserted, Wardlow would be released into the general prison population with other felony offenders. Recently, Frank G. Aubuchon, who was a correctional officer and an administrator with the Texas Department of Criminal Justice (TDCJ) for more than 26 years, reviewed Smithey’s testimony at the request of Wardlow’s current lawyers. Aubuchon wrote, “Mr. Smithey’s multiple falsehoods served to mislead the jury into believing that TDCJ would be completely unprepared to imprison Mr. Wardlow in a secure environment unless he received a death sentence. Based on my decades of experience as a TDCJ corrections officer, administrator, and prison classifications expert, I can say that this is categorically false.”

------------------------------------------------------------------------------

GIST: "Billy Wardlow, at this moment, has sat for up to 23 hours a day alone in a 6 x 8-foot cell with virtually no human contact or exposure to natural light and has been for 25 years. He is condemned to death on Texas Death Row awaiting his execution date which is scheduled for July 8th 2020. 
Billy has done the right thing and admitted his wrongdoing the night the crime was committed but should he be put to death? Absolutely NOT! Were you the same person you are today, a quarter of a century ago? Have you matured, grown and changed for the better? Billy has!
There are so many mitigating factors as to why Billy should not only be out already but never be put to death in the name of justice! He was overly prosecuted and the wrongs that the state has made against him need to come to light so his sentence can be finally commuted and real justice served.

Under Texas law, if there is enough mitigating evidence at any time to warrant a life sentence without parole, that’s the sentence that is supposed to be given. This has not happened in Billy's case. At the trial, a chance of a life sentence was stolen from him due to many mistaken assumptions as well as false testimony.
We are here to ask that you become a voice for Billy. Help save his life! Please sign this petition to ask Governor Greg Abbott, Texas Board of Parole and Pardons as well as the Supreme Court of The United States, to carefully consider looking at Billy's case and resentencing him fairly. Do what is right by showing mercy by giving him his chance at LIFE.
To learn more about Billy's case please take a look at this wonderful article by The American Scholar. https://theamericanscholar.org/this-man-should-not-be-executed
We, the undersigned, believe that Billy Wardlow's death warrant should be rescinded, that he be granted a new sentencing trial where the mitigating factors are openly looked at with unbiased eyes, and that the execution of Billy Wardlow not go forward under any circumstances."
The petition can be accessed for signature at:
https://www.change.org/p/greg-abbott-save-billy-wardlow

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog;
-----------------------------------------------------------------
FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;
------------------------------------------------------------------

Robert Xie: Australia: On-going appeal. (Monday): "Evidence related to a possible murder weapon shouldn't have been put before the jury at the trial of Robert Xie, who was convicted of bludgeoning five relatives to death, his appeal has been told. The trial judge erred in admitting "coincidence" evidence in relation to a folded, blood-soaked piece of cloth found at the crime scene and a "massage device" found at Xie's home, his barrister Tom Quilter said."


BACKGROUND: "On July 18, 2009, newsagent Min Lin, 45, Mr Lin’s wife Yun Li “Lily” Lin, 44, their sons Henry, 12, and Terry, 9, and Mrs Lin’s sister, Yun Bin “Irene” Lin, 39, are found dead in their North Epping home. Police investigations over the next six months fail to find clues or culprits. In January 2010, police set up surveillance on Robert Xie: cameras and listening devices installed in his house and car. Still nothing. On May 5, 2011, Robert Xie is arrested and charged with five counts of murder. There was no direct evidence that Robert Xie viciously murdered the five members of his wife’s family nor any credible circumstantial evidence. In the absence of any durable evidence that pointed to Robert Xie, the police exhibited tunnel vision to focus on Xie. The prosecution continued the process and made much of a DNA sample (‘stain 91’) taken from the Xie family home garage floor, 200 metres from the Lin family’s house, the scene of the murders. Expert witnesses provided extensive but conflicting testimony, and in the end, none of them could exclude young Brenda Lin from the DNA sample; but Brenda was overseas on a school excursion at the time of the murders. In stain91 six profiles were found with scores exceeding 4000; they were the six members of the family – yet there were only 5 victims. Since Brenda was out of the country at the time, it is virtually impossible for the DNA in stain91 to have originated from the crime scene. The appeal against his 2017 conviction has been delayed by the Crown, most recently in October 2019 and is expected to be heard in 2020."



-------------------------------------------------------------------

PASSAGE OF THE DAY: "On Monday, the fifth hearing day, Mr Quilter said the murder weapon was never located by police nor seen by any witness. But at trial the Crown successfully applied to adduce "coincidence evidence" asserting similarities between a piece of blood-soaked cloth found in one of the bedrooms, and a folded towel on a homemade massage device found at Xie's home in May 2010. The Crown had contended the murders were committed with a hammer-like implement and that the blood-soaked piece of cloth was at one time secured to the head of the implement with a red rubber band. The massage device featured towelling which was folded over a protruding bolt and secured in place with a red rubber band. Mr Quilter said while the defence agreed Xie constructed the massage device, it very much disputed he constructed a murder weapon. "If the applicant's construction of the massage device, through tendency or coincidence reasoning, was admissible to prove that he affixed the piece of cloth to the murder weapon, then this would be evidence that the applicant committed the murders," Mr Quilter said. The defence argued there was no proper foundation for an inference that the piece of cloth was attached to the murder weapon. Further, the fundamentally different purposes of the two objects - the massage device had a remedial purpose, while the hypothesised murder weapon was used to inflict harm - reduced "the probative value of the coincidence evidence", Mr Quilter said. He said different types of fabric, which were different sizes, were used in the two objects and no meaningful comparison was able to be made of the two rubber bands, other than they were both red. The cloth found at the crime scene had been folded twice, while the hand towel on the massage device was folded at least three times."
-----------------------------------------------------------------------

STORY: "Sydney  family killer's appeal continues," by Associated Press Reporter  Margaret Scheikowski,  published by The Associated Press on June 29, 2020.


PHOTO CAPTION: "Robert Xie  was given five life sentences for murdering three adults and two children."
GIST: "Evidence related to a possible murder weapon shouldn't have been put before the jury at the trial of Robert Xie, who was convicted of bludgeoning five relatives to death, his appeal has been told.
The trial judge erred in admitting "coincidence" evidence in relation to a folded, blood-soaked piece of cloth found at the crime scene and a "massage device" found at Xie's home, his barrister Tom Quilter said.
After four trials, Xie was found guilty in 2017 of murdering three adults and two children in the bedrooms of their Sydney home in the early hours of July 18, 2009.
Xie's newsagent brother-in-law Min Lin, 45, his wife Lily Lin, 43, the couple's sons Henry, 12, and nine-year-old Terry, and Lily's 39-year-old sister, Irene, suffered horrific head injuries when they were attacked with what the Crown contended to be a hammer like object attached to a rope.
The now 56-year-old Xie is behind bars for the rest of his life and is appealing his convictions on seven grounds in the NSW Court of Criminal Appeal.
On Monday, the fifth hearing day, Mr Quilter said the murder weapon was never located by police nor seen by any witness.
But at trial the Crown successfully applied to adduce "coincidence evidence" asserting similarities between a piece of blood-soaked cloth found in one of the bedrooms, and a folded towel on a homemade massage device found at Xie's home in May 2010.
The Crown had contended the murders were committed with a hammer-like implement and that the blood-soaked piece of cloth was at one time secured to the head of the implement with a red rubber band.
The massage device featured towelling which was folded over a protruding bolt and secured in place with a red rubber band.
Mr Quilter said while the defence agreed Xie constructed the massage device, it very much disputed he constructed a murder weapon.
"If the applicant's construction of the massage device, through tendency or coincidence reasoning, was admissible to prove that he affixed the piece of cloth to the murder weapon, then this would be evidence that the applicant committed the murders," Mr Quilter said.
The defence argued there was no proper foundation for an inference that the piece of cloth was attached to the murder weapon.
Further, the fundamentally different purposes of the two objects - the massage device had a remedial purpose, while the hypothesised murder weapon was used to inflict harm - reduced "the probative value of the coincidence evidence", Mr Quilter said.
He said different types of fabric, which were different sizes, were used in the two objects and no meaningful comparison was able to be made of the two rubber bands, other than they were both red.
The cloth found at the crime scene had been folded twice, while the hand towel on the massage device was folded at least three times.
The appeal is continuing."
The entire story can be read at:
https://www.newcastlestar.com.au/story/6811545/sydney-family-killers-appeal-continues/

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog;
-----------------------------------------------------------------
FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;
------------------------------------------------------------------

Robert Julian-Borchak Williams: Michigan: Aftermath 4: The ACLU (American Civil Liberties Union) goes to bat for an innocent man who was arrested, "because face recognition can't tell black people apart..."Detroit police handcuffed Robert on his front lawn in front of his wife and two terrified girls, ages two and five. The police took him to a detention center about forty minutes away, where he was locked up overnight in a cramped and filthy cell. Robert’s fingerprints, DNA sample, and mugshot were put on file. After an officer acknowledged during an interrogation the next afternoon that “the computer must have gotten it wrong,” Robert was finally released — nearly 30 hours after his arrest. Still, the government continues to stonewall Robert’s repeated attempts to learn more about what led to his wrongful arrest, in violation of a court order and of its obligations under the Michigan Freedom of Information Act."


BACKGROUND: TECHNOLOGY: In the last several years I have been spending considerably more time than usual on applications of rapidly developing technology in the criminal justice process that could affect the quality of the administration of justice - for better, or, most often, for worse. First, of course, predictive policing (AKA PredPol) made its entrance, at its most extreme promising the ability to identify a criminal act before it occurred. At its most modest level, it offered police a better sense of where certain crimes were occurring in the community being policed - knowledge that the seasoned beat officer had intuited through everyday police work years earlier. PredPol has lost some of its lustre as police departments discovered that the expense of acquiring and using the technology was not justified. Then we entered a period in which algorithms became popular with judges for use at bail hearings and at sentencing. In my eyes, these judges were just passing the buck to the machine when they could have, and should have, made their decisions based on information they received in open court - not from algorithms that were noxious for their secrecy, because the manufacturers did not want to reveal their trade secrets - even in a courtroom where an accused person’s liberty and reputation were on the line. These bail and sentencing algorithms have come under attack in many jurisdictions for discriminating against minorities and are hopefully on the way out. Lastly, facial recognition technology has become a concern to this Blog because of its proven ability to sweep up huge numbers of people, lead to wrongful arrests and prosecutions, and discriminate racially. May we never forget that a huge, extremely well-funded, powerful, often politically connected industry is pushing the for-profit use of all these technologies in the criminal justice systems - and, hopefully, in the post-George Floyd aftermath, will be more concerned with the welfare of the community than its bottom line. HL.

---------------------------------------------------------------

BACKGROUND: "A lot of technology is pretty dumb, but we think it’s smart. My colleague Kashmir Hill showed the human toll of this mistake. Her article detailed how Robert Julian-Borchak Williams, a black man in Michigan, was accused of shoplifting on the basis of flawed police work that relied on faulty facial recognition technology. The software showed Williams’s driver’s license photo among possible matches with the man in the surveillance images, leading to Williams’s arrest in a crime he didn’t commit. (In response to Kash’s article, prosecutors apologizedfor what happened to Williams and said he could have his case expunged.) Kash talked to me about how this happened, and what the arrest showed about the limits and accuracy of facial recognition technology.

Reporter Shira Ovide. New York Times.

-----------------------------------------------------------------------------------------

PASSAGE OF THE DAY: “Every step the police take after an identification — such as plugging Robert’s driver’s license photo into a poorly executed and rigged photo lineup — is informed by the false identification and tainted by the belief that they already have the culprit,” said Victoria Burton-Harris and Phil Mayor, attorneys representing Robert Williams, in an ACLU blog post published today. “Evidence to the contrary — like the fact that Robert looks markedly unlike the suspect, or that he was leaving work in a town 40 minutes from Detroit at the time of the robbery — is likely to be dismissed, devalued, or simply never sought in the first place…When you add a racist and broken technology to a racist and broken criminal legal system, you get racist and broken outcomes. When you add a perfect technology to a broken and racist legal system, you only automate that system's flaws and render it a more efficient tool of oppression.”

-------------------------------------------

RELEASE: "Man wrongfully arrested because  face recognition can't tell black people apart,"  released by The ACLU (American Civil Liberties Associatio on June 24, 2020.

GIST: "Robert Williams, a Black man and Michigan resident, was wrongfully arrested because of a false face recognition match, according to an administrative complaint filed today by the American Civil Liberties Union of Michigan. This is the first known case of someone being wrongfully arrested in the United States because of this technology, though there are likely many more cases like Robert’s that remain unknown.
Detroit police handcuffed Robert on his front lawn in front of his wife and two terrified girls, ages two and five. The police took him to a detention center about forty minutes away, where he was locked up overnight in a cramped and filthy cell. Robert’s fingerprints, DNA sample, and mugshot were put on file. After an officer acknowledged during an interrogation the next afternoon that “the computer must have gotten it wrong,” Robert was finally released — nearly 30 hours after his arrest. Still, the government continues to stonewall Robert’s repeated attempts to learn more about what led to his wrongful arrest, in violation of a court order and of its obligations under the Michigan Freedom of Information Act.

Robert is keenly aware that his encounter with the police could have proven deadly for a Black man like him. He recounts the whole ordeal in an op-ed published by the Washington Post and a video published by the ACLU.

“I never thought I’d have to explain to my daughters why daddy got arrested,” says Robert Williams in the op-ed. “How does one explain to two young girls that the computer got it wrong, but the police listened to it anyway?”

While Robert was locked up, his wife Melissa had to explain to his boss why Robert wouldn’t show up to work next morning. She also had to explain to their daughters where their dad was and when he would come back. Robert’s daughters have since taken to playing games involving arresting people, and have accused Robert of stealing things from them.

Robert was arrested on suspicion of stealing watches from Shinola, a Detroit watch shop. Detroit police sent an image of the suspect captured by the shop’s surveillance camera to Michigan State Police, who ran the image through its database of driver’s licenses. Face recognition software purchased from DataWorks Plus by Michigan police combed through the driver’s license photos and falsely identified Robert Williams as the suspect.


Based off the erroneous match, Detroit police put Robert’s driver’s license photo in a photo lineup and showed it to the shop’s offsite security consultant, who never witnessed the alleged robbery firsthand. The consultant, based only on a review of the blurry surveillance image, identified Robert as the culprit.

“Every step the police take after an identification — such as plugging Robert’s driver’s license photo into a poorly executed and rigged photo lineup — is informed by the false identification and tainted by the belief that they already have the culprit,” said Victoria Burton-Harris and Phil Mayor, attorneys representing Robert Williams, in an ACLU blog post published today. “Evidence to the contrary — like the fact that Robert looks markedly unlike the suspect, or that he was leaving work in a town 40 minutes from Detroit at the time of the robbery — is likely to be dismissed, devalued, or simply never sought in the first place…When you add a racist and broken technology to a racist and broken criminal legal system, you get racist and broken outcomes. When you add a perfect technology to a broken and racist legal system, you only automate that system's flaws and render it a more efficient tool of oppression.”

Numerous studies, including a recent study by the National Institute of Standards and Technology, have found that face recognition technology is flawed and biased, misidentifying Black and Asian people up to 100 times more often than white people. Despite this, an untold number of law enforcement agencies nationwide are using the technology, often in secret and without any democratic oversight.

“The sheer scope of police face recognition use in this country means that others have almost certainly been — and will continue to be — misidentified, if not arrested and charged for crimes they didn’t commit,” said Clare Garvie, senior associate with Georgetown Law’s Center on Privacy & Technology in an ACLU blog post.

The ACLU has long been warning that face recognition technology is dangerous when right, and dangerous when wrong.

“Even if this technology does become accurate (at the expense of people like me), I don’t want my daughters’ faces to be part of some government database,” adds Williams in his op-ed. “I don’t want cops showing up at their door because they were recorded at a protest the government didn’t like. I don’t want this technology automating and worsening the racist policies we’re protesting.”

The ACLU has also been leading nationwide efforts to defend privacy rights and civil liberties against the growing threat of face recognition surveillance, and is calling on Congress to immediately stop the use and funding of the technology.

“Lawmakers need to stop allowing law enforcement to test their latest tools on our communities, where real people suffer real-life consequences,” said Neema Singh Guliani, ACLU senior legislative counsel. “It's past time for lawmakers to prevent the continued use of this technology. What happened to the Williams family should never happen again.”

Already, multiple localities have banned law enforcement use of face recognition technology as part of ACLU-led campaigns, including San Francisco, Berkeley and Oakland, CA, as well as Cambridge, Springfield, and Somerville, MA. Following years of advocacy by the ACLU and coalition partners, pressure from Congress, and nationwide protests against police brutality, Amazon and Microsoft earlier this month said they will not sell face recognition technology to police for some time.  They joined IBM and Google who previously said they would not be selling a general face recognition algorithm to the government. Microsoft and Amazon have yet to clarify their positions on sale of the technology to federal law enforcement agencies like the FBI and the DEA.

The ACLU is also suing the FBI, DEA, ICE, and CBP to learn more about how the agencies are using face recognition and what safeguards, if any, are in place to prevent rights violations and abuses. And the organization has taken Clearview AI to court in Illinois over its privacy-violating face recognition practices."

--------------------------------------------


The administrative complaint filed today was first reported by the New York Times: https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html.

A video about the Williams family ordeal is here: https://www.youtube.com/watch?v=Tfgi9A9PfLU&feature=youtu.be.

The administrative complaint filed today is here: https://www.aclu.org/letter/aclu-michigan-complaint-re-use-facial-recognition.

The entire release can be read at: 


PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog;
-----------------------------------------------------------------
FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;
-------------------------------