Sunday, June 21, 2020

Technology Series: Part Three: George Floyd Aftermath: Intrusive surveillance through facial recognition: The focus now turns to Microsoft - a massive company under pressure to follow Amazon and IBM and stop selling facial recognition to police after George Floyd's death, Forbes (Reporter Thomas Brewster)..." After global protests about police violence against the black community, tech companies have stopped supporting cops with technology. Already, Amazon is to stop selling its Rekognition tech to law enforcement and IBM said it too was nixing its surveillance business given the tools have been proven to have a bias against non-white people. Now it’s Microsoft’s turn, according to two of the biggest human and digital rights bodies in America. Though the tech giant has been vocal on its support for the black community, it hasn’t yet changed how or what it sells to police agencies."


QUOTE OF THE DAY: “The world Microsoft seems to want is one where police have an invisible but inescapable surveillance presence in our communities,” wrote Matt Cagle, technology and civil liberties attorney at the ACLU. “Where an infrastructure exists to scan your face and identify you as you walk down the street, go to a protest, attend a place of worship, and participate in public life. Building a surveillance apparatus this big would have severe consequences — chilling demonstrations, fueling a for-profit surveillance industry, and creating racist watchlists that governments and businesses will use for discriminatory ends."

-----------------------------------------------------------------
BACKGROUND: TECHNOLOGY: In the last several years I have been spending considerably more time than usual on applications of rapidly developing technology in the criminal justice process that could affect the quality of the administration of justice - for better or, most often, for worse. First, of course, predictive policing (AKA PredPol) made its entrance, at its most extreme promising the ability to identify a criminal act before it occurred. At its most modest, it offered police a better sense of where certain crimes were occurring in the community being policed - knowledge that the seasoned beat officer had intuited through everyday police work years earlier. PredPol has lost some of its lustre as police departments discovered that the expense of acquiring and using the technology was not justified. Then we entered a period in which algorithms became popular with judges for use at bail hearings and at sentencing. In my eyes, these judges were just passing the buck to the machine when they could have, and should have, made their decisions based on information they received in open court - not from algorithms that were notorious for their secrecy, because the manufacturers did not want to reveal their trade secrets - even in a courtroom where an accused person's liberty and reputation were on the line. The use of these algorithms at bail and sentencing has come under attack in many jurisdictions for discriminating against minorities, and they are hopefully on the way out. Lastly, facial recognition technology has become a concern to this Blog because of its proven ability to sweep up huge numbers of people and lead to wrongful arrests and prosecutions.
May we never forget that a huge, extremely well-funded, powerful, often politically connected industry is pushing for the for-profit use of all these technologies in the criminal justice system - and, hopefully, in the post-George Floyd aftermath, will be more concerned with the welfare of the community than with its bottom line. HL.
-------------------------------------------------------------------
PASSAGE OF THE DAY: "Facial recognition research has previously highlighted various racial issues with the tech. A 2019 study by the National Institute of Standards and Technology (NIST) suggested that for some algorithms, African American and Asian people were up to 100 times more likely to be misidentified than white males. In more recent news, Microsoft’s AI managed to confuse the faces of different mixed-race members of the famous pop group Little Mix. OneZero also reported Microsoft’s own employees have been calling on the tech company to stop working with police."

-------------------------------------------------------------------

STORY: "Microsoft urged to follow Amazon and IBM and stop selling facial recognition to cops after George Floyd's death," by reporter Thomas Brewster, published by Forbes on June 11, 2020.
GIST: After global protests about police violence against the black community, tech companies have stopped supporting cops with technology. Already, Amazon is to stop selling its Rekognition tech to law enforcement and IBM said it too was nixing its surveillance business given the tools have been proven to have a bias against non-white people.
Now it’s Microsoft’s turn, according to two of the biggest human and digital rights bodies in America. Though the tech giant has been vocal on its support for the black community, it hasn’t yet changed how or what it sells to police agencies.
“The world Microsoft seems to want is one where police have an invisible but inescapable surveillance presence in our communities,” wrote Matt Cagle, technology and civil liberties attorney at the ACLU. “Where an infrastructure exists to scan your face and identify you as you walk down the street, go to a protest, attend a place of worship, and participate in public life. Building a surveillance apparatus this big would have severe consequences — chilling demonstrations, fueling a for-profit surveillance industry, and creating racist watchlists that governments and businesses will use for discriminatory ends.”
He pointed not just to Microsoft’s sales of facial recognition to police, but also to its backing of a bill supporting the use of the surveillance tech. The bill, known as  AB 2261, was blocked in California last week after strong opposition from ACLU and other groups. The nonprofit warned the bill would allow companies to scan the faces of people applying for jobs or getting financial services and even healthcare.
Facial recognition research has previously highlighted various racial issues with the tech. A 2019 study by the National Institute of Standards and Technology (NIST) suggested that for some algorithms, African American and Asian people were up to 100 times more likely to be misidentified than white males. In more recent news, Microsoft’s AI managed to confuse the faces of different mixed-race members of the famous pop group Little Mix. OneZero also reported Microsoft’s own employees have been calling on the tech company to stop working with police.
Action on Amazon Ring too?
The Electronic Frontier Foundation said that not only should Microsoft stop selling facial recognition, but there should be a total ban on the tech across America. And it said Amazon’s Ring business should also cease working with police.
“These partnerships allow police to make batch-requests for footage via email to every resident with a camera within an area of interest to police—potentially giving police a one-step process for requesting footage of protests to identify protestors,” wrote Matthew Guariglia, policy analyst covering surveillance and privacy at the EFF. “These partnerships intensify suspicion, help police racially profile people, and enable and perpetuate police harassment of Black Americans.”
Following the death of George Floyd whilst in the custody of Minneapolis police, protesters calling for justice have faced off against law enforcement and federal agencies equipped with all manner of crowd control and surveillance technologies. Activists on the street are now running the gauntlet of drones, rubber bullets and tear gas to defend freedom of assembly, and to end the militarization of policing."
The entire story can be read at:
https://www.forbes.com/sites/thomasbrewster/2020/06/11/microsoft-urged-to-follow-amazon-and-ibm-stop-selling-facial-recognition-to-cops-after-george-floyds-death/#24c75a45b6b4

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog;
-----------------------------------------------------------------
FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;
------------------------------------------------------------------