PASSAGE ONE OF THE DAY: "When it comes to facial recognition technology, your face may simply not be “yours” anymore. In a report from Georgetown Law’s Center on Privacy and Technology, the Center found more than 117 million adults are part of a “virtual, perpetual lineup,” accessible to law enforcement nationwide. Just think about that for a minute — even though you may never have gotten anything more than a speeding ticket, your photo may be part of a digital lineup of more than 3 billion faces. Even worse, there are significant concerns about the technology being “biased” — for example, research by the Gender Project in the MIT Media Lab uncovered that the algorithms powering the facial recognition technology are less accurate when it comes to delineating gender, skin type (lighter versus darker skin tones), age, and other attributes (such as ethnicity). In other words, different facial recognition engines seemed to demonstrate what amounts to a racial bias. This should be unacceptable on its face (no pun intended), but in the context of law enforcement use, it is downright dangerous."
--------------------------------------------------------
PASSAGE TWO OF THE DAY: "It appears that this incident may be “the first known account of an American being wrongfully arrested based on a flawed match from a facial recognition algorithm.” Whether it is, however, is not the main point. The bigger issue is how such facial recognition platforms are trained and whether questionable images (such as low-resolution and blurry photos and videos) should even be used with such systems. It also raises a significant additional question: How on earth did the Detroit Police meet the requisite level of probable cause for a warrant to issue for Williams’ arrest, given the nature of the photographs and the error in facial recognition? I don’t know if I really want an answer to that question, as the answer may be more problematic than I am prepared to accept. The question we all need to ask, however, is whether facial recognition technology should be used in law enforcement without proper checks and balances in discerning its results. Personally, I think it is high time that states take additional legislative action to ensure that facial recognition technology is used properly before more damage is done to our rights. But don’t take my word for it – just ask Robert Julian-Borchak Williams."
--------------------------------------------------------
SUB-HEADING: "With great innovation comes great responsibility."
GIST: "Over the past few years, facial recognition technology has progressed significantly — so much so that it is becoming more and more prevalent in our everyday lives. As I have previously written on this topic, however, this progress has not come without significant concerns over personal privacy and other rights — having your photo in a database used to serve tailored advertising to you is one thing, but being part of a surveillance platform is quite another. Unfortunately, one of my big concerns has come to pass, and it only highlights the deep divisions and concerns over this technology, proving that there is far more to this technology than meets the digital AI.
The incident I refer to occurred back in January of this year and involved a gentleman who seems to have found himself on the wrong side of an algorithm (as opposed to the law). Robert Julian-Borchak Williams, who is African-American, was apparently working at his job at an automotive supply store when he was contacted by the Detroit Police to turn himself in for shoplifting. Believing this call to be a prank (as he did not commit any such crime), he ignored it. Upon his return home from work (apparently merely an hour later), he was boxed in by Detroit police cars as he pulled into his driveway and placed under arrest. Why did this happen to him? You guessed it — the Detroit Police placed surveillance video in front of him claiming it showed him shoplifting at a Shinola store in the trendy Midtown area of Detroit. The problem? It wasn’t Williams. There appears to have been a flawed match of Williams’ face with the surveillance video obtained by the Detroit Police. I don’t know about you, but in this case, “oops” just doesn’t seem to cut it.
When it comes to facial recognition technology, your face may simply not be “yours” anymore. In a report from Georgetown Law’s Center on Privacy and Technology, the Center found more than 117 million adults are part of a “virtual, perpetual lineup,” accessible to law enforcement nationwide. Just think about that for a minute — even though you may never have gotten anything more than a speeding ticket, your photo may be part of a digital lineup of more than 3 billion faces. Even worse, there are significant concerns about the technology being “biased” — for example, research by the Gender Project in the MIT Media Lab uncovered that the algorithms powering the facial recognition technology are less accurate when it comes to delineating gender, skin type (lighter versus darker skin tones), age, and other attributes (such as ethnicity). In other words, different facial recognition engines seemed to demonstrate what amounts to a racial bias. This should be unacceptable on its face (no pun intended), but in the context of law enforcement use, it is downright dangerous.
Although some of the software referenced in the Gender Project study has progressed to correct such deficiencies, the point here is that it appears undue deference is being given to facial recognition technology output without the necessary checks and balances that should be part of the use of such technology in law enforcement. First and foremost, this technology is evolving — there is absolutely no basis for treating the output from this technology as gospel. Moreover, such technology use by law enforcement can never, ever, replace the need for human intervention and quality detective work. From the facts I have been able to gather, the fuzzy photograph of Williams did not provide an absolute match and, in fact, was easily distinguishable from Williams upon closer inspection. Worse, it’s not like the Detroit Police Department did not voice concerns about facial recognition technology — during a public hearing discussing facial recognition technology use in Detroit, an assistant police chief (who is also African-American) stated that with respect to false positives, it “is absolutely factual, and it’s well-documented.” No matter how cutting-edge the technology, nothing excuses a lack of quality law enforcement follow-up.
Thankfully, this incident has not gone unnoticed — the ACLU filed a formal complaint on behalf of Williams regarding the false arrest. Williams has also provided his own account of his arrest in a Washington Post op-ed, and it should prove a wake-up call to every reasonable American concerning this technology and how it is being used. As a technology lawyer, I have come to appreciate the innovation and vision behind cutting-edge (and especially “disruptive”) technologies. That said, with such innovation comes the responsibility for ensuring not only accuracy in execution but also in its application. This type of technology can prove a useful tool to the law enforcement arsenal, but it should only be regarded as a tool — it is not infallible and it most certainly is not a replacement for good old-fashioned police work. States are also taking notice — for example, this year Washington implemented legislation restricting the use of facial recognition technology in law enforcement. Make no mistake, more legislation and regulation of this technology is on the horizon.
It appears that this incident may be “the first known account of an American being wrongfully arrested based on a flawed match from a facial recognition algorithm.” Whether it is, however, is not the main point. The bigger issue is how such facial recognition platforms are trained and whether questionable images (such as low-resolution and blurry photos and videos) should even be used with such systems. It also raises a significant additional question: How on earth did the Detroit Police meet the requisite level of probable cause for a warrant to issue for Williams’ arrest, given the nature of the photographs and the error in facial recognition? I don’t know if I really want an answer to that question, as the answer may be more problematic than I am prepared to accept. The question we all need to ask, however, is whether facial recognition technology should be used in law enforcement without proper checks and balances in discerning its results. Personally, I think it is high time that states take additional legislative action to ensure that facial recognition technology is used properly before more damage is done to our rights. But don’t take my word for it – just ask Robert Julian-Borchak Williams."
The entire commentary can be read at: https://abovethelaw.com/2020/06/peekaboo-i-see-you-ii-why-facial-recognition-technology-needs-humans-as-much-as-ai/
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors — and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" — and its nomination process — can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy: Publisher: The Charles Smith Blog;
-----------------------------------------------------------------
FINAL WORD: (Applicable to all of our wrongful conviction cases): "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;
------------------------------------------------------------------