STORY: "Fingerprint Analysis Is High-Stakes Work — but It Doesn’t Take Much to Qualify as an Expert," by reporter Jordan Smith, published by The Intercept on November 29, 2019. (Jordan Smith is a state and national award-winning investigative journalist based in Austin, Texas. She has covered criminal justice for 20 years and, during that time, has developed a reputation as a resourceful and dogged reporter with a talent for analyzing complex social and legal issues. She is regarded as one of the best investigative reporters in Texas. A longtime staff writer for the Austin Chronicle, her work has also appeared in The Nation, the Crime Report, and Salon, among other places.)
GIST: Brendan Max and
two of his colleagues in the Cook County, Illinois, public defender’s
office got some good news and some bad news in the spring of 2018.
Actually, it was the same news: The three lawyers had nearly aced a
proficiency test designed for fingerprint examiners. None of them had
any training or real expertise in latent fingerprint
analysis — the practice of trying to match a fingerprint collected from
a crime scene to the known print of a suspect — aside from what they’d
learned during their years working criminal defense. So, nominally, it
was good news: Each of them had correctly
identified all but one of the fingerprints contained in the test. But
they were certain this was not a good thing. If they could so easily
pass the test with zero training to guide their analysis, what did that
say about the test’s ability to accurately assess the competency of any
fingerprint examiner, including the six employed by the Chicago Police
Department, whose work they regularly had to vet when defending clients?
Acing the tests, which the CPD examiners regularly did, allowed them
to bolster their credibility in court regarding their conclusions about
matches between a crime scene print and a criminal defendant. But the
lawyers also knew from cross-examinations that these same analysts
appeared to know frighteningly little about their discipline, and they
worked in a lab setting that had none of the written policies or quality
assurance practices designed to keep forensic work well-documented and
reliable. As proficiency testing has become ubiquitous in the forensic
sciences — according to federal data,
98 percent of practitioners working in accredited public crime labs are
proficiency tested — the disconnect Max and his colleagues face in
Chicago raises a series of sobering questions. Not least among them: What, if anything, do proficiency tests say about the abilities of the forensic examiners taking them?
STARTLING FALSE POSITIVE RATES: The release of a groundbreaking report from the National Academy of Sciences in 2009 threw a harsh light on the state of forensic science.
Aside from DNA analysis, the majority of the forensic disciplines lacked
meaningful scientific underpinning, the report
concluded. This was true for all of the so-called pattern-matching
disciplines, where a practitioner takes a piece of crime scene evidence
and attempts to match it to a pattern known to be associated with a
suspect, a process that is highly subjective. This includes fingerprint,
or friction ridge, analysis, along with things like handwriting
analysis and bite-mark matching. Friction ridge analysis rests on a deceptively simple foundation:
that human fingerprints are unique — an individuality that persists —
and that this uniqueness can be transferred with fidelity to a
substrate, like glass or paper. While experts have long said that no two
prints are the same, there’s no proof that is the case. Moreover, crime
scene prints are often distorted — or “noisy” — partial prints that
may be smudged or otherwise degraded, which is where errors occur, as in
the infamous case of Brandon Mayfield, the Oregon lawyer who was
wrongly suspected of being involved in the 2004 Madrid train bombing
based on the FBI’s faulty fingerprint analysis.
Implicated in the Mayfield fiasco was a common issue in fingerprint analysis known as a “close non-match.” This is particularly problematic with analyses aided by the Automated Fingerprint Identification System, a database of millions of prints maintained by the FBI. When a latent print is pulled off a piece of evidence — in the Mayfield case, it was lifted from a bag of detonators — but there is no suspect already identified for comparison purposes, an examiner can feed the crime scene print into the system, which generates a list of potential matches based on similar characteristics. While it may or may not be true that no two prints are exactly alike, there are plenty of very similar prints.
The National Academy of Sciences report made a host of recommendations for shoring up the validity and reliability of forensic practices. While some practitioners have effectively stuck their heads in the sand, a number in the fingerprint community have heeded the calls for reform by investigating what leads to errors, trying to devise error rates for the discipline, and conducting research into objective techniques for doing their work. Meanwhile, the academy also made a series of broader recommendations, including that crime labs be accredited and practitioners certified and regularly tested for proficiency.
It was amid this broad sweep toward reform that Max, chief of the public defender’s forensic science division, and his colleagues Joseph Cavise and Richard Gutierrez started to get interested in the research on fingerprint analysis.
The entire story can be read at:
https://theintercept.com/2019/11/29/fingerprint-examination-proficiency-test-forensic-science/
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at:
http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html
Please send any comments or information on other cases and issues of
interest to the readers of this blog to: hlevy15@gmail.com.
Harold Levy, Publisher, The Charles Smith Blog.