Saturday, August 18, 2018

Darrell Siggers; Michigan: (Flawed ballistics): Thanks to ballistics evidence at trial - later found by an independent review to be 'erroneous,' 'unbelievable' and 'highly improbable' - this Detroit man endured 34 years of wrongful imprisonment before his recent release pending a new trial..."Prosecutors agreed to vacate Siggers’ conviction because the ballistics evidence and witness testimony presented at his trial have since been disputed. At trial, Detroit Police Sgt. Claude Houseworth testified that bullets recovered from the victim, the crime scene and Siggers’ home were discharged from the same gun. The gun was never recovered. Michael Waldo, Siggers’ current attorney who works with the Michigan State Appellate Defender Office, presented testimony from a ballistics expert who found fault with Houseworth’s findings. Waldo wrote in a court document, “In 2015, Mr. David Townshend, a retired Michigan State Trooper with more than 20 years of experience in the Michigan State Police Crime Laboratory Firearms Identification Unit, conducted an independent review. Mr. Townshend determined that Sgt. Houseworth’s conclusions were ‘erroneous,’ ‘unbelievable’ and ‘highly improbable.’ ” David Balash, a retired Michigan State Police firearms examiner, also reviewed the ballistics evidence presented at Siggers’ trial and determined that Houseworth’s testimony was “both confusing and at times totally inaccurate.”


PASSAGE OF THE DAY: "Further testing on the ballistics evidence could not be completed because Detroit police destroyed the evidence in 2003. In 2008, the Detroit Police crime lab was shut down after a State Police audit discovered a high error rate in firearms cases."

STORY: "Detroit Man Released from Prison After 34 Years of Wrongful Incarceration," by Innocence Project Staff, published by The Innocence Project on August 16, 2018.


GIST: "Darrell Siggers was released from prison after spending 34 years behind bars for a murder he maintains he did not commit. In 1984, Siggers was convicted of murder in Wayne County, Michigan, and sentenced to life in prison. Prosecutors agreed to vacate Siggers’ conviction because the ballistics evidence and witness testimony presented at his trial have since been disputed. At trial, Detroit Police Sgt. Claude Houseworth testified that bullets recovered from the victim, the crime scene and Siggers’ home were discharged from the same gun. The gun was never recovered. Michael Waldo, Siggers’ current attorney who works with the Michigan State Appellate Defender Office, presented testimony from a ballistics expert who found fault with Houseworth’s findings. Waldo wrote in a court document, “In 2015, Mr. David Townshend, a retired Michigan State Trooper with more than 20 years of experience in the Michigan State Police Crime Laboratory Firearms Identification Unit, conducted an independent review. Mr. Townshend determined that Sgt. Houseworth’s conclusions were ‘erroneous,’ ‘unbelievable’ and ‘highly improbable.’ ” David Balash, a retired Michigan State Police firearms examiner, also reviewed the ballistics evidence presented at Siggers’ trial and determined that Houseworth’s testimony was “both confusing and at times totally inaccurate.” Further testing on the ballistics evidence could not be completed because Detroit police destroyed the evidence in 2003. In 2008, the Detroit Police crime lab was shut down after a State Police audit discovered a high error rate in firearms cases. Siggers’ re-trial is scheduled for December. Until then, Siggers will be spending time with his three children and eight grandchildren, and will enroll in college courses to advance the associate degree he earned while incarcerated.
“Being in prison for 34 years for something that you didn’t do, and then to be free … it’s just an awesome moment,” Siggers, now 54, told the Free Press after his release. “I feel wonderful.”"

The entire story can be found at the link below:
https://www.innocenceproject.org/detroit-man-released-after-34-years-wrongful-incarceration/

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. 
Harold Levy: Publisher; The Charles Smith Blog;
---------------------------------------------------------------------

Technology Series: (Part 11); New York Creates Task Force to Examine Automated Decision Making..."In December 2017, the New York City Council passed the country's first bill to demand accountability in how algorithms are used in city government. That bill mandated that a task force study how city agencies use algorithms. This task force will report on how to make these algorithms understandable to the public. The original proposal by Council Member James Vacca mandated that the source code for the algorithms be made public. Some policy experts warned that openness might create security risks, or provide a way for people to game the public benefits system. Technology companies argued that they might be required to disclose proprietary information. The disclosure requirement was dropped in favor of the task force."



PUBLISHER'S NOTE: Artificial intelligence, once the stuff of science fiction, has become all too real in our modern society - especially in the American criminal justice system. As the ACLU's Lee Rowland puts it: "Today, artificial intelligence. It's everywhere — in our homes, in our cars, our offices, and of course online. So maybe it should come as no surprise that government decisions are also being outsourced to computer code. In one Pennsylvania county, for example, child and family services uses digital tools to assess the likelihood that a child is at risk of abuse. Los Angeles contracts with the data giant Palantir to engage in predictive policing, in which algorithms identify residents who might commit future crimes. Local police departments are buying Amazon's facial recognition tool, which can automatically identify people as they go about their lives in public." The algorithm is finding its place deeper and deeper in the nation's courtrooms, in decisions that used to be the exclusive province of judges, such as bail and even the sentence to be imposed. I am pleased to see that a dialogue has begun on the effect that the increasing use of these algorithms in our criminal justice systems is having on our society and on the quality of decision-making inside courtrooms. As Lee Rowland asks about this brave new world, "What does all this mean for our civil liberties and how do we exercise oversight of an algorithm?" In view of the importance of these issues - and the increasing use of artificial intelligence by countries for surveillance of their citizens - it's time for yet another technology series on The Charles Smith Blog focusing on the impact of science on society and criminal justice. Up to now I have been identifying the appearance of these technologies. Now at last I can report on the realization that some of them may be two-edged swords - and on the growing pushback.

Harold Levy: Publisher; The Charles Smith Blog:

------------------------------------------------------------

PASSAGE OF THE DAY: "The investigation of bias and infringement on rights in algorithmic decision making is only beginning. Predictive policing programs in Chicago and New Orleans are being scrutinized for violations of due process and privacy. The public is often unaware of the use of these tools. Even the creators of algorithms often cannot fully explain how the software came to the conclusion that was reached. Several city agencies are starting to use decision systems. The Fire Department uses the Risk-Based Inspection System (RBIS) to predict where fires might start. Part of the RBIS is the Fire Cast tool that uses data from five city agencies to analyze 60 risk factors to predict which buildings are most vulnerable to fire outbreaks. These buildings are then prioritized for inspections, the data being available to all the city's 49 fire companies. The Police Department uses algorithms for the data obtained from body cameras and facial recognition. Algorithms are also used by the Department of Transportation, the Mayor's Office of Criminal Justice, the Department of Education, and the Department of Social Services. Students are matched with schools. Teacher performance is assessed. Medicare fraud is investigated."

--------------------------------------------------------------------

STORY: "New York Creates Task Force to Examine Automated Decision Making," by reporter Michael Stiefel, published by InfoQ on July 31, 2018.

GIST: "New York City has created an Automated Decision Systems Task Force to demand accountability and transparency in how algorithms are used in city government. The final report of the task force is due in December 2019. This task force is the first in the United States to study this issue. Background: In December 2017, the New York City Council passed the country's first bill to demand accountability in how algorithms are used in city government. That bill mandated that a task force study how city agencies use algorithms. This task force will report on how to make these algorithms understandable to the public. The original proposal by Council Member James Vacca mandated that the source code for the algorithms be made public. Some policy experts warned that openness might create security risks, or provide a way for people to game the public benefits system. Technology companies argued that they might be required to disclose proprietary information. The disclosure requirement was dropped in favor of the task force. What the Law States: An automated decision system is defined as "computerized implementations of algorithms, including those derived from machine learning or other data processing or artificial intelligence techniques, which are used to make or assist in making decisions." The law requires the task force to accomplish at least six goals in its final report. It must identify which city agencies should be subject to review. It must recommend procedures so that people affected by an algorithmic decision can request an explanation of the basis for that decision, as well as how adverse impacts can be addressed.
It should also explain the development and implementation of a procedure by which the city may determine whether an automated decision system used by a city agency "disproportionately impacts persons based upon age, race, creed, color, religion, national origin, gender, disability, marital status, partnership status, caregiver status, sexual orientation, alienage or citizenship status". It must also recommend processes for making information about automated decision systems available, so that the public can meaningfully assess how they work and how they are used by the city, and assess the feasibility of archiving automated decisions and the data used. The members of this task force are not limited to experts in algorithmic design and implementation, but can include people who understand the impact of algorithms on society. Meeting participants can be limited if participation "would violate local, state or federal law, interfere with a law enforcement investigation or operations, compromise public health or safety, or result in the disclosure of proprietary information." While the final report should be publicly available, no recommendation is required if it "would violate local, state, or federal law, interfere with a law enforcement investigation or operations, compromise public health or safety, or would result in the disclosure of proprietary information." The task force has no legal authority to compel, or to penalize, city agencies that do not comply with its recommendations. Background to the Controversy: The investigation of bias and infringement on rights in algorithmic decision making is only beginning. Predictive policing programs in Chicago and New Orleans are being scrutinized for violations of due process and privacy. The public is often unaware of the use of these tools. Even the creators of algorithms often cannot fully explain how the software came to the conclusion that was reached. Several city agencies are starting to use decision systems.
The Fire Department uses the Risk-Based Inspection System (RBIS) to predict where fires might start. Part of the RBIS is the Fire Cast tool, which uses data from five city agencies to analyze 60 risk factors and predict which buildings are most vulnerable to fire outbreaks. These buildings are then prioritized for inspections, the data being available to all the city's 49 fire companies. The Police Department uses algorithms for the data obtained from body cameras and facial recognition. Algorithms are also used by the Department of Transportation, the Mayor's Office of Criminal Justice, the Department of Education, and the Department of Social Services. Students are matched with schools. Teacher performance is assessed. Medicare fraud is investigated. Problems with the Current Legislation: Julia Powles, a research fellow at NYU’s Information Law Institute as well as at Cornell Tech, described two problems with the task force's mission that resulted from the compromise between the original legislation and what passed. First, if the agencies and contractors do not cooperate, good recommendations will not be made. There is no easily accessible information on how much the City of New York spends on algorithmic services, or how much of the data used is shared with outside contractors. The Mayor's office rejected any requirement for mandated reporting to be in the legislation based on the argument that it would reveal proprietary information. If too much leeway is given to claims of corporate secrecy, there will be no algorithmic transparency. The other problem with the current law is that it is unclear how the city can change the behavior of companies that create automated decision-making systems. Frank Pasquale, a law professor at the University of Maryland, argues that the city has more leverage than the vendors.
Members of the Task Force: The members of this task force are not limited to experts in algorithmic design and implementation, but can include people who understand the impact of algorithms on society. It will be composed of individuals from city agencies, academia, law, industry, and nonprofits and think tanks. It is expected that representatives will be chosen from the Department of Social Services, the Police Department, the Department of Transportation, the Mayor’s Office of Criminal Justice, the Administration for Children’s Services, and the Department of Education. The task force is co-chaired by Emily W. Newman, acting director of the Mayor’s Office of Operations, and Brittny Saunders, deputy commissioner for strategic initiatives at the Commission on Human Rights. Impact: New York City could have an impact on algorithms similar to California's on auto emission standards. As one of the largest cities in the world, its use of algorithms may be wide enough that it becomes easiest to meet whatever standards it creates in all jurisdictions. Altering algorithms for different locations, however, might be easier with software than with mechanical devices. This is illustrated by the ability of software to calculate different sales tax regulations in different states, cities, towns, and counties throughout the United States. On the other hand, New York is one of the most valuable sources of demographic data in the world. Restricting their use here might encourage other locations to do the same. In any case, the argument over the fairness of algorithmic decisions, and the need to use them, is not going away."
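The weighted risk-scoring systems described in the story (such as the Fire Department's RBIS, which weighs dozens of risk factors to rank buildings for inspection) can be sketched in a few lines of code. This is a minimal illustrative sketch only: the factor names, weights, and addresses below are hypothetical, not the city's actual model. The point is that even a simple weighted-sum scorer can retain per-factor contributions, which is the kind of explanation the task force wants agencies to be able to give to people affected by a decision.

```python
# Toy risk-scoring sketch (hypothetical factors and weights, not any
# agency's real model). Each building is scored by a weighted sum of
# risk factors, and the per-factor contributions are kept so the
# decision can be explained to an affected party.

def risk_score(building, weights):
    """Return (total score, per-factor contributions) for one building."""
    contributions = {
        factor: weights[factor] * building.get(factor, 0.0)
        for factor in weights
    }
    return sum(contributions.values()), contributions

def rank_for_inspection(buildings, weights):
    """Order buildings by descending risk score for inspection priority."""
    scored = [(name, *risk_score(b, weights)) for name, b in buildings.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)

if __name__ == "__main__":
    weights = {"age_years": 0.02, "prior_violations": 0.5, "no_sprinklers": 1.0}
    buildings = {
        "123 Main St": {"age_years": 90, "prior_violations": 3, "no_sprinklers": 1},
        "456 Oak Ave": {"age_years": 10, "prior_violations": 0, "no_sprinklers": 0},
    }
    for name, score, contribs in rank_for_inspection(buildings, weights):
        print(name, round(score, 2), contribs)
```

Real systems add statistical model fitting, validation, and far more data, but the transparency questions the task force must answer (which factors, which weights, which data sources) apply even to a model this small.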

The entire story can be read at:



---------------------------------------------------------------------

Friday, August 17, 2018

New Mexico: Deceased civil rights attorney Mary Han: Major Development: A judge has ordered the New Mexico Medical Investigator's office to change the attorney’s death certificate from "suicide" to "undetermined," in response to a legal action launched by Han's sister (Elizabeth Wallbro)..."First Judicial District Judge David Thomson in Santa Fe ruled, in part, that the Albuquerque Police Department’s investigation into the death of attorney Mary Han was so flawed that OMI couldn’t reach a conclusion of suicide and should make changes to her death certificate. The judge said OMI’s conclusion was “arbitrary and capricious.”..."Wallbro’s attorneys said that more than 50 police officers and upper-level city and Albuquerque police officials turned the death scene into a “surreal circus,” in which high-ranking officials trooped through Han’s home, drank water in the kitchen, used her bathroom and handled items in her home instead of preserving evidence. Han on many occasions during her career had filed lawsuits against the police department. “Due to the contamination of the scene and the loss of critical evidence, even under the most deferential standard, there is no basis for (OMI’s) determination of the manner of death,” Thomson wrote in his 96-page opinion filed Wednesday. “Simply put, the evidence needed to make this determination was spoiled by the acts of the investigating agency.”


PUBLISHER'S NOTE: Congrats to lawyers Rosario Vega Lynn and Diane Garrity, who had to fight so hard to clear the record on how the prominent civil rights attorney died - in spite of a wall of resistance at every step of the way from the police and prosecutorial authorities. This matter should not end here. In light of Judge Thomson's decision, the state should order a public inquiry into the cover-up surrounding the terribly flawed investigation of Mary Han's death - and the officers responsible should be held accountable for their misconduct. There are still far too many unanswered questions.

Harold Levy: Publisher; The Charles Smith Blog;

-------------------------------------------------------------

 QUOTE OF THE DAY: "The degree and extent of the collusion and obscuring of truth by the (OMI) and the (APD) will never be known, but at least the medical investigator’s office has been held accountable for their decisions and must – from this point forward – justify the bases for their decisions,” Rosario Vega Lynn and Garrity, Wallbro’s attorneys, said in a prepared statement."

-------------------------------------------------------------

PASSAGE OF THE DAY: "Paul Kennedy, Han’s former law partner, found Han dead in a car parked in the garage of her North Valley Albuquerque home in November 2010. OMI’s report of death concluded Han died by inhaling carbon monoxide in a closed garage. The cause was ruled as a suicide. In court filings, Han’s sister said she had spoken with Mary Han the night before her body was found and she didn’t seem out of character. Han also had plans to visit her only child the week after her death, and no note indicating suicide was found. Wallbro’s petition alleged that the police failed to test carbon monoxide levels inside Han’s home, her fingernails for DNA evidence and some of the items found at the scene of her death, such as a glass of liquid thought to be vodka. It also questioned why no attempt was made to explain why she was found inside a car that wasn’t running and no neighbors were interviewed. Those were just some of the problems with Albuquerque police’s investigation into her death, according to court documents. There were also questions about the thoroughness of OMI’s autopsy, said Diane Garrity, one of Wallbro’s attorneys. Thomson’s ruling said while some responding Albuquerque police officers tried to follow proper protocols, the scene was “overrun for some unknown reason by other responding individuals both civilian and supervisors at APD.”

---------------------------------------------------------------

STORY: "Judge orders OMI (New Mexico Office of the Medical Investigator) to change attorney’s death certificate," by reporter Ryan Boetel, published by The Albuquerque Journal on August 15, 2018.


GIST: "In a rare move, a state District Court judge has ordered the New Mexico Office of the Medical Investigator to change its conclusion about the manner of death of a prominent Albuquerque attorney from “suicide” to “undetermined.” First Judicial District Judge David Thomson in Santa Fe ruled, in part, that the Albuquerque Police Department’s investigation into the death of attorney Mary Han was so flawed that OMI couldn’t reach a conclusion of suicide and should make changes to her death certificate. The judge said OMI’s conclusion was “arbitrary and capricious.” Alex Sanchez, a spokeswoman for the University of New Mexico Health Sciences Center, which OMI is a part of, said the agency disagreed with the judge’s ruling and is considering an appeal. Thomson’s ruling was in response to a non-jury trial held back in January 2017. Elizabeth Wallbro, Han’s sister, had brought a petition for a writ of mandamus seeking to have the manner of death changed. Wallbro’s attorneys said that more than 50 police officers and upper-level city and Albuquerque police officials turned the death scene into a “surreal circus,” in which high-ranking officials trooped through Han’s home, drank water in the kitchen, used her bathroom and handled items in her home instead of preserving evidence. Han on many occasions during her career had filed lawsuits against the police department.
“Due to the contamination of the scene and the loss of critical evidence, even under the most deferential standard, there is no basis for (OMI’s) determination of the manner of death,” Thomson wrote in his 96-page opinion filed Wednesday. “Simply put, the evidence needed to make this determination was spoiled by the acts of the investigating agency.” But Sanchez said in a prepared statement: “We have the highest respect for the court but disagree with the judge’s ruling in this case. Our experts conducted a complete and thorough investigation into Ms. Han’s death and we stand by the autopsy determination that Ms. Han died as a result of a suicide.” Paul Kennedy, Han’s former law partner, found Han dead in a car parked in the garage of her North Valley Albuquerque home in November 2010. OMI’s report of death concluded Han died by inhaling carbon monoxide in a closed garage. The cause was ruled as a suicide. In court filings, Han’s sister said she had spoken with Mary Han the night before her body was found and she didn’t seem out of character. Han also had plans to visit her only child the week after her death, and no note indicating suicide was found. Wallbro’s petition alleged that the police failed to test carbon monoxide levels inside Han’s home, her fingernails for DNA evidence and some of the items found at the scene of her death, such as a glass of liquid thought to be vodka. It also questioned why no attempt was made to explain why she was found inside a car that wasn’t running and no neighbors were interviewed. Those were just some of the problems with Albuquerque police’s investigation into her death, according to court documents. There were also questions about the thoroughness of OMI’s autopsy, said Diane Garrity, one of Wallbro’s attorneys. 
Thomson’s ruling said while some responding Albuquerque police officers tried to follow proper protocols, the scene was “overrun for some unknown reason by other responding individuals both civilian and supervisors at APD.” “The degree and extent of the collusion and obscuring of truth by the (OMI) and the (APD) will never be known, but at least the medical investigator’s office has been held accountable for their decisions and must – from this point forward – justify the bases for their decisions,” Rosario Vega Lynn and Garrity, Wallbro’s attorneys, said in a prepared statement. Garrity said there have been few cases in which judges have ordered OMI to make changes to a death certificate. She said the case is important because it shows families they have a way to challenge OMI if they disagree with the office’s findings. Garrity said the experts who testified at trial believe that the manner of Han’s death will never be conclusive. “The experts believe that the crime scene was so contaminated and the contamination was so acute that we’ll probably never be able to say conclusively what happened,” she said.

The entire story can be read at:

https://www.abqjournal.com/1209248/judge-orders-lawyer-mary-hans-death-to-be-changed-to-undetermined.html

For background, read the earlier story from the Albuquerque Journal - Famed pathologist questions Han ruling - (reporter Joline Gutierrez Krueger) at the link below (August 5, 2015): "Last month, famed forensic pathologist Dr. Werner Spitz, who has made a laudable living off the dead for more than 62 years, offered his thoughts on the investigation into the death of prominent civil rights attorney Mary Han in Albuquerque nearly five years ago. Spitz co-wrote “Medicolegal Investigation of Death,” the bible of forensic pathology, now in its 20th printing. His expertise has been sought in dozens of high-profile cases from the assassination of President Kennedy to the murder trials of O.J. Simpson and Casey Anthony. And now, it had been sought by the Han family. His opinion was stunning. “It is evident that the investigation was cursory, superficial and incomplete, both by the medical investigator and the police,” Spitz wrote in a letter dated July 24 to Rosario Vega Lynn, the Han family’s attorney. “Under the circumstances, I firmly believe that this case should be reopened, re-examined and re-evaluated.” Which is what many – including two other medical experts, former Attorney General Gary King and yours truly – have been saying almost since that gut-kicking afternoon when Han was found dead Nov. 18, 2010, in her car parked in the garage at her North Valley townhouse. The state Office of the Medical Investigator deemed her death a suicide by carbon monoxide intoxication, but few who had known the fearless, feisty, 53-year-old Han believed she had taken her own life. There were just too many questions unanswered, too much left undone by the agencies that should have sought those answers.
A lawsuit filed by the Han family in November 2012 against the city of Albuquerque and various personnel contends that the Albuquerque Police Department – an agency Han had sued for millions of dollars – failed to conduct a thorough and appropriate investigation into her death and allowed dozens of high-ranking police and city officials to tromp around her home, destroying evidence and leading the OMI astray. U.S. District Court Magistrate Judge Carmen Garza dismissed the lawsuit in August 2014, ruling that there is “no fundamental right under the Constitution to know the cause of a family member’s death.” Garza also ruled that a dead person’s civil rights cannot be violated, that the family had not shown Han’s civil rights were violated before she died, nor had the civil rights of the family members been violated. And the family, Garza said, had not identified the wrongful act that caused Han’s death. The case remains under appeal, split into two portions now before the 10th U.S. Circuit Court of Appeals and state District Court in Albuquerque. Perhaps the worst indignity argued at the heart of these legal actions is the label of suicide when even the OMI has acknowledged that the investigation into Han’s death was incomplete at best. “The report certainly identified shortcomings in the original scene investigation and difficulties getting full information about some of the circumstances surrounding the death,” then-chief medical investigator Dr. Ross Zumwalt wrote in a letter dated Dec. 13, 2013, to the Attorney General’s Office. Zumwalt’s letter came four months after the AG had released its own scathing report on the Han investigation, calling it “terribly mishandled” and contending that suicide was a mischaracterization. But Zumwalt refused to change Han’s manner of death from suicide to “undetermined,” saying that suicide was still proper “within reasonable medical probability.”
Zumwalt and his successor, Dr. Kurt Nolte, have repeatedly rebuffed efforts to change the manner of death. A formal appeal from the Han family in August 2011 went unanswered. The OMI was also unmoved by the opinions, which were requested by the Han family, of Georgia Chief Medical Examiner Kris Sperry in 2012; Dr. David Williams, a board-certified emergency room physician with the Heart Hospital of New Mexico, in 2015; and Spitz this July, who in his letter to the Han family attorney called the suicide determination a “rush to judgment.” So on Monday the Han family filed a petition asking the court to compel the OMI to conduct a “fair, accurate, professional and impartial investigation into Ms. Han’s true cause and manner of death and to provide an accurate death certification.” The petition, filed in the 1st Judicial Court in Santa Fe because the OMI is a state agency, makes a number of allegations against the OMI, including that its field personnel did not communicate with first responders, who had identified Han’s home as a possible crime scene; that it failed to investigate the contents of Han’s laptop found in her car; failed to interview Han’s longtime law partner, Paul Kennedy, who found Han dead; failed to test the clear liquid found in a glass in the car or test the air for carbon monoxide; failed to collect medication found in the home or collect evidence and DNA that might have been found under her fingernails or on her body; and failed to contact an expert on whether Han’s car had a carbon monoxide sensor. Nolte, named chief medical investigator in January, did not respond to a request to discuss the Han case, but John Arnold, a spokesman with the University of New Mexico Health Sciences Center, under which the OMI operates, offered this: “OMI strives to conduct a fair and accurate investigation for every case it reviews. This case is no exception. We have not been served with the petition, so cannot comment on it specifically.”
The noble cause of a forensic pathologist is to seek the truth, says the foreword to Spitz’s seminal “Medicolegal Investigation of Death.” That requires the pathologist to “abandon rhetoric, ancient dogma and fictive contentions in favor of finding and presenting fact.” The facts in Han’s death appear to be irretrievably lost, the truth forever out of reach. Surely it’s time the OMI found the guts to say so and change its report."

The entire story can be read at the link below:

 https://www.abqjournal.com/623309/famed-pathologist-questions-han-ruling-says-case-should-be-reopened.html?utm_source=abqjournal.com&utm_medium=related+posts+-+default&utm_campaign=related+posts

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.

Harold Levy: Publisher; The Charles Smith Blog;

---------------------------------------------------------------------

Joe Bryan: Texas: Troubling blood spatter case: Upcoming hearing (August 20, 2018): Pamela Colloff emails readers of this excellent New York Times/ProPublica investigation that this will be, "the most significant milestone in Joe Bryan's case since 1989."




Pamela Colloff's email to readers of "Blood Will Tell": "Hi everyone. I hope you're having a good summer. I wanted to quickly touch base with you before next week, when Joe Bryan will be back in court for the first time in 29 years. Today, both the Waco Tribune-Herald—Joe's hometown paper—and the Austin American-Statesman ran a powerful op-ed by the Innocence Project of Texas's executive director, Mike Ware, in which he called for additional DNA testing to take place in Joe's case. Ware aimed his criticism squarely at Bosque County District Attorney Adam Sibley, writing:

"Sibley has steadfastly refused to consider the possibility of Bryan's innocence, opposing DNA testing of crucial evidence. Someone murdered Joe Bryan's wife, and that person's DNA may be found on this evidence — evidence that may both identify the true murderer and add to the strong case of Bryan's innocence." "Texas has had over 50 DNA exonerations, with almost half of them from Dallas County. Many of the Dallas County exonorees had their initial requests for DNA testing thwarted...and delayed by prosecutors using the same tactics Sibley is using."

The op-ed comes less than a week before the most significant milestone in Joe's case since 1989, when he was convicted at his retrial of the murder of his wife, Mickey. That milestone is a three-day evidentiary hearing that will begin on Monday, August 20. His attorneys, Walter Reaves (who is a board member of the Innocence Project of Texas) and Jessica Freud, will address the claims they raised in Joe's application for habeas corpus. In plain English, they will make the argument that their client deserves a new trial. The hearing will take place in Comanche, Texas, where Joe's retrial was held. The legal proceeding will resemble a short trial, with various witnesses taking the stand, and the state and defense questioning them. Joe will also be present. Unlike a trial, however, the defense will present first, and the burden of proof will rest on the defense. Visiting Judge Doug Shaver will be asked to determine whether the bloodstain-pattern analysis presented at trial was accurate, and if not, what impact that had on the verdict. Last month, the Texas Forensic Science Commission found that the bloodstain-pattern analysis used to convict Joe was "not accurate or scientifically supported" and the expert who testified was "entirely wrong." Reaves and Freud are also expected to call witnesses who will testify about other claims, such as their contention that the state's use of a special prosecutor—who was hired by the victim's family—was improper; that prosecutors failed to turn over evidence that pointed to other possible perpetrators; and that newly discovered evidence suggests that Dennis Dunlap, who committed another murder in Clifton months before Mickey Bryan's murder, should be considered an alternative suspect in the killing. After the hearing, Shaver will make recommendations to the Texas Court of Criminal Appeals, whose justices will ultimately decide if Joe should be granted a new trial. That process will likely take months, and could drag into 2019.
The defense's request for DNA testing was granted last year by Judge James Morgan, who presided over both of Joe's trials. But the Bosque County D.A.'s office appealed the decision to Texas' 11th Court of Appeals, which has not yet handed down an opinion. I'll be covering the hearing next week, and will send you a dispatch from Comanche. Look for more from me soon! Best, Pamela."

Next week he heads back to court.



---------------------------------------------------------------------

Christopher Dodrill: West Virginia: Major Development: 'Shaken Baby Syndrome'...Another SBS case thrown out: This time with the assistance of the West Virginia University College of Law Innocence Project..."Judge David W. Hummell, Jr., of the Tyler County Circuit Court found that Dodrill’s trial counsel was ineffective for failing to consult with or hire a defense expert when his client was on trial and subsequently reversed the conviction. “This is the second case in which a West Virginia Innocence Project client has been freed because defense attorneys did not investigate the controversial diagnosis of Shaken Baby Syndrome, and experts have found and supported an alternative cause of injury,” said Valena Beety, professor of law and director of the West Virginia Innocence Project. “Chris should never have served time for a crime he did not commit, but at least his case shines a light on controversial and faulty Shaken Baby Syndrome prosecutions in our state.”


PASSAGE OF THE DAY:  "Dodrill was found guilty of child abuse with serious bodily injury and unlawful assault after a child under his care became injured. He was sentenced to three-to-15 years in prison. Dodrill consistently testified that the child fell and hit her head, and that he took her to the hospital. The child fully recovered, but because she had brain swelling and a subdural hematoma, the hospital diagnosed her with Shaken Baby Syndrome without eliminating other possible causes of her injuries. At his trial, Dodrill had no doctors or expert witnesses to consult or testify on his behalf, making it his word against the hospital’s diagnosis.  On Dodrill’s behalf, the WVIP submitted to the court reports from a biomechanical expert and a pediatric neurologist, as well as depositions of the state’s primary expert and defense counsel. They proved that the child under Dodrill’s care had underlying health issues made worse by the fall and that Dodrill did not cause her injuries."

---------------------------------------------------------------

STORY: "West Virginia Innocence Project client freed from prison," by James Jolley, published by The West Virginia Innocence Project on August 16, 2018.

SUB-HEADING: "Christopher Dodrill's conviction was overturned with the help of the WVU College of Law Innocence Project."

GIST: "The Innocence Project law clinic at the West Virginia University College of Law has helped free a client from prison after proving he was convicted of a crime he did not commit. The Circuit Court for Tyler County recently vacated the conviction of Christopher Dodrill. In 2016, Dodrill was found guilty of child abuse with serious bodily injury and unlawful assault after a child under his care became injured. He was sentenced to three-to-15 years in prison. Dodrill consistently testified that the child fell and hit her head, and that he took her to the hospital. The child fully recovered, but because she had brain swelling and a subdural hematoma, the hospital diagnosed her with Shaken Baby Syndrome without eliminating other possible causes of her injuries. At his trial, Dodrill had no doctors or expert witnesses to consult or testify on his behalf, making it his word against the hospital’s diagnosis.  On Dodrill’s behalf, the WVIP submitted to the court reports from a biomechanical expert and a pediatric neurologist, as well as depositions of the state’s primary expert and defense counsel. They proved that the child under Dodrill’s care had underlying health issues made worse by the fall and that Dodrill did not cause her injuries. Judge David W. Hummell, Jr., of the Tyler County Circuit Court found that Dodrill’s trial counsel was ineffective for failing to consult with or hire a defense expert when his client was on trial and subsequently reversed the conviction. “This is the second case in which a West Virginia Innocence Project client has been freed because defense attorneys did not investigate the controversial diagnosis of Shaken Baby Syndrome, and experts have found and supported an alternative cause of injury,” said Valena Beety, professor of law and director of the West Virginia Innocence Project. 
“Chris should never have served time for a crime he did not commit, but at least his case shines a light on controversial and faulty Shaken Baby Syndrome prosecutions in our state.” Dodrill’s freedom is a result of work done by WVIP student-attorneys Cody Swearingen ‘17, Taylor Coplin ‘17, Zack Gray ‘18 and Britlyn Seitz ‘18, and supervising attorney Melissa Giggenbach. The students consulted extensively with experts and conducted depositions with the state’s medical expert, trial counsel and trial counsel’s supervisor. "Overturning this conviction took more than two years and the work of two teams of dedicated clinical law students,” said Giggenbach. “Hopefully, with this success, we can stop wrongful convictions based on faulty and misleading science." “The work I did with WVIP was some of the most rewarding I’ve done in my life. I’m thankful that the West Virginia justice system was able to see the controversy behind Shaken Baby Syndrome, and more specifically, the constitutional issues that plagued Mr. Dodrill’s trial,” said Swearingen, who is now a lawyer in the U.S. Navy Judge Advocate General’s Corps. The West Virginia Innocence Project Law Clinic is funded, in part, by Wilson, Frame and Metheney, PLLC."

The entire story can be read at the link below:
https://wvutoday.wvu.edu/stories/2018/08/16/west-virginia-innocence-project-client-freed-from-prison?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+wvu%2FilPI+%28Press+Releases%29


---------------------------------------------------------------------

Technology Series: (Part 10): China's aggressive surveillance technology will spread beyond its borders, Slate reports. (Reporters Daniel Benaim and Hollie Russon Gilman)..."Today, a new wave of Chinese-led technological advances threatens to blossom into what we consider an "Arab spring in reverse"—in which the next digital wave shifts the pendulum back, enabling state domination and repression at a staggering scale and algorithmic effectiveness. Americans are absolutely right to be urgently focused on countering Russian weaponized hacking and leaking as its primary beneficiary sits in the Oval Office. But we also need to be more proactive in countering the tools of algorithmic authoritarianism that will shape the worldwide future of individual freedom."



PUBLISHER'S NOTE: Artificial intelligence, once the stuff of science fiction, has become all too real in our modern society - especially in the American criminal justice system. As the ACLU's Lee Rowland puts it: "Today, artificial intelligence. It's everywhere — in our homes, in our cars, our offices, and of course online. So maybe it should come as no surprise that government decisions are also being outsourced to computer code. In one Pennsylvania county, for example, child and family services uses digital tools to assess the likelihood that a child is at risk of abuse. Los Angeles contracts with the data giant Palantir to engage in predictive policing, in which algorithms identify residents who might commit future crimes. Local police departments are buying Amazon's facial recognition tool, which can automatically identify people as they go about their lives in public." The algorithm is finding its place deeper and deeper in the nation's courtrooms, on what used to be the exclusive decisions of judges, such as bail and even the sentence to be imposed. I am pleased to see that a dialogue has begun on the effect that the increasing use of these algorithms in our criminal justice systems is having on our society and on the quality of decision-making inside courtrooms. As Lee Rowland asks about this brave new world, "What does all this mean for our civil liberties and how do we exercise oversight of an algorithm?" In view of the importance of these issues - and the increasing use of artificial intelligence by countries for surveillance of their citizens - it's time for yet another technology series on The Charles Smith Blog, focusing on the impact of science on society and criminal justice. Up to now I have been identifying the appearance of these technologies. Now at last I can report on the realization that some of them may be two-edged swords - and on growing pushback.


Harold Levy: Publisher; The Charles Smith Blog:

------------------------------------------------------------

PASSAGE OF THE DAY: "Part of what makes technologically enabled authoritarianism so complex is that the tools also have immense promise to serve customers and citizens well. They're double-edged swords. Consider the surreal case of the California-based suspected serial killer apprehended after his relative voluntarily submitted DNA to an online ancestry database that matched material at the crime scene—or the accused Maryland newsroom shooter quickly identified by facial recognition. Or the lower-caste Indians who can now receive government benefits thanks to India's national ID Aadhaar program, which relies on a database that has collected the iris scans and fingerprints of more than 1 billion Indians in a country where hundreds of millions previously lacked state identity cards. Or the Londoners kept safe by massive numbers of CCTV cameras. Or the predictive policing pilot launched in New Orleans with the pro-bono help of Palantir. Even in democracies with meaningful legal checks on state power, leveraging A.I. for policing often suffers from a lack of transparency, citizen input, and a serious risk of biased enforcement and overreach. China and a few small, advanced, authoritarian states such as Singapore (which is soliciting Chinese bids to install 110,000 advanced facial recognition sensors on the small city-state's lampposts) and the United Arab Emirates are at the forefront of the application of these technologies. But as China embarks on a trillion-dollar global infrastructure construction binge known as the Belt and Road Initiative, it is already exporting its own tech-enabled authoritarian toolkit to gain profit or goodwill with local authorities, or simply to extend the reach of its own surveillance. What happens when these technologies migrate to bigger, more fractious societies?"

-------------------------------------------------------------------

PASSAGE OF THE DAY: (2): "China and other determined authoritarian states may prove undeterrable in their zeal to adopt repressive technologies. A more realistic goal, as Georgetown University scholar Nicholas Wright has argued, is to sway countries on the fence by pointing out the reputational costs of repression and supporting those who are advocating for civil liberties in this domain within their own countries. Democracy promoters (which we hope will one day again include the White House) will also want to recognize the coming changes to the authoritarian public sphere. They can start now in helping vulnerable populations and civil society to gain greater technological literacy to advocate for their rights in new domains. It is not too early for governments and civil society groups alike to study what technological and tactical countermeasures exist to circumvent and disrupt new authoritarian tools. Everyone will have to approach these developments with the humbling recognition that Silicon Valley is not the only game in town."

------------------------------------------------------------------------

STORY: "China's Aggressive Surveillance Technology Will Spread Beyond Its Borders," by Daniel Benaim and Hollie Russon Gilman, published by Slate.

SUB-HEADING:  "Now is the time for liberal democracies to grapple with the privacy-invading power of algorithmic authoritarianism."

PHOTO CAPTION: "A police officer wears a pair of smart glasses with a facial recognition system at Zhengzhou East Railway Station in China's central Henan province on Feb. 5."
 
GIST:  "The Chinese government has wholeheartedly embraced surveillance technology to exercise control over its citizenry in ways both big and small. It's facial-scanning passers-by to arrest criminals at train stations, gas pumps, and sports stadiums and broadcasting the names of individual jaywalkers. Government-maintained social credit scores affect Chinese citizens' rights and privileges if they associate with dissidents. In Tibet and Xinjiang, the government is using facial recognition and big data to surveil the physical movements of ethnic minorities, individually and collectively, to predict and police demonstrations before they even start. China is even using facial recognition to prevent the overuse of toilet paper in some public bathrooms. We may soon see dictators in other countries use these sorts of tools, too. If American cities and states are laboratories of democracy, China's remote provinces have become laboratories of authoritarianism. China is now exporting internationally a suite of surveillance, facial recognition, and data tools that together equip governments to repress citizens on a scale and with a ruthless algorithmic effectiveness that previous generations of strongmen could only dream of. Call it algorithmic authoritarianism. Where yesterday's strongmen were constrained by individual informants and case-by-case sleuthing, tomorrow's authoritarians will, like China, be able to remotely identify thousands of specific individuals in public via cameras, constantly track them, and use unprecedented artificial intelligence and computing to crunch surveillance information and feed it back into the field in real time. This technology is still being imperfectly and inconsistently applied, but China is working to close the gaps. And even the perception of surveillance where it doesn't exist has been shown to shape behavior. The limits of China's willingness to use these tools at home or export them to others are unknown.
Worse still, China's digital authoritarianism could emerge as an exportable model, appealing to leaders on the fence about democratic norms, that could undercut or even rival liberal democracy. Part of what makes technologically enabled authoritarianism so complex is that the tools also have immense promise to serve customers and citizens well. They're double-edged swords. Consider the surreal case of the California-based suspected serial killer apprehended after his relative voluntarily submitted DNA to an online ancestry database that matched material at the crime scene—or the accused Maryland newsroom shooter quickly identified by facial recognition. Or the lower-caste Indians who can now receive government benefits thanks to India's national ID Aadhaar program, which relies on a database that has collected the iris scans and fingerprints of more than 1 billion Indians in a country where hundreds of millions previously lacked state identity cards. Or the Londoners kept safe by massive numbers of CCTV cameras. Or the predictive policing pilot launched in New Orleans with the pro-bono help of Palantir. Even in democracies with meaningful legal checks on state power, leveraging A.I. for policing often suffers from a lack of transparency, citizen input, and a serious risk of biased enforcement and overreach. China and a few small, advanced, authoritarian states such as Singapore (which is soliciting Chinese bids to install 110,000 advanced facial recognition sensors on the small city-state's lampposts) and the United Arab Emirates are at the forefront of the application of these technologies. But as China embarks on a trillion-dollar global infrastructure construction binge known as the Belt and Road Initiative, it is already exporting its own tech-enabled authoritarian toolkit to gain profit or goodwill with local authorities, or simply to extend the reach of its own surveillance. What happens when these technologies migrate to bigger, more fractious societies? 
This won't happen overnight—and the financial and logistical obstacles to broad implementation are significant. But there is every reason to think that in a decade or two, if not sooner, authoritarians and would-be strongmen in places like Turkey, Hungary, Egypt, or Rwanda will seek these tools and use them to thwart civil society and crush dissent in ways that weaken democracy globally. Already there are reports that Zimbabwe, for example, is turning to Chinese firms to implement nationwide facial-recognition and surveillance programs, wrapped into China's infrastructure investments and a larger set of security agreements as well, including for policing online communication. The acquisition of black African faces will help China's tech sector improve its overall data set. Malaysia, too, announced new partnerships this spring with China to equip police with wearable facial-recognition cameras. There are quiet reports of Arab Gulf countries turning to China not just for the drone technologies America has denied but also for the authoritarian suite of surveillance, recognition, and data tools perfected in China's provinces. In a recent article on Egypt's military-led efforts to build a new capital city beyond Cairo's chaos and revolutionary squares, a retired general acting as project spokesman declared, "a smart city means a safe city, with cameras and sensors everywhere. There will be a command center to control the entire city." Who is financing construction? China. While many governments are making attempts to secure this information, there have been several alarming stories of data leaks. Moreover, these national identifiers create an unprecedented opportunity for state surveillance at scale. What about collecting biometric information in nondemocratic regimes? In 2016, the personal details of nearly 50 million people in Turkey were leaked. 
Now is the time for those invested in individual freedom—in government, in civil society, and in the tech sector—to be thinking about the challenges ahead. This starts with basic transparency and awareness at home, in international fora, and ultimately inside nations deciding how and whether to adopt the tools of algorithmic authoritarianism. Diplomats, CEOs, activists, and others will need to use their various bully pulpits to reach members of the public. Oversight bodies like the U.S. Congress and European Parliament should convene hearings to hold tech companies and government agencies accountable for their role in exporting elements of the authoritarian toolkit in search of profits or market share. Forging reasonable, balanced approaches to these new technologies at home will be a crucial aspect of pushing other states to do the same. As a recent blog post from Microsoft's President Brad Smith essentially calling for intensive study and regulation of this space makes clear, now is the time to bridge the tech-policy divide to find feasible, ethical solutions. Reaching beyond established democracies to set international norms and standards will be difficult, but it is essential to try. An international body, be it the European Union or United Nations, will need to put forward a set of best practices for protecting individual rights in an era of facial recognition. Companies and countries alike can group together to commit to protecting citizens by placing limits on facial recognition, offering the right in some instances to opt out of sharing biologically identifiable information, as Indians fought for and won, or protecting identifying data on the back end. China and other determined authoritarian states may prove undeterrable in their zeal to adopt repressive technologies. 
A more realistic goal, as Georgetown University scholar Nicholas Wright has argued, is to sway countries on the fence by pointing out the reputational costs of repression and supporting those who are advocating for civil liberties in this domain within their own countries. Democracy promoters (which we hope will one day again include the White House) will also want to recognize the coming changes to the authoritarian public sphere. They can start now in helping vulnerable populations and civil society to gain greater technological literacy to advocate for their rights in new domains. It is not too early for governments and civil society groups alike to study what technological and tactical countermeasures exist to circumvent and disrupt new authoritarian tools. Everyone will have to approach these developments with the humbling recognition that Silicon Valley is not the only game in town. Regardless of what happens stateside or in Europe, China will have formidable and growing indigenous capabilities to export to the rest of the world. Seven years ago, techno-optimists expressed hope that a wave of new digital tools for social networking and self-expression could help young people in the Middle East and elsewhere to find their voices. Today, a new wave of Chinese-led technological advances threatens to blossom into what we consider an "Arab spring in reverse"—in which the next digital wave shifts the pendulum back, enabling state domination and repression at a staggering scale and algorithmic effectiveness. Americans are absolutely right to be urgently focused on countering Russian weaponized hacking and leaking as its primary beneficiary sits in the Oval Office. But we also need to be more proactive in countering the tools of algorithmic authoritarianism that will shape the worldwide future of individual freedom."

The entire story can be read at the link below:  

https://slate.com/technology/2018/08/chinas-export-of-cutting-edge-surveillance-and-facial-recognition-technology-will-empower-authoritarians-worldwide.html





---------------------------------------------------------------------

Thursday, August 16, 2018

Massachusetts: Unreliable breathalyzer results: District attorneys have agreed to toss breath test results in thousands of drunken driving cases..."The breath test controversy comes on the heels of scandals in the state crime lab involving chemists Annie Dookhan and Sonja Farak, who in unrelated cases were found to have tainted or tampered with thousands of drug samples at their respective labs. Their conduct resulted in the dismissal of nearly 29,000 drug cases." (MassLive: Reporter Patrick Johnson);


PUBLISHER'S NOTE: Support freedom of the press: August 16, 2018. A day set aside by my American counterparts to denounce the onslaught on freedom of the press launched by President Trump and The Republican Party. It is painful to see my heroes and role-models - the watchdogs over abuse of power - subjected to such degradation by none other than the President of the U.S.A. The actions of Donald Trump reinforce the importance of an independent press in a democracy - and the need to hold politicians accountable to the public. Trump repeatedly refers to journalists as "enemies of the people." Perhaps if he looked in the mirror he would discover who the enemy of the people really is.

Harold Levy: Publisher of the Charles Smith Blog;

-------------------------------------------------------------

PASSAGE OF THE DAY: "The move is part of an agreement between the DAs and lawyers Thomas Workman of Taunton and Joseph Bernard of Springfield, who are challenging the validity of breath tests as evidence in drunken driving cases. Questions about the reliability of results from any Draeger 9510 breath test machine, used by more than 400 state and local law enforcement agencies statewide, have already led to a 2017 court order excluding the results in more than 19,000 drunken driving cases between June 1, 2012 and Sept. 14, 2014. The new agreement expands that window for excluding breath test results by nearly another three years. It was submitted to the judge in the case, Concord District Court Justice Robert Brennan, on Tuesday, and he must still decide to accept it. Brennan ruled Feb. 17, 2017 that while the Draeger 9510 machines are reliable, the state's protocols for calibrating them were careless. He ruled test results from within that window could still be used, but only on a case-by-case basis when prosecutors could demonstrate the tests were done on a certified machine that had been properly calibrated. The agreement results from a controversy that arose last year when it was determined the state Office of Alcohol Testing, the agency within the State Crime Lab that oversees breath testing technology, failed to submit to the court some 400 documents detailing problems with calibration of the devices. The head of the lab at that time has since been fired. "We learned the state withheld evidence," Workman said. "That's a very bad thing to do in court."

-----------------------------------------------------------

STORY: "Massachusetts district attorneys agree to toss breath test results in thousands of drunken driving cases," by reporter Patrick Johnson, published by MassLive on August 15, 2018.

GIST: "Breathalyzer results could be tossed out as evidence in thousands of drunken driving prosecutions as part of an agreement between all of the state's district attorneys and the defense lawyers in a long-running case challenging the reliability of the testing devices. Each of the state's 11 district attorneys has agreed not to use breath test results in drunken driving prosecutions for arrests before Aug. 31, 2017. The only exceptions are cases involving death or serious injury, or anyone facing charges for a fifth offense or higher. The move is part of an agreement between the DAs and lawyers Thomas Workman of Taunton and Joseph Bernard of Springfield, who are challenging the validity of breath tests as evidence in drunken driving cases. Questions about the reliability of results from any Draeger 9510 breath test machine, used by more than 400 state and local law enforcement agencies statewide, have already led to a 2017 court order excluding the results in more than 19,000 drunken driving cases between June 1, 2012 and Sept. 14, 2014. The new agreement expands that window for excluding breath test results by nearly another three years. It was submitted to the judge in the case, Concord District Court Justice Robert Brennan, on Tuesday, and he must still decide to accept it. Brennan ruled Feb. 17, 2017 that while the Draeger 9510 machines are reliable, the state's protocols for calibrating them were careless. He ruled test results from within that window could still be used, but only on a case-by-case basis when prosecutors could demonstrate the tests were done on a certified machine that had been properly calibrated. The agreement results from a controversy that arose last year when it was determined the state Office of Alcohol Testing, the agency within the State Crime Lab that oversees breath testing technology, failed to submit to the court some 400 documents detailing problems with calibration of the devices.
The head of the lab at that time has since been fired. "We learned the state withheld evidence," Workman said. "That's a very bad thing to do in court." Workman said the 2017 order covered 19,000 cases. The DAs agreeing to expand the exclusion window by nearly three years means the number of affected cases will increase to around 36,000. And if the judge agrees to the defense attorneys' request to extend the window further, the number of cases could expand to well over 40,000, Workman said. Through the end of 2017, the number of drunken driving cases involving a breath test totaled more than 39,000. He and Bernard are requesting that Brennan not allow the use of breath tests as evidence until the Office of Alcohol Testing applies for and obtains a national accreditation. They say the office is the only part of the state crime lab that is not accredited. "To correct the deficiencies that exist, the (Office of Alcohol Testing) must become accredited," Bernard said. The agreement calls for the office to apply for accreditation by next August. Brennan has yet to make a decision on the matter, and a hearing is scheduled next week. The breath test controversy comes on the heels of scandals in the state crime lab involving chemists Anne Dookhan and Sonja Farak, who in unrelated cases were found to have tainted or tampered with thousands of drug samples at their respective labs. Their conduct resulted in the dismissal of nearly 29,000 drug cases. Excluding breath test results does not mean the state has to dismiss the charges in related drunken driving cases. Prosecutors can still introduce other evidence such as results of a failed field sobriety test, or the officer's observations that a driver had trouble standing, had glassy eyes or smelled of alcohol. Bernard said problems with the testing and with the crime lab are unacceptable.
"Our justice system and public deserve more," he said, adding that defendants in drunken driving cases can go to jail, lose their jobs and lose their right to drive. "The Office of Alcohol Testing is directly responsible for insuring that breath tests provide accurate and precise results," he said. "To be trusted by the public, the evidence used in the justice system has to be correct. If not, then the trial is not fair." Workman said that until the office is accredited, questions will hang over all breath test results introduced at trial. Until the problems arose, any test showing a blood-alcohol reading of .08 percent -- the legal definition of intoxication in Massachusetts -- was a legal slam dunk for prosecutors. "The public and juries wants to believe the machines," he said. "But what do you do when the machine is wrong or has not been calibrated correctly?" The Massachusetts Executive Office of Public Safety and Security on Wednesday issued a statement on the matter. It also sent a letter to each district attorney. "The integrity and accuracy of breath test instruments in use across the Commonwealth at this time has never been determined to be an issue in this case and we stand behind these instruments' ability to accurately determine the breath alcohol level of drivers charged with operating under the influence, " it said. "The Office of Alcohol Testing has been working diligently to improve transparency by increasing the availability and accuracy of documents and data in its possession." Law Enforcement continues to use the breath tests in cases of suspected drunken driving, and the Office of Alcohol Testing continues to certify them. "We maintain full confidence in the integrity and scientific reliability of these instruments as well as the Office of Alcohol Testing's ability to certify them," the statement read. 
A statement by Berkshire District Attorney Paul Caccaviello said prosecutors have agreed to allow the court to determine the period of time in which "breath test results are not automatically admissible" in OUI prosecutions. "This is a mutual effort to resolve the litigation that has delayed the criminal trials of numerous OUI defendants throughout the commonwealth," he said. Hampden District Attorney Anthony Gulluni could not be reached for comment. Northwestern District Attorney David Sullivan declined comment. His office referred questions to the Suffolk District Attorney's Office, which is more involved in the case. Suffolk County Assistant District Attorney Vincent DeMore told WBUR in Boston on Tuesday that the proposed agreement requires the Office of Alcohol Testing to become accredited and should resolve concerns about the accuracy of testing equipment. "I think far from it being a situation that should shake the confidence of the public, it should be an area where we should have greater confidence in the reliability of the instrument," he said."

The entire story can be found at:
https://www.masslive.com/news/index.ssf/2018/08/mass_das_agreement_to_exclude.html

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. 
Harold Levy: Publisher; The Charles Smith Blog;
---------------------------------------------------------------------

Technology Series: (Part 9): Engineering prof. Cherri Pancake (Professor Emeritus of Electrical Engineering & Computer Science, Oregon State University) says programmers need ethics when designing the technologies that influence people’s lives ..."More and more software is being developed to run with little or no input or human understanding, producing analytical results to guide decision-making, such as when to approve bank loans. The outputs can have completely unintended social effects, skewed against whole classes of people – like recent cases where data-mining predictions of who would default on a loan showed biases against people who seek longer-term loans or live in particular areas. There are also dangers of what are called “false positives,” when a computer links two things that shouldn’t be connected – as when facial recognition software recently matched members of Congress to criminals’ mug shots."





-------------------------------------------------------------

PUBLISHER'S NOTE: Artificial intelligence, once the stuff of science fiction, has become all too real in our modern society - especially in the American criminal justice system. As the ACLU's Lee Rowland puts it: "Today, artificial intelligence. It's everywhere — in our homes, in our cars, our offices, and of course online. So maybe it should come as no surprise that government decisions are also being outsourced to computer code. In one Pennsylvania county, for example, child and family services uses digital tools to assess the likelihood that a child is at risk of abuse. Los Angeles contracts with the data giant Palantir to engage in predictive policing, in which algorithms identify residents who might commit future crimes. Local police departments are buying Amazon's facial recognition tool, which can automatically identify people as they go about their lives in public." The algorithm is finding its place deeper and deeper in the nation's courtrooms, reaching decisions that used to be the exclusive province of judges, such as bail and even the sentence to be imposed. I am pleased to see that a dialogue has begun on the effect that the increasing use of these algorithms in our criminal justice systems is having on our society and on the quality of decision-making inside courtrooms. As Lee Rowland asks about this brave new world, "What does all this mean for our civil liberties and how do we exercise oversight of an algorithm?" In view of the importance of these issues - and the increasing use of artificial intelligence by countries for surveillance of their citizens - it's time for yet another technology series on The Charles Smith Blog, focusing on the impact of science on society and criminal justice. Up to now I have been identifying the appearance of these technologies. Now at last I can report on the realization that some of them may be two-edged swords - and on growing pushback.

Harold Levy: Publisher; The Charles Smith Blog;

------------------------------------------------------------

PASSAGE OF THE DAY: "Computing professionals are on the front lines of almost every aspect of the modern world. They’re involved in the response when hackers steal the personal information of hundreds of thousands of people from a large corporation. Their work can protect – or jeopardize – critical infrastructure like electrical grids and transportation lines. And the algorithms they write may determine who gets a job, who is approved for a bank loan or who gets released on bail. Technological professionals are the first, and last, lines of defense against the misuse of technology. Nobody else understands the systems as well, and nobody else is in a position to protect specific data elements or ensure the connections between one component and another are appropriate, safe and reliable. As the role of computing continues its decades-long expansion in society, computer scientists are central to what happens next. That’s why the world’s largest organization of computer scientists and engineers, the Association for Computing Machinery, of which I am president, has issued a new code of ethics for computing professionals."

COMMENTARY: "Programmers need ethics when designing the technologies that influence people’s lives," by Cherri M. Pancake, published by The Conversation on August 8, 2018. (Professor Emeritus of Electrical Engineering & Computer Science, Oregon State University);

GIST:  "Computing professionals are on the front lines of almost every aspect of the modern world. They’re involved in the response when hackers steal the personal information of hundreds of thousands of people from a large corporation. Their work can protect – or jeopardize – critical infrastructure like electrical grids and transportation lines. And the algorithms they write may determine who gets a job, who is approved for a bank loan or who gets released on bail. Technological professionals are the first, and last, lines of defense against the misuse of technology. Nobody else understands the systems as well, and nobody else is in a position to protect specific data elements or ensure the connections between one component and another are appropriate, safe and reliable. As the role of computing continues its decades-long expansion in society, computer scientists are central to what happens next. That’s why the world’s largest organization of computer scientists and engineers, the Association for Computing Machinery, of which I am president, has issued a new code of ethics for computing professionals. And it’s why ACM is taking other steps to help technologists engage with ethical questions. Serving the public interest: A code of ethics is more than just a document on paper. There are hundreds of examples of the core values and standards to which every member of a field is held – including for organist guilds and outdoor advertising associations. The world’s oldest code of ethics is also its most famous: the Hippocratic oath medical doctors take, promising to care responsibly for their patients. I suspect that one reason for the Hippocratic oath’s fame is how personal medical treatment can be, with people’s lives hanging in the balance. It’s important for patients to feel confident their medical caregivers have their interests firmly in mind. Technology is, in many ways, similarly personal. In modern society computers, software and digital data are everywhere. 
They’re visible in laptops and smartphones, social media and video conferencing, but they’re also hidden inside the devices that help manage people’s daily lives, from thermostats to timers on coffee makers. New developments in autonomous vehicles, sensor networks and machine learning mean computing will play an even more central role in everyday life in coming years. A changing profession: As the creators of these technologies, computing professionals have helped usher in the new and richly vibrant rhythms of modern life. But as computers become increasingly interwoven into the fabric of life, we in the profession must personally recommit to serving society through ethical conduct. ACM’s last code of ethics was adopted in 1992, when many people saw computing work as purely technical. The internet was in its infancy and people were just beginning to understand the value of being able to aggregate and distribute information widely. It would still be years before artificial intelligence and machine learning had applications outside research labs. Today, technologists’ work can affect the lives and livelihoods of people in ways that may be unintended, even unpredictable. I’m not an ethicist by training, but it’s clear to me that anyone in today’s computing field can benefit from guidance on ethical thinking and behavior. Updates to the code: ACM’s new ethics code has several important differences from the 1992 version. One has to do with unintended consequences. In the 1970s and 1980s, technologists built software or systems whose effects were limited to specific locations or circumstances. But over the past two decades, it has become clear that as technologies evolve, they can be applied in contexts very different from the original intent. 
For example, computer vision research has led to ways of creating 3D models of objects – and people – based on 2D images, but it was never intended to be used in conjunction with machine learning in surveillance or drone applications. The old ethics code asked software developers to be sure a program would actually do what they said it would. The new version also exhorts developers to explicitly evaluate their work to identify potentially harmful side effects or potential for misuse. Another example has to do with human interaction. In 1992, most software was being developed by trained programmers to run operating systems, databases and other basic computing functions. Today, many applications rely on user interfaces to interact directly with a potentially vast number of people. The updated code of ethics includes more detailed considerations about the needs and sensitivities of very diverse potential users – including discussing discrimination, exclusion and harassment. The revised code exhorts technologists to take special care to avoid creating systems with the potential to oppress or disenfranchise whole groups of people. Living ethics in technology: The code was revised over the course of more than two years, including ACM members and people outside the organization and even outside the computing and technological professions. All these perspectives made the code better. For example, a government-employed weapons designer asked whether that job inherently required violating the code; the wording was changed to clarify that systems must be “consistent with the public good.” Now that the code is out, there’s more to do. ACM has created a repository for case studies showing how ethical thinking and the guidelines can be applied in a variety of real-world situations. The group’s “Ask An Ethicist” blog and video series invites the public to submit scenarios or quandaries as they arise in practice. 
Work is also underway to develop teaching modules so the concepts can be integrated into computing education from primary school through university. Feedback has been overwhelmingly positive. My personal favorite was the comment from a young programmer after reading the code: "Now I know what to tell my boss if he asks me to do something like that again." The ACM Code of Ethics and Professional Conduct begins with the statement, "Computing professionals' actions change the world." We don't know if our code will last as long as the Hippocratic oath. But it highlights how important it is that the global computing community understands the impact our work has – and takes seriously our obligation to the public good."
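The "false positives" Pancake flags - as when facial recognition software matched members of Congress to criminals' mug shots - come down to base-rate arithmetic, and a short sketch can make the point concrete. The helper function and all the numbers below are purely hypothetical illustrations, not drawn from any real system:

```python
# Toy illustration of the base-rate problem behind facial-recognition
# "false positives": even a seemingly accurate matcher produces mostly
# wrong hits when true matches are rare. All numbers are hypothetical.

def expected_matches(population, true_matches, sensitivity, false_positive_rate):
    """Return (expected correct hits, expected false hits)."""
    non_matches = population - true_matches
    true_hits = true_matches * sensitivity          # real matches correctly flagged
    false_hits = non_matches * false_positive_rate  # innocents wrongly flagged
    return true_hits, false_hits

# Suppose 10,000 faces are scanned, only 10 are actually in the mugshot
# database, and the matcher catches 95% of real matches while wrongly
# flagging just 1% of everyone else.
true_hits, false_hits = expected_matches(10_000, 10, 0.95, 0.01)
print(f"expected correct hits: {true_hits:.1f}")
print(f"expected false hits:   {false_hits:.1f}")
# Roughly ten wrong flags for every correct one, despite the matcher
# being "99% accurate" on any individual face.
```

The design point is that the false-hit count scales with the size of the innocent population, not with the matcher's headline accuracy - which is why mass scanning magnifies even a small error rate into many wrongful flags.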




---------------------------------------------------------------------