PUBLISHER'S NOTE: In recent years, I have found myself publishing more and more posts on the application of artificial intelligence technology to policing, public safety, and the criminal justice process, not just in North America but in countries all over the world, including China. Although I accept that properly applied science can play a positive role in our society, I have learned over the years that technologies introduced for the so-called public good can eventually be used against the very people they were supposed to benefit. As reporter Sieeka Khan writes in Science Times: "In 2017, researchers sent a letter to the secretary of the US Department of Homeland Security expressing their concerns about a proposal to use AI to determine whether someone seeking refuge in the US would become a positive and contributing member of society, or would be likely to become a threat or a terrorist. Other government uses of AI are also being questioned, such as attempts at setting bail amounts and sentences for criminals, predictive policing, and hiring government workers. All of these attempts have been shown to be prone to technical problems, and limits on the data can bias their decisions on the basis of gender, race or cultural background. Other AI technologies, like automated surveillance, facial recognition and mass data collection, are raising concerns about privacy, security, accuracy and fairness in a democratic society. As Trump's executive order demonstrates, there is massive interest in harnessing AI for its full, positive potential. But the dangers of misuse, bias and abuse, whether intentional or not, can work against the principles of international democracies. As the use of artificial intelligence grows, the potential for misuse, bias and abuse grows as well."
The purpose of this 'technology' series is to highlight the dangers of artificial intelligence, and to help readers make their own assessments as to whether these innovations will do more harm than good.
Harold Levy, Publisher, The Charles Smith Blog.
----------------------------------------------------------
PASSAGE OF THE DAY: “The sharing of child-sex-abuse images is a serious crime, and law enforcement should be investigating it. But the government needs to understand how the tools work, if they could violate the law and if they are accurate,” said Sarah St. Vincent, a Human Rights Watch researcher who examined the practice. “These defendants are not very popular, but a dangerous precedent is a dangerous precedent that affects everyone. And if the government drops cases or some charges to avoid scrutiny of the software, that could prevent victims from getting justice consistently,” she said. “The government is effectively asserting sweeping surveillance powers but is then hiding from the courts what the software did and how it worked.”
------------------------------------------------------------
PASSAGE TWO OF THE DAY: "The government’s reluctance to share technology with defense attorneys isn’t limited to child pornography cases. Prosecutors have let defendants monitored with cellphone trackers known as Stingrays go free rather than fully reveal the technology. The secrecy surrounding cell tracking was once so pervasive in Baltimore that Maryland’s highest court rebuked the practice as “detrimental.” As was first reported by Reuters in 2013, the U.S. Drug Enforcement Administration relied in investigations on information gathered through domestic wiretaps, a phone-records database and National Security Agency intercepts, while training agents to hide those sources from the public record. “Courts and police are increasingly using software to make decisions in the criminal justice system about bail, sentencing, and probability-matching for DNA and other forensic tests,” said Jennifer Granick, a surveillance and cybersecurity lawyer with the American Civil Liberties Union’s Speech, Privacy and Technology Project who has studied the issue. “If the defense isn’t able to examine these techniques, then we have to just take the government’s word for it — on these complicated, sensitive and non-black-and-white decisions. And that’s just too dangerous.”
------------------------------------------------------------
STORY: "Prosecutors Dropping Child Porn Charges After Software Tools Are Questioned," by reporter Jack Gillum, published by ProPublica on April 3, 2019. (Jack Gillum is a senior reporter at ProPublica covering technology, specializing in how algorithms, big data and social media platforms affect people’s daily lives and civil rights.)
SUB-HEADING: "More than a dozen cases were dismissed after defense attorneys asked to examine, or raised doubts about, computer programs that track illegal images to internet addresses."
GIST: (This is just a portion of a lengthy story. The rest is well worth reading at the link below. HL) "Using specialized software, investigators traced explicit child pornography to Todd Hartman’s internet address. A dozen police officers raided his Los Angeles-area apartment, seized his computer and arrested him for files including a video of a man ejaculating on a 7-year-old girl. But after his lawyer contended that the software tool inappropriately accessed Hartman’s private files, and asked to examine how it worked, prosecutors dismissed the case. Near Phoenix, police with a similar detection program tracked underage porn photos, including a 4-year-old with her legs spread, to Tom Tolworthy’s home computer. He was indicted in state court on 10 counts of committing a “dangerous crime against children,” each of which carried a decade in prison if convicted. Yet when investigators checked Tolworthy’s hard drive, the images weren’t there. Even though investigators said different offensive files surfaced on another computer that he owned, the case was tossed. At a time when at least half a million laptops, tablets, phones and other devices are viewing or sharing child pornography on the internet every month, software that tracks images to specific internet connections has become a vital tool for prosecutors. Increasingly, though, it’s backfiring. Drawing upon thousands of pages of court filings as well as interviews with lawyers and experts, ProPublica found more than a dozen cases since 2011 that were dismissed either because of challenges to the software’s findings, or because of the refusal by the government or the maker to share the computer programs with defense attorneys, or both. Tami Loehrs, a forensics expert who often testifies in child pornography cases, said she is aware of more than 60 cases in which the defense strategy has focused on the software.
Defense attorneys have long complained that the government’s secrecy claims may hamstring suspects seeking to prove that the software wrongly identified them. But the growing success of their counterattack is also raising concerns that, by questioning the software used by investigators, some who trade in child pornography can avoid punishment.