PASSAGE OF THE DAY: "Each day, the area’s hotline receives dozens of calls from people who suspect that a child is in danger; some of these are then flagged by call-centre staff for investigation. But the system does not catch all cases of abuse. Vaithianathan and her colleagues had just won a half-million-dollar contract to build an algorithm to help. After Vaithianathan invited questions from her audience, the father stood up to speak. He had struggled with drug addiction, he said, and social workers had removed a child from his home in the past. But he had been clean for some time. With a computer assessing his records, would the effort he’d made to turn his life around count for nothing? In other words: would algorithms judge him unfairly?"
-----------------------------------------------------------------
QUOTE OF THE DAY: "What these tools often do is help us tinker around the edges, but what we need is wholesale change," says Vincent Southerland, a civil-rights lawyer and racial-justice advocate at New York University's law school. That said, the robust debate around algorithms, he says, "forces us all to ask and answer these really tough fundamental questions about the systems that we're working with and the ways in which they operate."
-----------------------------------------------------------------
STORY: "Bias detectives: the researchers striving to make algorithms fair," a news feature by reporter Rachel Courtland, published by nature.com on June 20, 2018.
SUB-HEADING: "As machine learning infiltrates society, scientists are trying to help ward off injustice."
GIST: "In 2015, a worried father asked Rhema Vaithianathan a
question that still weighs on her mind. A small crowd had gathered in a
basement room in Pittsburgh, Pennsylvania, to hear her explain how
software might tackle child abuse. Each day, the area’s hotline receives
dozens of calls from people who suspect that a child is in danger; some
of these are then flagged by call-centre staff for investigation. But
the system does not catch all cases of abuse. Vaithianathan and her
colleagues had just won a half-million-dollar contract to build an
algorithm to help. Vaithianathan, a health economist who
co-directs the Centre for Social Data Analytics at the Auckland
University of Technology in New Zealand, told the crowd how the
algorithm might work. For example, a tool trained on reams of data —
including family backgrounds and criminal records — could generate risk
scores when calls come in. That could help call screeners to flag which
families to investigate. After Vaithianathan invited questions
from her audience, the father stood up to speak. He had struggled with
drug addiction, he said, and social workers had removed a child from his
home in the past. But he had been clean for some time. With a computer
assessing his records, would the effort he’d made to turn his life
around count for nothing? In other words: would algorithms judge him
unfairly? Vaithianathan assured him that a human would always be in
the loop, so his efforts would not be overlooked. But now that the
automated tool has been deployed, she still thinks about his question.
Computer calculations are increasingly being used to steer potentially
life-changing decisions, including which people to detain after they
have been charged with a crime; which families to investigate for
potential child abuse, and — in a trend called ‘predictive policing’ —
which neighbourhoods police should focus on. These tools promise to make
decisions more consistent, accurate and rigorous. But oversight is
limited: no one knows how many are in use. And their potential for
unfairness is raising alarm. In 2016, for instance, US journalists
argued that a system used to assess the risk of future criminal activity
discriminates against black defendants. “What concerns me most is
the idea that we’re coming up with systems that are supposed to
ameliorate problems [but] that might end up exacerbating them,” says
Kate Crawford, co-founder of the AI Now Institute, a research centre at
New York University that studies the social implications of artificial
intelligence..........Some
researchers are already calling for a step back, in criminal-justice
applications and other areas, from a narrow focus on building algorithms
that make forecasts. A tool might be good at predicting who will fail
to appear in court, for example. But it might be better to ask why
people don’t appear and, perhaps, to devise interventions, such as text
reminders or transportation assistance, that might improve appearance
rates. “What these tools often do is help us tinker around the edges,
but what we need is wholesale change,” says Vincent Southerland, a
civil-rights lawyer and racial-justice advocate at New York University’s
law school. That said, the robust debate around algorithms, he says,
“forces us all to ask and answer these really tough fundamental
questions about the systems that we’re working with and the ways in
which they operate”. Vaithianathan, who is now in the process of
extending her child-abuse prediction model to Douglas and Larimer
counties in Colorado, sees value in building better algorithms, even if
the overarching system they are embedded in is flawed. That said,
“algorithms can’t be helicopter-dropped into these complex systems”, she
says: they must be implemented with the help of people who understand
the wider context. But even the best efforts will face challenges, so in
the absence of straight answers and perfect solutions, she says,
transparency is the best policy. “I always say: if you can’t be right,
be honest.”"
The entire news feature can be read at:
https://www.nature.com/articles/d41586-018-05469-3
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy; Publisher; The Charles Smith Blog;