STORY: "The futures of many prison inmates depend on racially biased algorithms," by reporter Selena Larson, published by the Daily Dot on May 23, 2016.
GIST: "It sounds like something out of
Minority Report: software
predicting the likelihood of people committing crimes in the future and
assigning people scores that judges and cops use to determine sentences
and bond payments But these algorithms are real and widely used in the United States—and according to
a new ProPublica report, this software is biased against African Americans. The
scores and data produced by risk-and-needs assessment tools like the
Correctional Offender Management Profiling for Alternative Sanctions
(COMPAS), which ProPublica investigated, are based on
a series of questions
that offenders answer as they move through the criminal-justice system.
(In some cases, the data also come from their arrest records.) There
are no questions about race, but the surveys include inquiries like
"How many of your friends/acquaintances have ever been arrested?", "Do
you have a regular living situation?", and "How often did you have
conflicts with teachers at school?" A
computer program analyzes the survey results and assigns a score to
each offender that represents the likelihood of them committing a future
crime.
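COMPAS's actual formula is proprietary and has never been disclosed, so the following is only a toy sketch of what a questionnaire-based risk score can look like in general; the field names, weights, and thresholds are invented for illustration and have no connection to the real tool.

```python
# Purely illustrative: COMPAS's real formula is proprietary and undisclosed.
# This toy example only shows the general shape of a questionnaire-based
# risk score: weighted survey answers summed and mapped to a risk band.

# Hypothetical answers to questions like the ones quoted above
answers = {
    "friends_arrested": 3,   # "How many of your friends/acquaintances have ever been arrested?"
    "stable_housing": 0,     # 1 if "Do you have a regular living situation?" is yes, else 0
    "school_conflicts": 2,   # "How often did you have conflicts with teachers at school?" (0-4)
}

# Invented weights; a real tool would derive these from historical data
weights = {"friends_arrested": 0.8, "stable_housing": -1.5, "school_conflicts": 0.6}

raw_score = sum(weights[key] * answers[key] for key in weights)

# Map the raw score to a coarse risk band, the kind of label judges see
if raw_score < 1:
    risk_band = "low"
elif raw_score < 3:
    risk_band = "medium"
else:
    risk_band = "high"

print(round(raw_score, 2), risk_band)  # 3.6 high
```

As ProPublica reported, offenders don't get an explanation of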
how their scores are determined, even though judges and cops rely on
them—or at least take them into account—when making important decisions
about offenders' fates. ProPublica analyzed 10,000 criminal defendants and compared their risk scores to their actual recidivism rates over a two-year period. The publication found that black defendants were regularly assigned higher risk scores than were warranted, and that black defendants who did not commit a crime within two years were twice as likely as white defendants to be misclassified as higher risk. The tool also underestimated white defendants' recidivism rates, mistakenly labeling them as lower risk twice as often as black recidivists. Other findings include:
- Even when controlling for prior crimes, future recidivism, age, and gender, black defendants were 45 percent more likely than white defendants to be assigned higher risk scores.
- Black defendants were also twice as likely as white defendants to be misclassified as being at higher risk of violent recidivism, while white violent recidivists were 63 percent more likely than black violent recidivists to have been misclassified as low risk for violent recidivism.
- In the violent-recidivism analysis, even when controlling for prior crimes, future recidivism, age, and gender, black defendants were 77 percent more likely than white defendants to be assigned higher risk scores. (A rough sketch of the kind of misclassification comparison behind these findings appears below.)
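This is only a minimal sketch, not ProPublica's actual code or data; the records and field names (a race label, a high-risk flag, a two-year reoffense outcome) are invented to show the shape of the computation.

```python
# A minimal sketch (not ProPublica's actual code) of the check described above:
# among defendants who did NOT reoffend within two years, how often was each
# group labeled high risk? Records and field names here are hypothetical.

from collections import defaultdict

# Each record: (race, labeled_high_risk, reoffended_within_two_years)
records = [
    ("black", True,  False),
    ("black", False, False),
    ("black", True,  True),
    ("white", False, False),
    ("white", True,  False),
    ("white", False, True),
    # ...the real analysis covered thousands of defendants
]

counts = defaultdict(lambda: {"did_not_reoffend": 0, "labeled_high_risk_anyway": 0})
for race, high_risk, reoffended in records:
    if not reoffended:
        counts[race]["did_not_reoffend"] += 1
        if high_risk:
            counts[race]["labeled_high_risk_anyway"] += 1

for race, c in counts.items():
    rate = c["labeled_high_risk_anyway"] / c["did_not_reoffend"]
    print(f"{race}: labeled high risk despite not reoffending: {rate:.0%}")

# The 45 percent and 77 percent figures in the findings come from a further
# step, a regression controlling for prior crimes, future recidivism, age,
# and gender, which this sketch does not attempt to reproduce.
```

University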
of Michigan law professor Sonja B. Starr, who has studied the use of
algorithm-based risk assessments, said the surveys can adversely impact low-income offenders. "They are about the defendant's
family, the defendant's demographics, about socio-economic factors the
defendant presumably would change if he could: Employment, stability,
poverty," Starr
told the Associated Press in 2015. "It's basically an explicit embrace of the state saying we should sentence people differently based on poverty." Algorithms shape everything from our
Facebook
feeds to the ads we see online to prison sentences, so it's natural
that questions are arising about whether and how they are biased. ... ProPublica's report
is a reminder that predictive algorithms are not "neutral" or "fair"
simply because they're software. And because the companies that make
them don't disclose their secret sauce, it's impossible to know how the
programs generate their results.
The entire story can be found at: