Tuesday, November 10, 2015

Predictive policing (and predictive prison sentencing): Shaun King calls it "technological racism." "The NYPD uses technology that apparently predicts where crimes will happen based on prior arrest data." The New York Daily News story takes us even further into the future as it notes that, "in addition to predictive policing, the state of Pennsylvania is pioneering predictive prison sentencing."
"The future is here. For years now, the NYPD, the Miami PD, and many police departments around the country have been using new technology that claims it can predict where crime will happen and where police should focus their energies. They call it predictive policing. Months ago, I raised several red flags about such software because it does not appear to properly account for the presence of racism or racial profiling in how it predicts where crimes will be committed.
See, these systems claim to predict where crimes will happen based on prior arrest data. What they don't account for is the widespread reality that race and racial profiling have everything to do with who is arrested and where they are arrested. For instance, study after study has shown that white people actually are more likely to sell drugs and do drugs than black people, but are exponentially less likely to be arrested for either crime. But, and this is where these systems fail, if the only data being entered into systems is based not on the more complex reality of who sells and purchases drugs, but on a racial stereotype, then the system will only perpetuate the racism that preceded it. In essence, it's not predicting who will sell drugs and where they will sell it, as much as it is actually predicting where a certain race of people may sell or purchase drugs. It's technological racism at its finest.

Now, in addition to predictive policing, the state of Pennsylvania is pioneering predictive prison sentencing. Through complex questionnaires and surveys completed not by inmates, but by prison staff members, inmates may be given a smaller bail or shorter sentences or a higher bail and lengthier prison sentences. The surveys focus on family background, economic background, prior crimes, education levels and more. When all of the data is scored, the result classifies prisoners as low, medium or high risk. While this may sound benign, it isn't. No prisoner should ever be given a harsh sentence or an outrageous bail amount because of their family background or economic status. Even these surveys lend themselves to being racist and putting black and brown women and men in positions where it's nearly impossible to get a good score because of prevalent problems in communities of color."
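The feedback loop King describes can be illustrated with a toy simulation. This is a hypothetical sketch, not the NYPD's actual model, and every number in it is invented: two neighborhoods have identical underlying offense rates, but patrols are allocated in proportion to prior arrest counts, so a historical skew in arrests reproduces itself indefinitely.

```python
# Toy sketch of the feedback loop in arrest-data-driven prediction.
# All names and numbers are hypothetical assumptions for illustration.

# Assume the true offense rate is IDENTICAL in both neighborhoods.
true_offense_rate = {"A": 0.10, "B": 0.10}

# Historical arrest counts are skewed by past patrol patterns,
# not by any difference in offending.
arrests = {"A": 80, "B": 20}

def allocate_patrols(arrests, total_patrols=100):
    """'Predictive' step: send patrols in proportion to past arrests."""
    total = sum(arrests.values())
    return {hood: total_patrols * n / total for hood, n in arrests.items()}

def simulate_round(arrests):
    """One round: arrests scale with patrol presence, so the neighborhood
    with more historical arrests generates more new arrests, even though
    the underlying offense rates are equal."""
    patrols = allocate_patrols(arrests)
    for hood, p in patrols.items():
        arrests[hood] += round(p * true_offense_rate[hood])
    return arrests

for _ in range(5):
    arrests = simulate_round(arrests)

print(arrests)  # → {'A': 120, 'B': 30}: the original 4:1 skew persists
```

The simulation never measures offending directly; it only re-observes where it already looks, which is exactly the failure mode the article is pointing at: the output ratio of arrests stays locked at the input ratio regardless of the true crime rates.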
http://www.nydailynews.com/new-york/king-predictive-policing-technological-racism-article-1.2425028