When reporters shuffled into a
room on the seventh floor of Seattle Police Department headquarters on
October 7, we had no idea what to expect. Once inside, we whirled around
to get our bearings. A giant screen covering the closest wall flashed
with moving dots on a map and colored gauges, across from rows of
computer terminals.

What looks like a miniature version of NASA's mission control room,
it turns out, is Seattle's new information hub for "agile policing"—the
Real Time Crime Center (RTCC), funded by a $411,000 grant from the
Department of Justice, the same agency that forced the department into a
federally monitored reform process in 2012 to address concerns about
racial bias and excessive force.

The American Civil Liberties Union of Washington's director of
technology and liberty, Jared Friend, was watching a live stream of the
press conference as it unfolded. His ears perked up when he heard the
department's chief operating officer, Mike Wagers, point to one of the
screens and say, "It's trying to forecast the crime that we think will
occur." Wagers stopped abruptly and restarted: "I shouldn't say [that]...
It's forecasting what we believe, based on historical data, what crime
should be occurring out there. And then when we see the spike, [we]
redeploy as quickly as possible."

Friend had already been in discussions with Wagers about the dangers
of so-called predictive policing, and he was alarmed to see the
department launching technology that might be "forecasting" crime. The
SPD has been forced to backtrack on new technologies in the face of
privacy concerns before: A few years ago, the department excitedly
launched a drone program, only to kill it after an outcry. And
in 2013, it deactivated its citywide wireless mesh network after
The Stranger reported on ways the network could be used to conduct surveillance.

The new crime center prompted Friend to take to the ACLU's website,
writing on October 20, "Although this may sound like a smart move to
incorporate analytics technology in law enforcement, in practice it
would perpetuate existing institutional racism in policing." He called
for "proper oversight and community input."

Wagers responded curtly on Twitter: "Could not be further from the truth."

So who's right?
First, some explaining: The primary goal of
the Real Time Crime Center is to enable faster responses to 911 calls.
That big screen on the wall shows where patrol cars currently are and
breaks down the number of active calls regarding different kinds of
crime at any one time. The department stresses that the particular
program that's so concerning to the ACLU accounts for only a fraction of
the center's activity.

But how does that program work, exactly?

According to Brandon Bouier, an SPD analyst and program manager, it
uses 911 call data going back to 2008 in order to create an expected
baseline of crimes. Take, for example, a hypothetical 10 car prowls per
week on Queen Anne. The software in the crime center would use that
baseline to detect anomalies by flagging unusually high or low levels of
crime—say, a spike to 20 car prowls in a week in that area. If such a
spike were detected, the crime center's staff would examine the anomaly
and decide how to deploy police in response, in real time. "The
anomalies give us something to focus on when we have these reams of
information," Bouier said. But human beings make the final call on
whether to act on them. "It's not like we have an automated system, like
Skynet"—the artificial intelligence overlord from the
Terminator films—"that's going to detect anomalies and automatically dispatch the officers."
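Via Science's actual model isn't public, but the baseline-and-anomaly approach Bouier describes can be sketched as a simple statistical check: establish an expected level from historical call counts, then flag weeks that deviate sharply in either direction. The numbers, thresholds, and function below are invented for illustration; they are not SPD's or Via Science's method.

```python
# Hypothetical sketch of baseline anomaly detection; NOT the SPD's actual algorithm.
import statistics

def find_anomalies(weekly_counts, z_threshold=2.0):
    """Flag weeks whose call volume is unusually high or low
    relative to the historical baseline (mean +/- z_threshold * stdev)."""
    mean = statistics.mean(weekly_counts)
    stdev = statistics.stdev(weekly_counts)
    anomalies = []
    for week, count in enumerate(weekly_counts):
        if stdev > 0 and abs(count - mean) / stdev > z_threshold:
            anomalies.append((week, count))
    return anomalies

# Invented Queen Anne car-prowl counts: a steady ~10 per week, then a spike to 20.
history = [10, 9, 11, 10, 8, 12, 10, 11, 9, 20]
print(find_anomalies(history))  # prints [(9, 20)]: only the spike week is flagged
```

Note that a check like this only surfaces the anomaly; as Bouier stresses, a human analyst would still decide whether and how to respond.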
The precise method for tuning those
baselines and calculating an anomaly isn't in place yet. In fact, the
program hasn't even launched yet, though the RTCC itself is
active—something that wasn't at all clear from the department's big
press conference. Bouier says the department is waiting on Via Science, a
Boston-based data analytics firm, to deliver the software that will
handle the crime number crunching.

Friend's worry is that the 911 call data used to set the crime
baselines isn't neutral. It's racially biased because institutional
racism is real; communities of color are historically less trustful of
police, leading them to report fewer crimes; and white communities often
over-report crimes. This could create a feedback loop in which certain
neighborhoods get singled out by the software as being more prone to
crime based on biased inputs, increased police presence is triggered for
those neighborhoods based on the software's statistical analysis, and
then those neighborhoods receive heightened policing that's based more
on prejudiced 911 callers than on a true representation of crime
patterns.

I couldn't find any local data on racial bias in 911 calls, but
there's no disagreement that the calls are biased. To take one example:
Earlier this year, a purported police officer complained on a national
policing forum about a call he received of "suspicious activity." In
fact, the activity was two black men trying to jump-start their car. "If
you're going to be a racist, stereotypical jerk... keep it to
yourself," he wrote. The post went viral.

On this point, it turns out, there's no daylight between the ACLU and
the SPD. In a meeting at the new crime center on November 3, Friend
laid out his concerns. And while Bouier and Friend vehemently disagreed
on whether the terminology of "prediction" or "forecasting" is
applicable to the new technology—Bouier doesn't believe it is—he agreed
that the data itself isn't objective. "We all in the department are
fully aware of the inherent bias in the data we're working with," he
said. "That said, that's all we have. We have to work with what we
have." (Bouier is black, and he said he's been profiled by police
before.)

Friend came out of that meeting encouraged by the department's
openness. And Bouier pledged to drop Via Science as a partner if their
work isn't rigorous enough. Sean Whitcomb, a police department
spokesperson, wondered aloud whether it would be better to use incident
reports—which are based on officer investigations, instead of mere 911
calls—to create those baselines of expected crime levels.

"The department needs to internalize that the data is inherently
problematic, and it needs to think about ways to correct for those
problems," Friend said on his way out from the meeting. "And if the
value of using this data isn't really strong, and it's outweighed by
those concerns [about bias], they need to decide not to move forward
with it. The good news, based on what I heard in the meeting, is that
they're willing to do that."
http://www.thestranger.com/news/feature/2015/11/04/23105146/the-problem-with-forecasting-crime