Are California Police Departments Quietly Backing Away from Predictive Policing?

A police escort on July 22, 2007. (Sgrace / Flickr)

Nearly a decade ago, Santa Cruz was one of the first cities in the U.S. to adopt what's called "predictive policing." Now it's one of the first to enact a ban. That may be because police departments are beginning to quietly back away from the practice.

In truth, the Santa Cruz Police Department dropped predictive policing in 2017, when Andy Mills started as police chief. From the beginning, he spoke of a more community-focused approach to the job. Interviewed for a story about his first day on the job, he told local TV news station KSBW he wanted to focus on "tactical de-escalation." He was one of a number of California police chiefs to take a knee with Black Lives Matter protesters this past spring.

When the Santa Cruz City Council took up a new ordinance banning predictive policing and facial recognition software on June 23, 2020, Chief Mills expressed unreserved support. "Predictive policing has been shown over time to put officers in conflict with communities rather than working with the communities," he said. He backed the mayor's proposal to ban the technologies "until such time that it can be peer reviewed and scientifically proven." The vote for the ban was unanimous.

Among those watching from elsewhere in the Bay was Brian Hofer, who chairs Oakland's privacy commission and heads the nonprofit Secure Justice. He sees the vote in Santa Cruz as the latest referendum on the effectiveness of predictive policing.


"You guys told us this was going to save lives. Where's the data?" he said, noting conversations bubbling up in Oakland and San Diego that may evolve into efforts to ban predictive policing elsewhere in California. Hofer helped craft Oakland's ban on facial recognition technology, and consults with other cities reassessing their policing contracts and policies.

Santa Cruz Mayor Justin Cummings proposed the ban on predictive policing, unanimously approved by the city council on June 23, 2020. (Courtesy of the City of Santa Cruz)

"We fall for the marketing hype, go release this stuff out into the wild without understanding it, and then never really demand a cost-benefit analysis," Hofer said, noting that a number of studies and audits in recent years have found predictive policing and other data-driven policing solutions sold by Silicon Valley companies like PredPol and Palantir have yet to deliver on their promises.

Predictive policing uses computer modeling to anticipate crime and to manage when and where police officers are deployed. Algorithms analyze historical data to predict where certain crimes may occur (aka "hotspots") and possibly who may be involved in a future crime. That, at least, is the promise.
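In its simplest form, that promise rests on counting where crimes were recorded in the past and projecting those counts forward. The toy sketch below illustrates that basic idea only; it is not PredPol's actual algorithm, and the grid cells and incident data are hypothetical. It also shows the feedback loop critics describe: if the input data reflects where officers were sent rather than where crime actually occurred, the model simply sends them back.

```python
from collections import Counter

def predict_hotspots(incident_cells, top_n=3):
    """Count past incidents per grid cell and return the most frequent
    cells -- the 'hotspots' a system like this would flag for patrol.

    Note the feedback loop: if incident_cells reflects past patrol
    patterns rather than underlying crime, the 'prediction' just
    reproduces those patterns.
    """
    return [cell for cell, _ in Counter(incident_cells).most_common(top_n)]

# Hypothetical historical incidents, each tagged with the map grid cell
# where it was recorded
history = ["cell_A", "cell_A", "cell_B", "cell_A", "cell_C", "cell_B"]
print(predict_hotspots(history, top_n=2))  # ['cell_A', 'cell_B']
```

Real systems layer statistical models on top of this kind of counting, but the dependence on historical police records, and everything those records encode, is the same.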

“It is a common fallacy that police data is objective and reflects actual criminal behavior, patterns, or other indicators of concern to public safety in a given jurisdiction,” researchers from New York University wrote in a 2019 study they titled Dirty Data, Bad Predictions. “In reality, police data reflects the practices, policies, biases, and political and financial accounting needs of a given department,” they wrote.

Santa Cruz Mayor Justin Cummings referenced this consensus growing among academics as well as activists when he brought the ban proposal forward in June. "If policing itself is biased, then the data that's informing those models will be biased," he said.

Not Just Santa Cruz

Former LAPD Chief and New York City Police Commissioner Bill Bratton was the biggest early booster of predictive policing on the national stage, starting in 1994: first with the crime-tracking system CompStat, later with a host of analytical software products and strategies. He's championed what he describes as the "evolution of policing."

Bratton was first brought in to consult for the LAPD, then hired by then-Mayor Jim Hahn to lead the department in 2002. A decade later, he led an investigation into Oakland's implementation of CompStat and issued a scathing critique in 2012.

But while his data-driven reforms were praised at the time, the NYPD has since been court-ordered to produce records about the testing, development and use of its predictive analytics tools. And just a few months ago, the LAPD pulled out of two of its predictive policing contracts, including one with PredPol, the same company Santa Cruz was using.

The LAPD publicly blamed pandemic-era budget restrictions for ending the contract, but as MuckRock reported, an internal audit couldn't determine whether the programs were effective. The 48-page report, issued last March by the office of Inspector General Mark Smith, found the department needs tougher standards for data collection, record keeping and communicating its policies to the public to guard against targeting minorities and certain neighborhoods.

"After ten years, with enough public pressure and community concern, it was an easy decision to just pull the plug," said Andrew Ferguson, a law professor at American University and author of The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement.

Whether one takes issue with data-driven policing itself or the way it's implemented, there have been a host of issues with the way many California police departments have gathered and maintained databases.

A 2016 state audit of California’s CalGang database, for instance, found a multitude of errors, unsubstantiated claims of gang involvement, and hundreds of files lingering long after their legally mandated purge date.

In a statement, PredPol CEO Brian Macdonald took issue with the way predictive policing has been characterized in recent debates. "It should be noted that the 'predictive' aspect is only a small part of what we do," Macdonald said. "PredPol is all about bringing greater transparency to patrol operations... We allow departments to set missions — crime types to focus on — so the community can see what police resources are focused on."

Macdonald said the company never uses arrest data in its predictions, and will "never predict for crime types that allow for the possibility of officer bias (e.g. drug crimes)." He said PredPol only works with crimes involving "a clear victim," such as vehicle theft, break-ins and robberies. "We never use any demographic, racial, or personally identifiable information," he added.

But even as reappraisals of predictive policing pop up on city council agendas across California this summer, Ferguson warns that many police officers, politicians and members of the public remain fascinated with the idea that technology is the ticket to 21st century crime fighting. And he has a prediction of his own.


"The next thing we're going to see is a response to this demand for police accountability to sort of turn the surveillance gaze on police," Ferguson said. "It's sort of the Silicon Valley way to see an opening and try to pitch it."