Report Warns A.I. Algorithms Not Quite Ready for Prime Time in Criminal Justice


Artificial intelligence is everywhere these days, including in the criminal justice system. But a new report out Friday joins a chorus of voices warning that the software isn’t ready for the task.

"You need to understand as you're deploying these tools that they're extremely approximate, extremely inaccurate," said Peter Eckersley, research director at Partnership on A.I., a consortium of Silicon Valley heavyweights and civil liberties groups that helped published the report. "And that if you think of them as 'Minority Report,' you've gotten it entirely wrong," he added, referencing the Steven Spielberg science fiction blockbuster from 2002 that's become a kind of shorthand for all allusions to predictive policing.

The study — "Algorithmic Risk Assessment Tools in the U.S. Criminal Justice System" — scrutinizes how A.I. is increasingly being used throughout the country.

Algorithmic software crunches data about an individual along with statistics about groups that person belongs to. What level of education did this individual attain? How many criminal offenses did this individual commit before the age of 18? What is the likelihood of, say, skipping bail for individuals who never finished high school and committed two crimes before the age of 18?
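
As a rough, hypothetical illustration of the mechanics (not drawn from the report or from any actual tool), a bare-bones score might combine a few such features with fixed weights. The feature names and weights below are invented for this sketch:

```python
import math

def risk_score(education_years: int, priors_before_18: int) -> float:
    """Return a pseudo-probability of failing to appear for a court date."""
    # Invented logistic-regression-style weights: fewer years of schooling
    # and more juvenile offenses push the score higher.
    z = 1.5 - 0.2 * education_years + 0.8 * priors_before_18
    return 1.0 / (1.0 + math.exp(-z))  # squash into the range (0, 1)

# Someone with 10 years of schooling and two offenses before age 18
# scores higher than a high school graduate with no priors.
print(f"{risk_score(10, 2):.2f}")  # ~0.75
print(f"{risk_score(12, 0):.2f}")  # ~0.29
```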

It can seem like the software bypasses human error in assessing that risk. But the report homes in on the issue of machine learning bias: when humans feed biased or inaccurate information into software programs, the systems that learn from that data become biased as well.

An example (not mentioned in the report): The Stanford Open Policing Project recently reported that law enforcement officers nationwide tend to stop African-American drivers at higher rates than white drivers and to search, ticket and arrest African-American and Latino drivers during traffic stops more often than whites.

Any evaluation software that incorporates a data set like this on traffic stops could then potentially deliver racially biased recommendations, the Stanford researchers note, even if the software doesn't include racial data per se.
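
A toy sketch of how that can happen (ours, not the researchers'): train even a trivial "model" on skewed enforcement data, and a feature that merely correlates with race, such as neighborhood, reproduces the disparity. All data here is fabricated for illustration.

```python
from collections import defaultdict

# Fabricated stop records: (neighborhood, was_searched). Neighborhood "A"
# is more heavily policed, so searches are overrepresented there.
stops = [("A", 1), ("A", 1), ("A", 1), ("A", 0),
         ("B", 1), ("B", 0), ("B", 0), ("B", 0)]

# A naive "model": score risk as the historical search rate per neighborhood.
totals, searches = defaultdict(int), defaultdict(int)
for hood, searched in stops:
    totals[hood] += 1
    searches[hood] += searched

for hood in sorted(totals):
    print(hood, searches[hood] / totals[hood])  # A: 0.75, B: 0.25

# If neighborhood correlates with race, the disparity in the training data
# carries straight through to the scores, even though race was never an input.
```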

"Standards need to be set for these tools," Eckersley said. "And if you were to ever try to use them to decide to detain someone, the tools would need to meet those standards. And, unfortunately, none of the tools presently do."

Currently, 49 of California's 58 counties use some kind of algorithmic risk assessment tool for bail, sentencing and/or probation. And a new bill in the state Legislature would require every county to use such a tool for bail.

Senate Bill 10 offers no guidance for how counties should calculate risk levels or protect against unintentional discriminatory outcomes. The Judicial Council, the policymaking body of the California courts, would approve tools if and when the law goes into effect, but would stop short of assessing results.

There are a number of risk assessment tools in use, most of which rely not on artificial intelligence but on paper, pencil and a professional's judgment. Mary Butler, Napa County's chief probation officer, said her agency uses three risk assessment tools with the intent of providing recommendations to the presiding judge. But she said they also help determine how best to help the individual in question succeed at establishing a life outside of prison.

"The needs part is as important as the risk part, because it helps to change their behavior," she said. "I can’t change, for example, the age when someone was first arrested, but if that’s the only thing I consider, then yeah, I could be biased a result. But when I tie that into everything else going on in that person’s life, and their areas of need, it's a really good tool."

She added that assessments made by her agency are shared with everybody involved. "We give the offender the results," she said. "We give that information to the court. The attorneys have it."

A small number of California law enforcement agencies are also using artificial intelligence. Two of the more popular products are Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) and the Public Safety Assessment (PSA). COMPAS has been used by the California Department of Corrections and Rehabilitation. PSA is in use in San Francisco, Santa Cruz and Tulare Counties.

But some of these programs, like COMPAS, are proprietary products, which means their owners don't share the source code in order to protect their intellectual property. In doing so, they prevent defendants from challenging the integrity of the models.

That also blocks independent researchers from studying the programs and identifying flaws, and prevents lawmakers from knowing what kind of improvements they should push for.

Like the Partnership on A.I., the Electronic Frontier Foundation has publicly expressed doubts about the rollout of the new bail law, SB 10, for this very reason. In an opinion piece on its website, the EFF argued, "The public must have access to the source code and the materials used to develop these tools, and the results of regular independent audits of the system, to ensure tools are not unfairly detaining innocent people or disproportionately affecting specific classes of people."

A different bill that cleared the state Senate this week, however, could set a lot of minds at ease. SB 36 would establish guidelines regarding the use of risk assessment tools, including data collection, transparency requirements and regular review and validation.
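
What might "regular review and validation" look like in practice? One common audit is to compare a tool's error rates across demographic groups. Here is a minimal, hypothetical sketch with fabricated records, not a procedure the bill itself prescribes:

```python
def false_positive_rate(records):
    """Share of people who did not reoffend but were still flagged high risk."""
    negatives = [r for r in records if not r["reoffended"]]
    flagged = [r for r in negatives if r["flagged_high_risk"]]
    return len(flagged) / len(negatives) if negatives else 0.0

# Fabricated audit records for two demographic groups.
audit = [
    {"group": "X", "flagged_high_risk": True,  "reoffended": False},
    {"group": "X", "flagged_high_risk": False, "reoffended": False},
    {"group": "Y", "flagged_high_risk": False, "reoffended": False},
    {"group": "Y", "flagged_high_risk": False, "reoffended": False},
]

for g in ("X", "Y"):
    group = [r for r in audit if r["group"] == g]
    print(g, false_positive_rate(group))  # X: 0.5, Y: 0.0

# A persistent gap like this between groups is the kind of disproportionate
# impact that a regular review-and-validation requirement is meant to catch.
```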

The bill, as it's written now, would also bar the use of any risk assessment tool that blocks access to its source code. It heads to the state Assembly next.
