
Big Data Alone Can’t Fix a Broken Bail System

Philadelphia should think twice about its risk-assessment algorithm.

In the U.S., getting arrested can be extremely punishing, even for the innocent. People who can’t afford bail often languish in pretrial detention for months, missing school, work and other responsibilities crucial to their livelihoods. Some plead guilty just to get out. The burden falls disproportionately on people from heavily policed minority neighborhoods.

Authorities in Philadelphia think an algorithm might help where the human-run system has failed. Actually, it could make things even worse.

Philly has long been one of the most incarcerated cities in America. Families commonly do time together. At any given moment, about 3 percent of residents are on probation. People in pretrial detention — that is, people who should be presumed innocent — account for 30 percent of the overall prison population.

Now, Philly is joining the nationwide movement for bail reform. As part of a plan backed by a $3.5 million MacArthur Foundation grant, the city intends to employ a computer algorithm to help judges decide who can be trusted to return for a trial date. Using data such as age, the nature of the offense, and the number of previous arrests, the algorithm will spit out a risk-assessment score.
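For readers curious what such a tool can look like under the hood, here is a minimal, hypothetical sketch in Python of a points-based score built on the inputs named above (age, nature of the offense, prior arrests). The weights, thresholds and names are invented for illustration; this is not Philadelphia's actual model.

from dataclasses import dataclass

@dataclass
class Defendant:
    age: int                 # age at arrest
    violent_offense: bool    # nature of the current charge
    prior_arrests: int       # number of previous arrests on record

def risk_score(d: Defendant) -> int:
    # Toy points-based score; every weight here is an assumption for illustration.
    score = 0
    if d.age < 23:
        score += 2                    # younger defendants are often scored as riskier
    if d.violent_offense:
        score += 3                    # current charge
    score += min(d.prior_arrests, 5)  # one point per prior arrest, capped at 5
    return score                      # higher score suggests detention or stricter conditions

# A 21-year-old with two prior arrests and a nonviolent charge scores 4.
print(risk_score(Defendant(age=21, violent_offense=False, prior_arrests=2)))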

Objective as this might sound, activists are concerned. Hannah Sassaman, policy director of the Media Mobilizing Project, a Philadelphia-based anti-poverty group, sees a number of problems with such algorithms: They don’t necessarily get more people out of jail, they can’t compensate for judges’ biases, and they can actually reinforce biases by using inputs that serve as proxies for race.

The evidence backs up such concerns. In Lucas County, Ohio, the introduction of one commonly used algorithm — the Laura and John Arnold Foundation pretrial risk tool — had mixed results. In Kentucky, pretrial risk assessments had only a small and temporary effect on judges’ decisions, and did not change the racial disparity in detentions. In Chicago, despite an encouraging decline in the use of money bonds, the algorithms didn’t reliably change judges’ behavior.

That’s not to say that judges should strictly follow an algorithm’s recommendations. If a computer uses inputs that are correlated with race and class, it can be as unfair to certain groups as any bigoted human. Consider zip codes: If an algorithm puts weight on them, it will automatically be biased against people from neighborhoods where a lot of people get arrested and miss trial dates — which, in our segregated country, tends to mean black neighborhoods. The same goes for including prior arrests: In a country where whites and blacks smoke pot at similar rates but blacks are four times more likely to be arrested for the offense, this will unfairly lead to worse scores for blacks.
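To make the proxy problem concrete, here is a small simulation (my own illustration, not drawn from any real tool or dataset): two groups engage in a hypothetical offense at exactly the same rate, but group B is arrested for it about four times as often, echoing the marijuana disparity above. A score that simply counts prior arrests, like the toy one sketched earlier, then rates group B as riskier even though the underlying behavior is identical.

import random

random.seed(0)

def simulated_prior_arrests(group: str) -> int:
    # Same offense behavior for both groups: 10 opportunities, 10% offense rate.
    offenses = sum(random.random() < 0.10 for _ in range(10))
    # Only enforcement differs: group B is arrested roughly four times as often.
    arrest_rate = 0.8 if group == "B" else 0.2
    return sum(random.random() < arrest_rate for _ in range(offenses))

def average_score(group: str, n: int = 10_000) -> float:
    # Prior-arrest component of the toy score: one point per arrest, capped at 5.
    return sum(min(simulated_prior_arrests(group), 5) for _ in range(n)) / n

print("Group A average score:", average_score("A"))  # roughly 0.2
print("Group B average score:", average_score("B"))  # roughly 0.8, four times higher for identical behavior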


Article Credit: Bloomberg
