Interesting article in the Wall Street Journal about parole boards using software to predict repeat offenders before letting someone go free.
What used to be a decision based on good behavior during time served, showing remorse to the parole board, and intuition is being augmented with "automated assessments" that include inmate interviews, age of first arrest, type of crime, and so forth.
At least 15 states have adopted "modern risk assessment methods" to determine the potential for recidivism.
Individuals are marked as higher risk if they:
- Are young (age 18-23) and impulsive
- Committed a drug-related offense
- Were suspended or expelled from school
- Quit a job before having another one lined up
- Are single or separated
- Have been diagnosed with a mental disorder
- Believe it is not possible to overcome their past.
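To make the idea concrete, a checklist like the one above can be thought of as a simple additive scoring rule. The sketch below is purely illustrative: the factor names come from the article's list, but the equal weights and the "high risk" threshold are my own assumptions, not how COMPAS or any real assessment tool actually scores inmates.

```python
# Hypothetical sketch of an additive risk checklist.
# Factor names follow the article's list; the equal weighting and the
# threshold of 4 are invented for illustration only.

RISK_FACTORS = [
    "young_18_to_23",
    "drug_related_offense",
    "suspended_or_expelled",
    "quit_job_without_another",
    "single_or_separated",
    "mental_disorder_diagnosis",
    "believes_past_is_insurmountable",
]

def risk_score(profile: dict) -> int:
    """Count how many listed risk factors apply (hypothetical equal weights)."""
    return sum(1 for factor in RISK_FACTORS if profile.get(factor, False))

def risk_level(profile: dict, high_threshold: int = 4) -> str:
    """Map the raw count to a coarse label; the threshold is arbitrary."""
    return "higher risk" if risk_score(profile) >= high_threshold else "lower risk"

example = {
    "young_18_to_23": True,
    "drug_related_offense": True,
    "suspended_or_expelled": True,
    "quit_job_without_another": True,
}
print(risk_score(example))   # 4
print(risk_level(example))   # higher risk
```

Even this toy version shows why such tools are fragile: one data-entry mistake or deceptive answer flips a factor, and a profile can cross the threshold in either direction.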
Surprisingly, violent criminals (rapists and murderers) are actually considered lower risk than those guilty of nonviolent property crimes, the thinking being that someone convicted of robbery is more likely to repeat the criminal behavior because the crime is one that "reflects planning and intent."
Honestly, I think it is beyond ridiculous to rank violent criminals as less risky than thieves and release them because their crime is considered an "emotional outburst."
Would you rather have some thieves back on the street, or murderers and rapists? Rhetorical question!
But it just shows that even the best systems, ones that are supposed to help us make better decisions, can instead be misused or abused.
This happens either when there is bad data (from data-entry mistakes, deceptive responses, or missing relevant information) or when poorly designed decision rules and algorithms are applied.
The COMPAS system is one of the main correctional software suites in use, and its maker, Northpointe (a unit of Volaris), itself advises that officials should "override the system's decisions at rates of 8% to 15%."
While even a 1-in-7 error rate may be an improvement over intuition alone, we still need to do better, especially when that one person goes on to commit a hideous violent crime that hurts someone else in society and could have been prevented.
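To put those numbers in the same units, a quick back-of-the-envelope check shows that "1 in 7" sits right near the top of Northpointe's advised 8% to 15% override range (the 1,000-decision caseload below is invented purely for illustration):

```python
# Back-of-the-envelope check: express a 1-in-7 rate as a percentage and
# see what it implies over a hypothetical caseload of parole decisions.
# The caseload size is made up for illustration.

advised_override_range = (0.08, 0.15)   # Northpointe's advised 8% to 15%
one_in_seven = 1 / 7

print(f"1 in 7 = {one_in_seven:.1%}")   # 14.3%, near the top of the range
assert advised_override_range[0] <= one_in_seven <= advised_override_range[1]

caseload = 1_000                         # hypothetical number of decisions
print(round(caseload * one_in_seven))    # ~143 decisions worth second-guessing
```

In other words, at the high end of the vendor's own guidance, officials are expected to overrule the software in roughly one of every seven cases.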
It's certainly not easy to expect a parole board to decide in 20 minutes whether to set someone free, but think about the impact on someone hurt or killed, or on their family, if the wrong decision is made.
This is a critical governance process that needs:
- Sufficient time to make important decisions
- More investment in tools to aid the decision process
- Refinement of the rules that support release or imprisonment
- Collection of a broad base of interviews, history, and relevant data points tied to repeat behavior
- Validation of information to limit deception or error
Aside from predicting whether someone is likely to be a repeat offender, parole boards also need to consider whether the person has been both punished in accordance with the severity of the crime and rehabilitated to lead a productive life going forward.
We need to decide people's fates fairly for them, justly for the victims, and safely for society--systems can help, but it's not enough to just "have faith in the computer." ;-)
(Source Photo: Andy Blumenthal)