STUDY: Criminal Profiling Software Used in Sentencing is Biased Against Black People

By Sameer Rao May 23, 2016

A new report from ProPublica took on a task that the U.S. Sentencing Commission has yet to undertake: a study of the computer-generated risk assessment scores used in courtrooms across the country to predict whether defendants will commit further crimes. The nonprofit newsroom found that the assessment software often introduces bias against Black defendants.

As the report notes, former U.S. Attorney General Eric Holder warned against this type of software in 2014. "Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice," he said. Despite his warnings, more and more courtrooms are using this software, and a sentencing reform bill currently before Congress would mandate its use in federal prisons.

ProPublica examined more than 7,000 risk scores assigned to people arrested in Florida's Broward County in 2013 and 2014, then checked whether those people committed new crimes in the subsequent two years, the same benchmark used by the algorithm behind Northpointe's widely used assessment software. The findings indicate that the algorithm made inaccurate predictions for both White and Black defendants, but Black defendants were wrongly flagged as future criminals at almost twice the rate of White ones, even after controlling for factors such as criminal history, age and gender.
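To illustrate the kind of check described above, here is a minimal sketch (not ProPublica's published analysis) that compares each defendant's risk flag against whether they actually reoffended within two years, then computes the rate of wrongly flagged defendants per group. The field names and sample records are hypothetical.

```python
# Sketch: compute the "wrongly flagged" rate (flagged high risk but did
# not reoffend) for each group. Records below are made up for illustration.
from collections import defaultdict

# Each record: (race, flagged_high_risk, reoffended_within_two_years)
records = [
    ("Black", True, False),
    ("Black", True, True),
    ("Black", False, False),
    ("White", True, False),
    ("White", False, False),
    ("White", False, True),
]

flagged_no_reoffense = defaultdict(int)  # wrongly flagged as future criminals
no_reoffense = defaultdict(int)          # everyone who did not reoffend

for race, flagged, reoffended in records:
    if not reoffended:
        no_reoffense[race] += 1
        if flagged:
            flagged_no_reoffense[race] += 1

for race in sorted(no_reoffense):
    rate = flagged_no_reoffense[race] / no_reoffense[race]
    print(f"{race}: wrongly flagged rate = {rate:.0%}")
```

A disparity in these rates across groups, even when overall accuracy looks similar, is the pattern the ProPublica analysis reported.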

Northpointe disputed the finding, telling ProPublica that it "does not agree that the results of your analysis, or the claims being made based upon that analysis, are correct or that they accurately reflect the outcomes from the application of the model."

Check out the full piece here.