
Predictive Analytics – Smart Planning or Racial Profiling?


The use of predictive analytics is widespread in today’s world, and it is exceedingly common in business. From projecting sales increases to predicting the spending habits of buyers, it has made doing business that much easier for companies all over the world. But when it comes to the use of predictive analytics in the criminal justice system, there are some who believe it has produced less than stellar results.

 


For those of you who haven’t encountered this term before, predictive analytics refers to an advanced form of analytics used to make predictions about unknown future events. It draws on a wide variety of techniques from data mining, statistics, modeling, machine learning, and artificial intelligence to analyze current data and then make predictions about the future.
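
To make that a little more concrete, here is a minimal sketch of the basic workflow, written in Python with the scikit-learn library: a model is trained on historical records and then used to score a new, unseen case. The factors, numbers, and names below are invented purely for illustration and are not taken from any real system.

    # Minimal sketch of the predictive-analytics workflow: learn from
    # historical records, then predict an outcome for a new case.
    # All factors and values here are invented for illustration only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Historical records (made up): [age, prior arrests, employed (1/0)]
    X_train = np.array([
        [22, 3, 0],
        [45, 0, 1],
        [31, 1, 1],
        [19, 4, 0],
        [52, 0, 1],
        [27, 2, 0],
    ])
    # Whether each person in the historical records reoffended (1) or not (0)
    y_train = np.array([1, 0, 0, 1, 0, 1])

    # "Machine learning" step: learn a mapping from factors to outcome
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # "Prediction" step: score a new individual the model has never seen
    new_case = np.array([[30, 2, 1]])
    probability = model.predict_proba(new_case)[0, 1]
    print(f"Estimated probability of reoffending: {probability:.2f}")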

 

So why would predictive analytics in criminal justice be in any way controversial, you may wonder? Well, the argument is that, in the same way you wouldn’t use a hammer to make a cup of coffee or a ruler to mow your lawn, predictive analytics shouldn’t be used to inform sentencing decisions or to determine an individual’s risk of reoffending.

 

This opinion was held by then-Attorney General Eric Holder. In 2014 he addressed the issue with the U.S. Sentencing Commission, expressing open concern about the use of predictive analytics in our criminal justice system. Holder’s concern was that predictive analytics would actually undermine the criminal justice system’s attempts to provide equal justice to all people. He felt that, by profiling people without regard for their circumstances and without recognition of their individuality, predictive analytics could in fact exacerbate unjust disparities that are already far too common in both society and the criminal justice system.

 

Predictive analytics was introduced as a beneficial intervention that would allow agencies to reduce incarceration and direct rehabilitation and support services to defendants in need. But there are those who believe its practical applications carry some very disturbing realities, chief among them the belief that, when applied to criminal justice, predictive analytics is nothing more than computerized racial profiling.

 

When a person is processed through the criminal justice system – arrested, jailed, arraigned – predictive analytics looks at a number of factors in order to determine whether that person is at ‘high risk’ or ‘low risk’ of reoffending. The risk assessment generated every time someone is processed takes into account the color of their skin, and this factor appears to affect the result. The allegation is that the software carries a racial bias in the way it assesses individuals, because a black person convicted of a crime can be ranked as a higher risk of reoffending than a white person convicted of exactly the same crime.

 

Professor Sonja Starr of the University of Michigan Law School believes exactly that. She has written extensively about what she views as a trend in which predictive analytics is used to flag people as “high risk” or “low risk.” The factors that influence this outcome, Starr says, are usually closely correlated with poverty, which means that according to the risk assessment software, poor people are automatically considered to be at higher risk of future criminal behavior. Even more disturbing, Starr points out, is the way in which risk assessment is used to inform sentencing decisions.
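
To see how this can play out, here is a small, purely hypothetical sketch in Python. The factors, weights, and cutoff are not drawn from any actual risk-assessment tool; the point is only that when a score leans on circumstances that track poverty, two people charged with exactly the same offense can land on opposite sides of the ‘high risk’ line.

    # Toy risk score built from made-up weights. The three poverty-linked
    # factors dominate the score, so otherwise identical defendants end up
    # labeled differently. Nothing here reflects a real assessment tool.
    def risk_label(prior_arrests, unemployed, unstable_housing, family_record):
        """Return a hypothetical risk score and a 'high risk'/'low risk' label."""
        score = (0.15 * prior_arrests       # criminal history
                 + 0.25 * unemployed        # poverty-linked factor
                 + 0.20 * unstable_housing  # poverty-linked factor
                 + 0.15 * family_record)    # poverty-linked factor
        label = "high risk" if score >= 0.5 else "low risk"
        return round(score, 2), label

    # Two hypothetical defendants charged with exactly the same offense,
    # with the same prior record, differing only in poverty-linked factors.
    print(risk_label(prior_arrests=1, unemployed=0, unstable_housing=0, family_record=0))
    # -> (0.15, 'low risk')
    print(risk_label(prior_arrests=1, unemployed=1, unstable_housing=1, family_record=1))
    # -> (0.75, 'high risk')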

 

But not everyone is against predictive analytics as it applies to the criminal justice system. Join us next time as we look at a police department that has seen reductions in crime using big data as a tool, and how using integrated software can save both time and money for everyone involved in criminal justice.
