...

UK Police Going Full Minority Report, Building ‘Murder Prediction’ Tool

In the latest “We have created the Torment Nexus from classic sci-fi novel Don’t Create The Torment Nexus” news, The Guardian reports that the United Kingdom government is developing a prediction algorithm that will aim to identify people who are most likely to commit murder. What could go wrong?

The report, which cites documents obtained via Freedom of Information requests by the transparency organization Statewatch, found that the Ministry of Justice was tasked with designing a profiling system that can flag people who seem capable of committing serious violent crimes before they actually do so. The so-called Homicide Prediction Project (renamed the “Sharing Data to Improve Risk Assessment” project so as not to come off as quite so explicitly dystopian) sucked up the data of between 100,000 and 500,000 people in an effort to develop models that could identify “predictors in the data for homicide risk.”

The project includes data from the Ministry of Justice (MoJ), the Home Office, Greater Manchester Police (GMP), and the Metropolitan Police in London. The records reportedly are not limited to people with criminal convictions but also cover suspects who were never convicted, as well as victims, witnesses, and missing people. They also included details about a person’s mental health, addiction, self-harm, suicide, vulnerability, and disability: “health markers” that the MoJ claimed were “expected to have significant predictive power.” The Guardian reported that government officials denied using data on victims or vulnerable populations and insisted that only data from people with at least one criminal conviction was used.

It doesn’t take much to see how bad an idea this is, or what the likely end result would be: the disproportionate targeting of low-income and marginalized people. But in case that isn’t obvious, you only have to look at the predictive justice tools the UK’s Ministry of Justice has already rolled out and the results they produced.

For instance, the government’s Offender Assessment System is used to “predict” whether a person is likely to reoffend, and judges factor that prediction into sentencing decisions. A government review of the system found that, across all offenders, actual reoffending was significantly below the predicted rate, especially for non-violent offenses. And, as you might imagine, the algorithm assessed Black offenders less accurately than white offenders.
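That kind of gap is easy to check for, at least in principle. Here’s a minimal sketch of a per-group calibration audit in Python, with entirely invented numbers rather than anything from OASys: compare each group’s average predicted risk against its observed reoffending rate, and a persistent gap for one group is exactly the sort of inaccuracy the review flagged.

```python
# Hypothetical sketch of a per-group calibration audit.
# All data below is invented for illustration; it is not OASys output.
from dataclasses import dataclass

@dataclass
class Case:
    group: str             # demographic group label
    predicted_risk: float  # model's predicted probability of reoffending
    reoffended: bool       # observed outcome

def calibration_gap(cases: list[Case], group: str) -> float:
    """Mean predicted risk minus observed reoffending rate for a group.

    A positive gap means the model over-predicts risk for that group."""
    subset = [c for c in cases if c.group == group]
    mean_pred = sum(c.predicted_risk for c in subset) / len(subset)
    actual_rate = sum(c.reoffended for c in subset) / len(subset)
    return mean_pred - actual_rate

# Toy data: both groups actually reoffend at a 25% rate, but the model
# assigns group "B" double the predicted risk of group "A".
cases = (
    [Case("A", 0.25, True)] * 1 + [Case("A", 0.25, False)] * 3 +
    [Case("B", 0.50, True)] * 2 + [Case("B", 0.50, False)] * 6
)
for g in ("A", "B"):
    print(g, calibration_gap(cases, g))  # A: 0.0 (calibrated), B: 0.25 (over-predicts)
```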

That’s not just a British problem, of course. Predictive policing tools regularly misassess people wherever they are deployed, skewing risk scores against marginalized communities. The root cause is racial bias in the data itself: historical over-policing of communities of color and low-income communities leads to more police interactions, higher arrest rates, and stricter sentencing. Those outcomes get baked into the training data, the algorithms amplify them, and the resulting predictions reinforce the very patterns of policing that produced the uneven outcomes in the first place.
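To make that loop concrete, here’s a toy simulation in Python (all numbers invented, not modeled on any real deployment): two areas have identical true crime rates, but one starts out with triple the patrols, so it generates three times the recorded incidents, so the “predictive” model keeps sending it three times the patrols, indefinitely.

```python
# Toy feedback-loop simulation with invented numbers. Two areas have the
# same true crime rate, but area 1 starts out over-policed.
true_crime_rate = [0.1, 0.1]  # identical underlying rates
patrols = [10, 30]            # area 1 is historically over-policed

for generation in range(5):
    # Recorded incidents scale with patrol presence, not just with crime,
    # so the over-policed area produces more data points.
    recorded = [true_crime_rate[i] * patrols[i] for i in range(2)]
    # A naive "predictive" model allocates the next round of 40 patrols
    # in proportion to recorded incidents, locking in the initial skew.
    total = sum(recorded)
    patrols = [round(40 * r / total) for r in recorded]
    print(f"gen {generation}: recorded={recorded}, next patrols={patrols}")
# The 1:3 patrol split persists every generation, even though the true
# crime rates in the two areas are identical.
```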

Anyway, just as a reminder: we were not supposed to embrace the predictive nature of the Precogs in Minority Report. We were supposed to be skeptical of them.
