Originally dubbed the “homicide prediction project,” the plan is now operating under the blandly bureaucratic label “sharing data to improve risk assessment,” though its ambitions are far from benign. The Ministry of Justice wants to use data from police, probation services, and health records to build an algorithmic crystal ball, rather like the movie Minority Report, only without the psychics or Tom Cruise.
The surveillance-busting crew at Statewatch unearthed the project through Freedom of Information requests. The group alleges that data from people never convicted of a crime, including victims of abuse and those who have self-harmed, has been funnelled into the system. MoJ officials, for their part, insist the tool uses only records from people with at least one conviction.
But documents show the net is cast wider. Greater Manchester Police’s data-sharing agreement covers when a person first had contact with police and even their age when first recorded as a victim of domestic violence. It also includes “special category” data (markers for mental health, addiction, disability, suicide, and self-harm) that officials expect to carry “significant predictive power.”
Ministry of Justice officials claim it’s just a research phase, aiming to “review offender characteristics that increase the risk of committing homicide” and test “alternative and innovative data science techniques.” They say it’ll help refine risk assessments already used in probation and custody decisions.
Statewatch researcher Sofia Lyall slammed the scheme: “The Ministry of Justice’s attempt to build this murder prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems.”
“Like other systems of its kind, it will code in bias towards racialised and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong,” she said.
Despite the soothing assurances, critics aren’t buying it—especially given the UK’s rap sheet for institutional bias. If this is public protection, it’s a version with predictive profiling, mass surveillance, and systemic discrimination baked into the code.