Artificially intelligent judge, jury, and executioner – will algorithms take over criminal justice?

  • Are AI and smart algorithms the answer to racial bias in sentencing?
  • Just how ‘smart’ are these algorithms, anyway?
  • Justice and the ‘black box’: is this a new era of criminal justice secrecy?
  • Loomis v. Wisconsin: the ‘black box’ is good enough for the Supreme Court
  • Algorithms and criminal justice – not such a great idea after all

Black men make up just 3 per cent of the UK's population, yet no less than 13 per cent of its prison population. The numbers are just as bad in the US, where, though only 13 per cent of the population, they account for 40 per cent of the people in prison. The arithmetic is stark: in both countries, black men are incarcerated at roughly three to four times the rate their share of the population would predict. “Of those imprisoned in the United States”, Vinay Iyengar notes in the Harvard Political Review, “58 percent are black or Hispanic, despite these groups comprising only a quarter of the country’s total population.” These are startling numbers, and it’s hard not to see systematic bias in the criminal justice system.

Are AI and smart algorithms the answer to racial bias in sentencing?

Such a clear problem has attracted a high-tech solution, and smart algorithms driven by cutting-edge artificial intelligence (AI) have been developed by a number of private firms. Some are already being used to assess risk, decide where to send police on patrol, and advise on sentencing. Sharad Goel, a professor at Stanford’s Department of Management Science and Engineering, explains: “Data and algorithms are rapidly transforming law enforcement and criminal justice, including how police officers are deployed, how discrimination is detected and how sentencing, probation, and parole terms are set”. But he cautions that though the intentions behind this tech are good, “its use also raises complex legal, social, and ethical questions”.

Just how ‘smart’ are these algorithms, anyway?

Smart algorithms have been around for a while now, and whether you realise it or not, they’re keeping a close eye on you. They already help assess your creditworthiness when you apply for a loan and calculate the fastest route to your destination on Google Maps. And it’s no secret that smart algorithms have proven their capacity to sift through masses of information in search of answers. But as Jason Tashea writes for Wired, this apparently flawless logic is anything but. Algorithms don’t think – they act quickly, efficiently, and accurately within the bounds of their code. And that code is no less prone to bias and error than the people who write it.
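
To see how a programmer’s assumptions become an algorithm’s bias, here’s a minimal, entirely hypothetical sketch in Python – not any real lender’s model. The postcode list and penalty are invented; the point is that nothing in the code mentions race, yet a human hunch about ‘risky’ addresses is applied automatically, at scale:

```python
# Hypothetical loan-scoring rule, for illustration only -- not any
# real lender's model. The postcode list is the programmer's own
# judgement, hard-coded; postcodes are a well-known proxy for race
# and income, so the author's bias becomes the algorithm's bias.

HIGH_RISK_POSTCODES = {"E9", "B6", "M8"}  # chosen by a human

def credit_score(income: float, postcode: str) -> float:
    score = min(income / 1000, 100)   # cap the income contribution
    if postcode in HIGH_RISK_POSTCODES:
        score -= 40                   # the hunch, now automated
    return max(score, 0)

# Two identical applicants, different addresses:
print(credit_score(50_000, "SW1"))  # 50.0
print(credit_score(50_000, "E9"))   # 10.0 -- same income, 'riskier' postcode
```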

A case in point: Tashea describes a stretch of motorway in rural Arkansas, well known to locals for its steep grades and hairpin curves. Truckers who know the area know to avoid it, but the algorithms written for motorway navigation don’t assess these risks. As a result, they plot what looks like the most efficient route and send large trucks down a road unfit for commercial traffic. The result has been a dramatic increase in accidents and plenty of drivers who no longer trust navigation apps. This is the core problem with algorithms – they only consider what they’re told to consider. They can’t think outside the box.
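
A short sketch makes the point. The roads, distances, and penalty below are all invented, but the logic mirrors the Arkansas problem: a textbook shortest-path search returns the dangerous mountain road, because nothing in its edge weights says that steep grades and hairpin curves matter. Give it a cost for truck-unsuitable roads and the ‘best’ route changes entirely:

```python
# Toy routing graph illustrating the Arkansas example. Roads and
# distances are invented. Each edge carries a distance (km) and a
# flag for steep grades or hairpin curves.
import heapq

EDGES = {
    ("A", "B"): (30, True),   # short mountain road: steep, hairpin curves
    ("A", "C"): (45, False),  # longer valley road, safe for trucks
    ("C", "B"): (20, False),
}

def shortest(start, goal, truck_penalty=0.0):
    """Dijkstra's algorithm; truck_penalty inflates the cost of
    roads flagged as unsuitable for commercial traffic."""
    graph = {}
    for (u, v), (dist, steep) in EDGES.items():
        cost = dist * (1 + truck_penalty * steep)
        graph.setdefault(u, []).append((v, cost))
        graph.setdefault(v, []).append((u, cost))
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, step in graph[node]:
            if nxt not in seen:
                heapq.heappush(queue, (cost + step, nxt, path + [nxt]))

print(shortest("A", "B"))                     # (30.0, ['A', 'B']): the dangerous road
print(shortest("A", "B", truck_penalty=2.0))  # (65.0, ['A', 'C', 'B']): the safe one
```

The algorithm isn’t malfunctioning in either case – it is faithfully optimising the only thing it was told to optimise.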

Justice and the ‘black box’: is this a new era of criminal justice secrecy?

And it’s precisely that box that has critics worried. Most of these algorithms are sold by private companies as proprietary software, closed to the people who use it – and to the people who are subject to it. Much like your favourite navigation app, you see the results, but not the calculations or code that produce them. “There is little oversight and transparency regarding how they work,” Tashea warns. “Nowhere is this lack of oversight more stark than in the criminal justice system. Without proper safeguards, these tools risk eroding the rule of law and diminishing individual rights.” We don’t know how these algorithms function or what they consider – only their makers do. Worse still, the legal system can’t review the decisions or advice they produce; it can only take them or leave them.

Loomis v. Wisconsin: the ‘black box’ is good enough for the Supreme Court

The approach has already survived at least one significant legal challenge. Eric Loomis, a defendant in a drive-by shooting case, fled the police in a stolen car. After his arrest, his risk to the community and his likelihood of reoffending were assessed by an algorithm – the proprietary COMPAS tool. At sentencing, that assessment helped inform a six-year prison term. His legal team appealed on the grounds that neither his attorneys – nor anyone else, for that matter – could examine or contest the risk assessment. The algorithm’s judgment on Loomis was sealed, private, and ‘black-boxed’, the very opposite of how criminal justice should function. Despite this, the US Supreme Court declined to hear his case, leaving the lower court’s ruling in place, and leaving Loomis behind bars.

These algorithms are intended to lighten the load on overworked courts and guard defendants against human bias, and the companies that make them arrive with the best of intentions. But to protect their business, they keep their code secret from competitors, and that secrecy means the justice system itself can’t be certain what an algorithm considers when reaching its decisions. That might be fine for Google Maps, but in the courtroom it’s not nearly good enough. And despite the promise of neutrality, attempts to assess just how blind AI really is have revealed startling bias. As early as 2014, these problems were already gaining attention. The US Attorney General at the time, Eric Holder, reasoned that, “Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualised and equal justice”. A number of studies have since demonstrated this bias: risk assessments are often perversely wrong, reinforcing stereotypes and undermining racial justice.
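
The best known of these demonstrations is ProPublica’s 2016 audit of the COMPAS tool, which found that black defendants who did not go on to reoffend had been flagged high-risk at nearly twice the rate of white defendants. The sketch below shows the kind of check such audits run; the counts are invented for illustration, shaped to echo ProPublica’s published rates rather than reproduce its data:

```python
# The kind of check a fairness audit runs: compare error rates by
# group. The counts below are invented for illustration, shaped to
# echo ProPublica's published rates rather than reproduce its data.
# A 'false positive' is a defendant flagged high-risk who did not
# go on to reoffend.

audits = {
    # group: (flagged high-risk but did NOT reoffend, total who did NOT reoffend)
    "group A": (805, 1795),
    "group B": (349, 1488),
}

for group, (false_pos, did_not_reoffend) in audits.items():
    rate = false_pos / did_not_reoffend
    print(f"{group}: false positive rate = {rate:.0%}")

# group A: false positive rate = 45%
# group B: false positive rate = 23%
# A tool can look 'accurate' on average while making its worst
# mistake almost twice as often against one group.
```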

Algorithms and criminal justice – not such a great idea after all

Algorithms do a great many things well, but criminal justice is not one of them. As Christopher Markou, a PhD candidate in law at Cambridge, worries, they represent the black-boxing of criminal justice and a return to star chambers and closed courts, where defendants cannot face their accusers or hear reasoned arguments about their guilt. Within these systems, guilt and innocence, risk and sentencing, are a closed circuit, hidden from the very institutions that enforce them. Markou sees this as a critical failing. “Legal systems depend on continuity of information, transparency and ability to review,” he writes. “What we do not want as a society is a justice system that encourages a race to the bottom for AI startups to deliver products as quickly, cheaply and exclusively as possible. While some AI observers have seen this coming for years, it’s now here – and it’s a terrible idea.”
