Actuarial Risk Assessments in Criminal Justice Do Not Work
4 min read | Dec 2020

The community services available to ex-convicts and sex offenders were great—until corporate entities cut costs with risk assessments.

Johnnie Gun / Gen X / Anarchist / Musician

Within criminology, my field of work, the advent of actuarial crime models—informed by the language of the insurance world—has been accompanied by a shift in emphasis. Solutions to crime are not sought out, and the social or economic canvas on which behavior occurs is deemed unimportant. Today, the system depends extensively on the harvesting and analysis of data to assess risk. It’s a system that eats its young and is riven with inconsistencies and dubious practices. Money is the imperative. The wellbeing of victims and perpetrators is secondary. This concerns me greatly.

What I want to address below are concerns based on my experience working within a community substance-misuse service. The benefits of the science and technologies of data management are undeniable—but who amongst us could claim to have no second thoughts about our ever-increasing dependence on computer models and processed risk assessments? Where, one might ask, is the space for intuition and a professional human eye? 

Risk Management in Criminal Justice Is About People

In a previous article, I discussed my experience working with survivors of the Grenfell Tower fire. Prior to that event, I spent 18 months working with criminal offenders sentenced to drug and alcohol rehab programs in the same postcode area of London. A portion of them were sex offenders. I can honestly report that when I started working within this particular community service, it was exemplary.

My team had access to psychiatrists and addiction doctors, along with employment, education and training support. The service program provided a drop-in for clients in recovery, a food bank, and legal aid and advice. We had time to get to know our service users and criminal justice clients. I remember well supervising two psychology major interns from the U.S. who were bowled over by the amenities we had managed to build.

Unfortunately, after a contract tendering process, the service was taken over by two new corporate entities. Their agenda was to streamline the service and cut budgets—the centerpiece of their operations model was a new database system and an actuarial approach to risk assessment based on, essentially, a tick-box exercise. This shift in emphasis, in my honest opinion, tore the heart and soul out of a program that had been delivering an outstanding service for clients and, importantly, boasted an inspiring work environment. We worked effectively managing criminal justice referrals, and had the authority and professional access to advise an individual’s probation officer on a course of action, including breaching (returning an offender into custody), if a client’s behavior had given cause for concern.   

It had been hands-on work, and the professional integrity of decisions was checked and balanced through conversations with psychiatrists, doctors and educational practitioners as part of a team-wide decision-making process. This was purposefully and completely undermined by our new “employers.” They wanted us to crunch more numbers, highlight sketchy, dishonest outcomes and, dangerously, shy away from any negative client decisions—particularly a breach, which could see a client back in prison. Breaches looked bad for our new bosses; the consequences for the community were disregarded. It was painful to be a part of this sham, and many of us looked elsewhere for work. It was hard not to see it as an exercise in squeezing profits out of community services—and if not profits, generous salaries and conditions for senior management.


Sex Offender Risk Assessment Tools Include Humans

My personal role shifted from counseling and supervising clients to completing data charts and treatment outcome profile surveys (TOPS forms). We punched numbers and, in response, the computer pumped out a series of results, which included a risk assessment. For clients sentenced by the courts to attend our service, increased risks would result in enhanced restrictions, or a breach and imprisonment.
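To make the critique concrete, here is a minimal sketch of how a tick-box actuarial assessment of this kind typically works. The item names, weights and threshold below are invented for illustration only; they do not reflect the actual instrument or database the service used.

```python
# Hypothetical "tick-box" actuarial risk score (illustrative only).
# Each ticked box contributes a fixed weight; the sum is compared
# against a cut-off that triggers an automated recommendation.

RISK_ITEMS = {
    "missed_appointments": 2,
    "positive_drug_test": 3,
    "unstable_housing": 1,
    "prior_violent_offence": 3,
}

BREACH_THRESHOLD = 7  # arbitrary cut-off, chosen for illustration


def risk_score(ticked_boxes):
    """Sum the weights of every box the assessor ticks."""
    return sum(RISK_ITEMS[item] for item in ticked_boxes)


def recommend_breach(ticked_boxes):
    """Threshold logic only: there is no channel here for a
    practitioner's professional judgement to override the number."""
    return risk_score(ticked_boxes) >= BREACH_THRESHOLD


# A client can present serious warning signs yet still fall
# below the cut-off, so no breach is recommended:
print(recommend_breach(["prior_violent_offence", "positive_drug_test"]))
```

The point of the sketch is structural: whatever a caseworker observes that is not on the list simply cannot enter the calculation, and a dangerous client who "hasn't ticked enough boxes" sails under the threshold.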

The time staff members once spent working with the court-referred clients was instead taken up with corporate induction days, endless training and, of course, data entry, analysis and management. The end result was that dangerous, predatory sex offenders were allowed way more leeway than was safe. Soon, the inevitable happened.  

One man with a history of domestic violence and sexual assault (involving a neighbor's 14-year-old daughter) hadn't ticked enough boxes to be "breached" by this automated system. My concerns, and those of my colleagues, were ignored. All that mattered was what the database assessed. The restructuring was, in essence, an institutional move to replace well-paid, highly trained staff with interns, volunteers and support workers.

And what of the consequences? 

The offender I referred to attacked yet another woman he became involved with, hospitalizing her and leaving her with facial scars for life. It’s the harsh reality of a system, once functioning so well, being sacrificed to the measurement of abstractions. We have an instinctual, ambiguous concept of “risk,” but no broad, fundamental agreement about what it actually is. One person’s risk is another person’s fun.


Criminal Justice Risk Assessments Are Only One Example

This is just a microcosm of the societal risk we take when statisticians—and the technologies they deploy—are left to calculate risk using actuarial models. It also speaks to the U.K. government's obsession with statistical models and algorithms. Neil Ferguson's analysis of COVID-19, undertaken at Imperial, was as far off the mark as his disastrous modeling of the foot-and-mouth crisis nearly two decades ago. His "predictions" led to the unnecessary slaughter of millions of healthy animals and the bankruptcy of thousands of farmers.

I’ve taken this tangent to underscore my points. Actuarial models have permeated all levels of health, social and criminal justice systems in the U.K. Their promoters cite savings and efficiencies from predicting risk. Sadly, experience has taught me that wherever actuaries and analysts are used, there is little time for justice, and even less for human beings. It’s not what you do—it’s how many boxes you tick. Technologies have their place, but they should never have the last word.
