25 Jun 2024
by Andy Davies

Investigating AI: industry support for AI adoption in criminal justice

Artificial Intelligence (AI) has become a vital tool for informing decision making and strategy across a wide range of business areas. Organisations that successfully leverage AI gain a distinct advantage, with AI and machine learning (AI/ML) proving invaluable for finding actionable insights in data. For the MoJ, successful adoption of AI could improve prison safety, justice outcomes, citizen services and staff wellbeing – and save money.

So how can the MoJ learn from broader industry experience? Consider how industry trains its algorithms to reduce bias – that knowledge could be used to deliver fairer sentencing, for example. Techniques similar to those used to predict demand for goods on supermarket shelves could help the Prison Service predict demand for cell capacity. The calculation of risk in the insurance industry could be adapted to predict the risk of violence in inmate populations. Analysis of the IoT data that supports smart cities could be applied to prisons to improve prison safety. Healthcare providers use AI to predict peak demand – the same approach could inform resource planning and allocation across the justice sector.

Advanced analytical techniques are already being used within the justice sector in other parts of the world to support decision making and drive improvements to justice outcomes:

  • One US Department of Corrections has reduced recidivism rates for the first time in seven years using insight derived from AI-driven dashboards.
  • Another has used AI and predictive analytics with real-time data to reduce incidents of violence against staff by 50%.
  • A US justice agency has produced efficiency savings estimated at $12 million annually by providing a holistic view of each offender through a single solution.

Important considerations for adopting AI

As with any organisation leveraging AI today, the MoJ’s approach will need to consider:

Transparency: Justice agencies must be open about their use of AI tools and data sources and be able to understand and explain what they do with the technology. This openness will build public trust and allow informed debates, for example on the scope of generative AI technologies.

Accountability: Developers and users of predictive tools need mechanisms in place to ensure AI predictions never become the sole basis for justice actions. Rigorous testing on past cases, including miscarriages of justice, will help to ensure AI performs reliably on edge cases.
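
By way of illustration only, the sketch below (Python, with entirely hypothetical case data and a placeholder predict_risk function) shows the kind of back-testing harness this implies: historical cases, including known miscarriages of justice, are replayed through the model and every disagreement is surfaced for human review rather than being silently accepted.

    # Hypothetical back-testing sketch: replay historical cases through a model
    # and report where its prediction disagrees with the recorded outcome.
    from dataclasses import dataclass

    @dataclass
    class HistoricalCase:
        case_id: str
        features: dict           # inputs the model would have seen at the time
        recorded_outcome: str    # what actually happened ("reoffended" / "did not reoffend")
        known_miscarriage: bool  # flag curated edge cases for extra scrutiny

    def predict_risk(features: dict) -> str:
        """Placeholder for the real model; returns a category, not a decision."""
        return "high" if features.get("prior_convictions", 0) > 3 else "low"

    def backtest(cases: list[HistoricalCase]) -> list[dict]:
        findings = []
        for case in cases:
            prediction = predict_risk(case.features)
            disagrees = (prediction == "high") != (case.recorded_outcome == "reoffended")
            if disagrees or case.known_miscarriage:
                findings.append({
                    "case_id": case.case_id,
                    "prediction": prediction,
                    "recorded_outcome": case.recorded_outcome,
                    "edge_case": case.known_miscarriage,
                })
        return findings  # reviewed by people, never acted on automatically

    sample = [HistoricalCase("CASE-001", {"prior_convictions": 5}, "did not reoffend", True)]
    print(backtest(sample))  # the disagreement (and edge-case flag) is reported for review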

Open source integration: A holistic AI platform combines open source technologies with commercial off-the-shelf (COTS) solutions – uniting the agility of open source with the governance, auditability and speed of deployment of COTS.

Auditability: A robust governance and audit process for records and versioning will identify which user and which version of a model made a decision at a particular point in time. Testing and deployment processes, including the associated sign-offs for models in production, need to be auditable.
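
A minimal sketch of what such an audit trail could capture is shown below; the field names and the log_decision helper are illustrative assumptions rather than a prescribed schema, but the principle is that every prediction is written to an append-only log alongside the model version, the requesting user and a timestamp.

    # Illustrative audit record: every prediction is logged with the model
    # version, the user who requested it, and a timestamp, so any decision
    # can later be traced back to the exact model release and person involved.
    import json
    from datetime import datetime, timezone

    def log_decision(log_path: str, *, model_name: str, model_version: str,
                     user_id: str, case_ref: str, prediction: str) -> None:
        record = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "model_name": model_name,
            "model_version": model_version,  # ties the output to a signed-off release
            "user_id": user_id,              # who requested / acted on the prediction
            "case_ref": case_ref,
            "prediction": prediction,
        }
        with open(log_path, "a", encoding="utf-8") as log:
            log.write(json.dumps(record) + "\n")  # append-only JSON Lines log

    log_decision("audit.jsonl", model_name="violence-risk", model_version="2.3.1",
                 user_id="officer-042", case_ref="CASE-001", prediction="low")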

Human in the loop: Officers and staff, with their wealth of experience, will always be irreplaceable in justice; the focus needs to be on finding innovative ways to make AI, and its results, accessible to them.
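
One simple pattern for keeping people in that loop is sketched below, with a hypothetical confidence threshold and review queue: every model output is treated as a recommendation that is routed to a member of staff, and low-confidence outputs are flagged for closer scrutiny.

    # Human-in-the-loop sketch: model outputs are recommendations only and are
    # always queued for staff review; low-confidence outputs are flagged.
    from queue import Queue

    REVIEW_THRESHOLD = 0.8  # illustrative confidence threshold

    review_queue: Queue = Queue()

    def route_for_review(case_ref: str, recommendation: str, confidence: float) -> None:
        review_queue.put({
            "case_ref": case_ref,
            "recommendation": recommendation,
            "confidence": confidence,
            "needs_closer_scrutiny": confidence < REVIEW_THRESHOLD,
        })

    route_for_review("CASE-001", "refer to violence-reduction programme", 0.62)
    item = review_queue.get()             # picked up by an officer, who makes the decision
    print(item["needs_closer_scrutiny"])  # True – flagged for extra attention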

Bias mitigation and agility: Regular audits and model updates are essential to identify and mitigate biases in data and algorithmic design, and to stay aligned with evolving societal norms and legal standards. Because AI model performance degrades over time, an AI platform should manage the complete model lifecycle – development, implementation and use, automation and simplification – ensuring end-to-end traceability and delivering changes promptly and accurately.
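
As an illustration, a regular bias audit might include a check like the one below – a deliberately simplified comparison of "high risk" prediction rates across groups, run on made-up data; a real audit would draw on a fuller set of fairness metrics and legal advice.

    # Simplified bias-audit sketch: compare the rate of "high risk" predictions
    # across groups; a large gap is a prompt for investigation, not proof of bias.
    from collections import defaultdict

    def high_risk_rates(predictions: list[dict]) -> dict:
        totals, highs = defaultdict(int), defaultdict(int)
        for p in predictions:
            totals[p["group"]] += 1
            if p["prediction"] == "high":
                highs[p["group"]] += 1
        return {group: highs[group] / totals[group] for group in totals}

    # Hypothetical audit sample
    sample = [
        {"group": "A", "prediction": "high"},
        {"group": "A", "prediction": "low"},
        {"group": "B", "prediction": "high"},
        {"group": "B", "prediction": "high"},
    ]
    rates = high_risk_rates(sample)
    disparity = max(rates.values()) - min(rates.values())
    print(rates, f"disparity={disparity:.2f}")  # flag for review if above an agreed tolerance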

Community engagement: Involving the community can lead to more equitable practices and foster a collaborative approach to justice outcomes.

Performance and cost effectiveness: Proactive steps to reduce AI’s carbon footprint are essential. The energy required to run AI tasks is already growing at an annual rate of between 26% and 36%, meaning that by 2028 AI could be using more power than the whole of Iceland used in 2021. See the Futurum report.
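
The compounding behind that projection is straightforward to reproduce. The sketch below uses a purely illustrative baseline figure (the actual estimates are in the Futurum report) to show how a 26–36% annual growth rate plays out over a few years.

    # Compound-growth sketch: how an illustrative AI energy baseline grows at
    # 26-36% per year. The baseline value here is hypothetical, not a citation.
    BASELINE_TWH = 10.0  # illustrative starting consumption, in TWh
    YEARS = 4            # e.g. 2024 -> 2028

    for rate in (0.26, 0.36):
        projected = BASELINE_TWH * (1 + rate) ** YEARS
        print(f"At {rate:.0%} annual growth, {BASELINE_TWH} TWh becomes "
              f"{projected:.1f} TWh after {YEARS} years "
              f"({projected / BASELINE_TWH:.1f}x the baseline)")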

Actionable: AI outputs should be accessible and explainable, delivered through a user-friendly interface with text descriptions that support onward investigation.
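
As a sketch of what such text descriptions could look like in practice – with hypothetical factor names and a made-up scoring breakdown – a simple template can turn a prediction and its main contributing factors into a sentence an investigator can follow up.

    # Illustrative explanation template: turn a prediction and its top
    # contributing factors into a short plain-text description.
    def explain(case_ref: str, prediction: str, contributions: dict) -> str:
        # Rank hypothetical factor contributions by absolute weight, largest first.
        top = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)[:3]
        factors = ", ".join(f"{name} ({weight:+.2f})" for name, weight in top)
        return (f"Case {case_ref}: assessed as {prediction} risk. "
                f"Main contributing factors: {factors}. "
                f"Review the underlying records before taking any action.")

    print(explain("CASE-001", "medium",
                  {"missed appointments": 0.41, "age at first offence": 0.22,
                   "stable accommodation": -0.18}))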

In summary

An integrated, holistic AI platform addresses these considerations and aligns with the MoJ’s goal to improve justice outcomes. AI will always need the oversight of experienced professionals to harness its benefits. If the lessons from wider industries translate to the justice sector, the benefits could be widespread.




Authors

Andy Davies

Client Manager, Public Sector, SAS Software Ltd