25 Jun 2024
by Alison Coote

Policing the future with AI: A guide to leveraging industry support

The current climate is marked by escalating demands on the criminal justice system and intense pressure to cut costs (without sacrificing quality or efficiency). In these circumstances, integrating AI isn't a luxury – it's a necessity. AI has the potential to revolutionise the criminal justice landscape, offering innovative solutions that can enhance accuracy, streamline processes and, ultimately, deliver justice and policing more effectively.    

 

But what does ‘the right’ approach to AI look like? This question requires input from outside the justice system – so you can leverage learnings and experiences from the rest of the public sector (as well as the private sector).  

In this article, I outline how you can meet challenges and exceed expectations by making the most of industry partners' AI expertise and support, with a focus on policing. This includes:  

 

  • Understanding the areas that AI can support – when it comes to addressing operational challenges with AI  

  • Using AI to empower – and make it easier for police staff and officers to focus on the important elements of their job  

  • Educating the workforce about appropriate roles for AI – including what it shouldn’t do.  

  • Building an ethics framework for AI use – so people trust that it’s enhancing quality and efficiency, which is particularly important in the criminal justice space. 

 

Translating operational challenges into AI solutions  

All organisations, regardless of sector, face four major challenges when it comes to adopting AI:   

  • Organisational alignment and business value  

  • Responsible AI use  

  • Properly prepared and curated data  

  • Accessing the right tools and talent 

 

A common initial barrier to AI adoption in policing specifically is a lack of understanding of how AI can alleviate resource pressure and drive quality improvements.   

Industry can add real value here because partners can help frame operational challenges in a way that aligns with appropriate technology solutions.  

 

Where can AI support operational challenges?   

Current approaches to AI adoption within the criminal justice system are focused on identifying ways to increase efficiency and effectiveness by eliminating frustrating tasks and admin. This starts with contextualising the operational challenges it can address within police forces.  

 

Here are four common operational challenges that could benefit from AI, drawn from police forces we have been engaged with:  

 

  • Managing paperwork and reducing the admin burden – A recent innovation workshop we ran for a force uncovered paperwork and growing admin as a common reason officers cite for leaving the police force. An officer could easily spend half a shift just redacting evidence. And that’s before you get to filling in forms or writing up witness statements.  

  • Locating and surfacing evidence quickly – given the vast amount of digital evidence, from text messages to search histories to video footage, locating important elements can take hours.  

  • Improving case quality – the UK has a high proportion of officers with less than three years' experience. How do we ensure a consistently high quality of case files going to the Crown Prosecution Service without putting additional pressure on longer-serving staff?  

  • Handling non-emergency citizen calls – as contact centres deal with increasing volumes, staff need a way to speed up call handling while ensuring requirements are being actioned.  

 

Empowering the workforce by eliminating time-consuming admin  

Once you've articulated the challenges, an industry partner can help prioritise use cases where AI will deliver clear improvements – without replacing essential human elements.  

For instance (aligning with the operational challenges above):  

 

  • Managing paperwork – automating case preparation using techniques such as speech-to-text and identifying words or sentences that may be pertinent to the case 

  • Locating and surfacing evidence quickly – automating evidence searches and highlighting trends and commonalities across different sources   

  • Improving case quality – highlighting potential missing information from cases, summarising key facts  

  • Handling non-emergency citizen calls – digital assistants, automated call triaging, call transcription and translation, accessibility and inclusivity by design, sentiment analysis  

 

Building trust in AI  

Trust is crucial to the success of leveraging AI within policing and the wider criminal justice system. How do you educate the public about how AI can help safely solve operational challenges within the justice system? Industry can support this process in various ways.  

Understanding what AI is (and isn’t)  

A common AI misconception is that it makes decisions for people – that it will present a conclusion, followed by the evidence for that conclusion. This isn’t the case.   

AI doesn’t replace the human experience and intuition essential to quality policing. Rather, it:  

  • Automates repetitive tasks 

  • Surfaces information that’s difficult to access  

  • Enables people to analyse at speed 

 

In fact, I often say that if we called it IA (for intelligent automation), people would be much more receptive to the opportunities it offers!  

Not only can industry partners support digital and data-related upskilling, but they can also educate stakeholders at all levels about AI's role in optimising processes. With a transparent process, people are more confident because they understand how AI will make their lives easier – and what it won't touch. This builds acceptance of new tools and ways of working, increasing uptake (and ROI).  

 

Building an ethics framework for AI use  

Although organisations have governance and data-sharing frameworks, they often lack an ethics framework or codified point of view for deploying AI. And that is essential for ensuring AI use is transparent, unbiased, and responsible.  

You can get started with a framework in weeks. The process starts with an AI ethics maturity assessment, which then provides the basis for a responsible AI framework and implementation roadmap.   

This gives you a robust AI governance approach for managing and monitoring AI risks. It therefore helps you build trust that facilitates engagement, de-risks implementation and supports data protection compliance.  

 

Where can you enhance quality and efficiency?  

The first movers in AI adoption have already made progress. In a proof-of-concept trial we conducted, which looked at one of the key stages of preparing evidence, the force reported a 65% time saving. Now it's time for the criminal justice system to take advantage of the opportunities and learn from the experiences of those who have paved the way.  

And support is ready and waiting. So, what operational inefficiencies can you take to an industry partner? Contact me to discuss how to translate them into AI solutions – in a way that aligns with organisational goals, empowers the workforce, and ensures responsible use.  



 

Authors

Alison Coote


Head of product and innovation, Kainos