24 Mar 2022

Managing Innovation: How trust should be at the heart of modernisation in policing

Annika Ramos, Trish Ani and Lola Evans of PwC discuss how innovation can be blocked unless the public feel they can trust the technologies the police are using, and describe a methodology that has been used in policing to help overcome this challenge. Part of techUK's Emerging Tech in Policing Week. #DigitalPolicing

The duty of UK police is to protect the public and prevent crime. The police keep watch over the public and, in turn, the public keeps a watchful eye over policing activities. The police are publicly funded, public-facing, and publicly accountable. They draw their legitimacy and power from public approval, consent, and cooperation. It stands to reason that trust, inclusion, and ethics are critical factors for policing institutions.

To paint a picture of the current situation regarding trust and confidence in UK policing, a 2021 YouGov survey reported a decline in confidence in the police force. A year later, the 2022 Strategic Review of Policing in England and Wales reported a crisis of confidence in British policing institutions, both from the public and within the force itself. It is no wonder that trust is a major consideration in the use of technology within policing.

Policing is in need of modernisation. Technological advancements have meant that policing now has the opportunity to better predict, anticipate and prevent crime. Embracing emerging technologies will drive efficiencies in existing processes and better protect citizens. Examples include: 

  • Predictive Policing: Triages incidents more effectively and predicts where crime may occur (a minimal illustrative sketch follows this list).

  • Conversational Chatbots: Triage enquiries and support those most in need, particularly vulnerable victims of crime.

  • Evidential Analysis using Machine Learning: Enables investigators to identify victims and offenders, including their accomplices, through analysis of messages, seized images and other relevant information.
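To make the first item more concrete, the sketch below shows how a hotspot-style model might rank areas by expected incident volume from historical counts. It is a minimal illustration only: the synthetic data, grid cells, footfall feature and the choice of a Poisson regression model are all assumptions made for this article, not a description of any force's actual system.

```python
# Minimal, illustrative hotspot sketch: rank grid cells by predicted incident
# counts from synthetic historical data. Every feature and model choice here
# is a hypothetical assumption, not any force's actual system.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)
n_cells, n_weeks = 50, 104

# Hypothetical history: one row per (grid cell, week) with one proxy feature.
cell_footfall = rng.uniform(0.0, 1.0, size=n_cells)      # per-cell footfall proxy
footfall = np.repeat(cell_footfall, n_weeks)
week_of_year = np.tile(np.arange(n_weeks) % 52, n_cells)
incidents = rng.poisson(1.0 + 4.0 * footfall)             # synthetic incident counts

X = np.column_stack([footfall, week_of_year])
model = PoissonRegressor().fit(X, incidents)              # simple count model

# Score each cell for a chosen week and surface the top candidates for review.
X_next = np.column_stack([cell_footfall, np.full(n_cells, 10)])
scores = model.predict(X_next)
top_cells = np.argsort(scores)[::-1][:5]
print("Cells flagged for attention:", top_cells, "scores:", scores[top_cells].round(2))
```

In practice, the output of a model like this would be one input into human decision-making, and would itself need the kind of assurance discussed below.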

Such techniques are notoriously complex, and care must be taken when interpreting their results: the inferences and conclusions they produce need to be accurate. Without appropriate assurance mechanisms, the three examples above could lead to unethical outcomes, discriminate against particular demographics and erode public trust. A model could misidentify someone as an offender because of their ethnicity, or a chatbot could give insensitive advice because of the patterns it has learnt from its training data. The harm can even extend to the data scientists themselves, who may be exposed to graphic images and suffer psychological harm as a result. The unethical consequences affect not only the public, but the whole stakeholder chain.
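By way of illustration, one simple assurance mechanism is to routinely compare outcome rates across demographic groups before a model's outputs are acted on. The sketch below checks whether a hypothetical triage model flags innocent people in one group more often than in another; the group labels, data and tolerance threshold are invented for illustration and are not drawn from any real system.

```python
# Illustrative assurance check: compare false positive rates across groups
# for a hypothetical model's flags. All data and group labels are invented.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

group = rng.choice(["A", "B"], size=n)         # hypothetical demographic label
actual = rng.binomial(1, 0.1, size=n)          # ground truth (1 = genuine match)
# Hypothetical model flags, deliberately skewed against group "B" so the check fires.
flag_prob = np.where(actual == 1, 0.8, np.where(group == "B", 0.15, 0.05))
flagged = rng.binomial(1, flag_prob)

def false_positive_rate(mask):
    negatives = (actual == 0) & mask
    return flagged[negatives].mean()

rates = {g: false_positive_rate(group == g) for g in ("A", "B")}
print("False positive rate by group:", {g: round(float(r), 3) for g, r in rates.items()})

# A simple assurance rule: escalate the model for review if the disparity is large.
if max(rates.values()) > 1.25 * min(rates.values()):
    print("Disparity exceeds tolerance: escalate for ethical and legal review.")
```

A check like this is only one component of assurance; the group definitions, fairness metric and tolerance threshold would need to be agreed through governance, legal and ethics review.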

Organisations are adopting emerging technologies faster than regulators can keep up, yet regulatory enforcement activity is increasing. In the absence of clear guidance on what they “should do”, an organisation’s moral values and code of conduct become ever more important. Regulators and think tanks are increasingly publishing ethical guidelines for the adoption of technologies such as artificial intelligence (AI), and companies such as Microsoft and IBM have taken bold action by restricting or withdrawing their facial recognition offerings because of the discriminatory risks they pose.

Worries about algorithmic black boxes, bias and fairness tend to be at the forefront of conversations about the use of technology and analytics (e.g. AI and machine learning). But they do not have to be blockers. With the right controls and considerations, technology has the power to modernise policing and prevent more crime.

Striking the balance 

Some police forces have developed a methodology that provides a unified way of assuring data capabilities, enabling innovation at pace while maintaining public trust. The methodology considers strategy, feasibility, lawfulness, ethicality and resilience, driving a culture of risk management and accountability within the organisation.

Police forces need to strike a balance between encouraging innovation and keeping trust and ethics at the forefront; otherwise the status quo will prevail, and the use of innovative technologies in policing will be stifled both from within and by the public.

Authors:

Lola Evans

Annika Ramos

Trish Ani (Twitter Handle: @trishia_ani) 

Georgie Morgan

Head of Justice and Emergency Services, techUK

Georgie joined techUK as the Justice and Emergency Services (JES) Programme Manager in March 2020, then becoming Head of Programme in January 2022.

Georgie leads techUK's engagement and activity across our blue light and criminal justice services, engaging with industry and stakeholders to unlock innovation, problem solve, future gaze and highlight the vital role technology plays in the delivery of critical public safety and justice services. The JES programme represents suppliers by creating a voice for those who are selling or looking to break into and navigate the blue light and criminal justice markets.

Prior to joining techUK, Georgie spent 4 and a half years managing a Business Crime Reduction Partnership (BCRP) in Westminster. She worked closely with the Metropolitan Police and London borough councils to prevent and reduce the impact of crime on the business community. Her work ranged from the impact of low-level street crime and anti-social behaviour on the borough, to critical incidents and violent crime.

Email:
[email protected]
LinkedIn:
https://www.linkedin.com/in/georgie-henley/
