What You Should Know About the UK's New Cyber Standard
On 31 January 2025, the UK government introduced its Code of Practice for the Cyber Security of AI, a crucial step towards creating a secure environment for AI innovation while protecting digital infrastructure from emerging threats.
This initiative comes at a critical juncture in AI development. The technology has become increasingly embedded in critical operations, yet half of UK businesses have experienced a cyberattack in the past year. Establishing a robust security framework has therefore become essential for maintaining trust in AI as a transformative technology.
A framework built upon 13 principles
The code is underpinned by 13 software development principles for developers to follow throughout the entire AI system lifecycle. Unlike general software security standards, it specifically addresses vulnerabilities unique to AI, including data poisoning, model obfuscation, and indirect prompt injection.
The code will serve as the foundation for a new global standard through the European Telecommunications Standards Institute (ETSI), potentially extending the UK's influence in international AI governance.
A contrasting approach
The UK’s approach contrasts notably with the European Union’s more prescriptive AI Act. Instead of comprehensive legislation, the UK has opted for a principles-based, cross-sector framework that applies existing technology-neutral regulations to AI. This reflects the government’s assessment that while legislative action will ultimately be necessary, particularly regarding general purpose AI systems, acting now would be premature.
This approach aligns with the UK’s broader AI strategy outlined in the AI Opportunities Action Plan, emphasising a pro-innovation regulatory environment designed to attract technology investment while addressing essential security concerns.
The standard supports the UK’s ambition to become a global AI leader. The sector currently comprises over 3,100 AI companies employing more than 50,000 people and contributing £3.7 billion to the economy. The recently launched AI Opportunities Action Plan aims to boost these figures significantly, potentially adding £47 billion annually by raising productivity by up to 1.5% each year.
Not a moment too soon
The code of practice has arrived not a moment too soon. Recent research highlights that approximately 8.5% of employee prompts to popular AI tools contain sensitive information. Customer data accounts for 45% of that sensitive information, followed by employee data (26%) and legal and financial information (15%). Most concerning for security professionals, nearly 7% of sensitive prompts contain security-related information, including penetration test results, network configurations, and incident reports, effectively handing potential attackers a blueprint for exploitation. This is a significant security, compliance, and legal exposure that organisations urgently need to close.
For businesses looking to implement the standard, the government has published a comprehensive implementation guide to help them determine which requirements apply and to set out practical steps for achieving compliance. The guide emphasises the critical importance of advanced governance tracking and AI data gateways to prevent sensitive information being exposed to public GenAI models, a need underlined by recent data leakage incidents. Such controls should monitor all AI interactions from employees, contractors, and third parties who might inadvertently share proprietary information.
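The implementation guide does not prescribe specific tooling, but the kind of AI data gateway described above can be sketched in outline: a screening step that redacts recognisable sensitive patterns from prompts and records what it found before anything is forwarded to a public GenAI model. The minimal Python sketch below is purely illustrative; the pattern set, the `screen_prompt` function, and the categories are assumptions made for the example, not requirements taken from the code of practice or the guide.

```python
import re

# Illustrative patterns for a few categories of sensitive data mentioned above.
# A real gateway would use far richer detection (classifiers, dictionaries,
# context-aware DLP rules) and organisation-specific policies.
SENSITIVE_PATTERNS = {
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "uk_phone": re.compile(r"(?:\+44\s?|\b0)\d{4}\s?\d{6}\b"),
    "ip_address": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}


def screen_prompt(prompt: str) -> tuple[str, list[str]]:
    """Redact known sensitive patterns from a prompt before it leaves the organisation.

    Returns the redacted prompt and the list of categories detected, so the
    gateway can log the event for governance tracking.
    """
    findings = []
    redacted = prompt
    for category, pattern in SENSITIVE_PATTERNS.items():
        if pattern.search(redacted):
            findings.append(category)
            redacted = pattern.sub(f"[REDACTED:{category}]", redacted)
    return redacted, findings


if __name__ == "__main__":
    prompt = "Summarise the incident report for 10.0.3.17 and email it to alice@example.com"
    safe_prompt, detected = screen_prompt(prompt)
    print(safe_prompt)  # sensitive values replaced with category markers
    print(detected)     # ['email_address', 'ip_address'], suitable for audit logging
```

In practice, such a gateway would sit between users and the model endpoint, apply organisation-specific detection well beyond simple regular expressions, and feed its findings into the governance tracking the guide calls for.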
Fostering innovation
The UK’s new AI standard represents a balanced approach to fostering innovation while addressing security concerns. By providing a framework that is both comprehensive and flexible, it will help build trust in AI systems and unlock their potential economic benefits.
As AI continues its rapid evolution, a strategic approach favouring guidance and principles over rigid legislation offers businesses the adaptability needed to innovate responsibly. The success of these standards will ultimately depend on their adoption across sectors and how effectively they evolve to address emerging challenges in the increasingly complex AI landscape.
Website: https://www.kiteworks.com/
LinkedIn: https://www.linkedin.com/company/kiteworkscgcp/
Cyber Resilience Programme activities
techUK brings together key players across the cyber security sector to promote leading-edge UK capabilities, build networks and grow the sector. techUK members have the opportunity to network, share ideas and collaborate, enabling the industry as a whole to address common challenges and opportunities together.
Meet the team
Jill Broom
Head of Cyber Resilience, techUK
Jill leads the techUK Cyber Security programme, having originally joined techUK in October 2020 as a Programme Manager for the Cyber and Central Government programmes. She is responsible for managing techUK's work across the cyber security ecosystem, bringing industry together with key stakeholders across the public and private sectors. Jill also provides the industry secretariat for the Cyber Growth Partnership, the industry and government conduit for supporting the growth of the sector. A key focus of her work is to strengthen the public–private partnership across cyber to support further development of UK cyber security and resilience policy.
Before joining techUK, Jill worked as a Senior Caseworker for an MP, advocating for local communities, businesses and individuals, so she is particularly committed to techUK’s vision of harnessing the power of technology to improve people’s lives. Jill is also an experienced editorial professional and has delivered copyediting and writing services for public-body and SME clients as well as publishers.
- Email: [email protected]
- Website: www.techuk.org/
- LinkedIn: https://www.linkedin.com/in/jill-broom-19aa824
Annie Collings
Programme Manager, Cyber Resilience, techUK
Annie is the Programme Manager for Cyber Resilience at techUK. She first joined as the Programme Manager for Cyber Security and Central Government in September 2023.
In her role, Annie supports the Cyber Security SME Forum, engaging regularly with key government and industry stakeholders to advance the growth and development of SMEs in the cyber sector. Annie also coordinates events, engages with policy makers and represents techUK at a number of cyber security events.
Before joining techUK, Annie was an Account Manager at a specialist healthcare agency, where she provided public affairs support to a wide range of medical technology clients. She also gained experience as an intern in both an MP’s constituency office and with the Association of Independent Professionals and the Self-Employed. Annie holds a degree in International Relations from Nottingham Trent University.
- Email: [email protected]
- Twitter: anniecollings24
- LinkedIn: https://www.linkedin.com/in/annie-collings-270150158/
Tracy Modha
Team Assistant - Markets, techUK
Tracy supports several areas at techUK, including Cyber Exchange, Cyber Security, Defence, Health and Social Care, Local Public Services, Nations and Regions and National Security.
Authors
John Lynch
Director, Kiteworks