House of Lords inquiry report on LLMs released
Today, the House of Lords Communications and Digital Committee released its report into Large Language Models (LLMs).
The inquiry’s call for evidence ran from July to September 2023, with the aim of addressing calls for safeguards, standards and regulatory approaches that promote innovation whilst managing risks.
The Committee’s detailed report addresses a number of topics, including: possible trends in the development of LLMs over the next three years, issues relating to open and closed models, risks and opportunities related to AI adoption, the Government’s AI White Paper approach, copyright and the international context. The following is a brief summary of the report’s key recommendations.
Key recommendations
- As the UK seeks to take advantage of LLM opportunities, it must prepare for a period of “international competition and technological turbulence”
- Market competition should be an “explicit AI policy objective” of the Government
- A nuanced approach is needed to the issue of open and closed models. The Government must review security implications at pace while ensuring that new rules “support rather than stifle competition”
- The Government’s focus on AI must be “rebalanced” towards the opportunities presented by LLMs; to realise these, measures are needed to boost computing power and infrastructure, skills and support for academic spinouts
- Fairness and responsible innovation should be a priority for the Government and disputes on copyright issues must be resolved “definitively”
- To address immediate risks, faster mitigations are needed in cyber security, counter-terrorism, child sexual abuse material and disinformation
- Catastrophic risks “cannot be ruled out”, but there are no “agreed warning indicators” as yet. Concerns about existential risks “should not distract policymakers from immediate priorities”. Mandatory safety tests for high-risk, high-impact models are needed
- Given the focus on sector regulators to deliver the AI White Paper approach, they must be given the tools they need. There are calls for “speedier resourcing of Government-led central support teams” along with “investigator and sanctioning powers for some regulators, cross-sector guidelines, and a legal review of liability”
- The UK should “regulate proportionately” and chart its own course on AI regulation. An immediate priority is to develop “accredited standards and common auditing methods to ensure responsible innovation, support business adoption, and enable sound regulatory oversight”
You can read the committee’s full report here.
techUK - Putting AI into Action
techUK’s Putting AI into Action campaign serves as a one-stop shop for showcasing the opportunities and benefits of AI adoption across sectors and markets.
During this campaign, techUK will run a regular drumbeat of activity, including events, reports, and insights, to demonstrate some of the most significant opportunities for AI adoption in 2024, as well as working with key stakeholders to identify and address current barriers to adoption.
Visit our AI Adoption Hub to learn more and find our latest activity.