10 Apr 2024
by Santo Orlando

What’s all the fuss about ChatGPT?

OpenAI’s GPT, like Google’s PaLM 2 and Meta’s LLaMA, is an example of a large language model (LLM): an advanced artificial intelligence system designed to understand and generate human-like language. At great expense, and using vast computing resources, these models have been ‘trained’ on enormous amounts of data to perform tasks such as answering questions, generating text, translating text and more.

For example, GPT-4, which was released in March 2023, is widely reported to have been trained on significantly more data than its predecessor, GPT-3, which was trained on 45 terabytes of text data.

This training enables these models to understand context and generate appropriate responses – making them valuable for a variety of applications, such as virtual assistants and content generation.

Among the consumer-facing generative AI applications that have been launched, ChatGPT, developed by OpenAI, is built on the GPT model, whilst Google Bard is built on the PaLM 2 model. Both are particularly suited to language processing and can generate text, images or audio in response to specific prompts.

However, despite their enormous ‘knowledge’, the results that LLMs provide will only ever be based on their pre-trained models. For example, ChatGPT’s training data only includes sources up until September 2021. If you asked it to name the current Prime Minister, it would still believe Boris Johnson is in post – although it will caveat that with a line about verifying this with up-to-date sources.

What’s more, most do not have access to real-time information, so they cannot even tell you the current time.

So, how can you trust what GPT tells you?

There are always challenges when it comes to the adoption of revolutionary new technologies, and one of the most important considerations when deploying generative AI is its ability to tell the truth. To put it simply, LLMs provide information, but they are incapable of deciphering what is true and what is not. Any organisation needs to trust that the responses it gives to end users are accurate. So, when it comes to generative AI applications, there needs to be a process for testing the truthfulness, appropriateness and bias of the information.

For example, if a consumer uses a generative AI chatbot, hosted on a retailer’s website, to ask how to self-harm, the response cannot be a fact-based one that provides the person with advice on how to cause harm to themselves. While such a response might be truthful, it would be wholly inappropriate and would clearly create several issues for the organisation.

So how can GPT be useful to an organisation?

These examples demonstrate the limitations of consumer-facing generative AI applications in providing valuable and accurate responses. While they may be valuable to the public, they can be damaging when it comes to decision making. And these models will only ‘learn’ more once the developer has invested in retraining them, or other data sources have been embedded or indexed.

So, it is important to recognise that when we talk about how an organisation can deploy generative AI, we are not talking about simply deploying ChatGPT or Bard. In almost all circumstances we’re also not talking about creating brand-new models and training them on new data – something that would be cost-prohibitive for most organisations. As the leading Solutions Integrator, we are harnessing the power of existing LLMs, like OpenAI’s GPT-4, and embedding indexed enterprise data to deliver more reliable, trustworthy and accurate responses that meet an organisation’s demands.
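The pattern described here – grounding an existing LLM in indexed enterprise data rather than retraining the model – is commonly known as retrieval-augmented generation. The sketch below is purely illustrative, not a description of any particular product: the keyword index is a stand-in for a real search or vector index, and `call_llm` is a hypothetical placeholder for whichever LLM client an organisation uses.

```python
# Illustrative sketch of retrieval-augmented generation (RAG):
# enterprise documents are indexed once, relevant passages are
# retrieved at query time, and the LLM is asked to answer using
# only that retrieved context. A naive keyword index stands in
# for a production search or vector index.

def build_index(documents):
    """Map each word to the set of document ids that contain it."""
    index = {}
    for doc_id, text in documents.items():
        for word in set(text.lower().split()):
            index.setdefault(word, set()).add(doc_id)
    return index

def retrieve(index, documents, query, top_k=2):
    """Return the documents sharing the most keywords with the query."""
    scores = {}
    for word in query.lower().split():
        for doc_id in index.get(word, ()):
            scores[doc_id] = scores.get(doc_id, 0) + 1
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [documents[doc_id] for doc_id in ranked[:top_k]]

def answer(query, index, documents, call_llm):
    """Ground the LLM's response in the retrieved enterprise data."""
    context = "\n".join(retrieve(index, documents, query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)  # call_llm is a hypothetical LLM client
```

Because the model only sees context drawn from the organisation’s own indexed data, its answers stay current as that data changes – without any retraining.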

In the case of the user who asked about self-harm, indexing the retailer’s own data and embedding it alongside an LLM would lead to a very different outcome. The question could be intercepted immediately and flagged, or a list of helpline phone numbers could be provided, rather than the fact-based answer that the model learnt during its training.

At this point it is prudent to touch upon data security, because if you are embedding your valuable data into an LLM you need to know it is secure. We have already heard stories of employees using ChatGPT to check code and, in doing so, unwittingly giving away corporate IP – which is why using these consumer-facing versions is simply not advisable. Running your data in parallel with the model enables you to keep your IP secure while delivering a much more accurate response from the application.

How Insight can help

As Insight focuses on digital transformation in the public sector, it is natural to consider the role that artificial intelligence – and large language models in particular – will play. Insight can work with organisations to help them understand how to incorporate generative AI safely. We are already delivering customer workshops to discuss many of the issues raised in this article, and we can also work directly with individual existing or potential clients to help them navigate the generative AI maze effectively.

Learn more about how to get your organisation AI ready: https://uk.insight.com/en_GB/what-we-do/campaigns/generative-ai-roadshow.html



Authors

Santo Orlando

EMEA Practice Director for Apps and Data Services