Londonchiropracter.com

This domain is available to be leased

Menu
Menu

LLMs prone to data poisoning and prompt injection risks, UK authority warns

Posted on August 31, 2023 by admin

The UK’s National Cyber Security Centre (NCSC) is warning organisations to be wary of the imminent cyber risks associated with the integration of Large Language Models (LLMs) — such as ChatGPT — into their business, products, or services.

In a set of blog posts, the NCSC emphasised that the global tech community doesn’t yet fully grasp LLMs’ capabilities, weaknesses, and (most importantly) vulnerabilities. “You could say our understanding of LLMs is still ‘in beta’,’’ the authority said.

One of the most extensively reported security weaknesses of existing LLMs is their susceptibility to malicious “prompt injection” attacks. These occur when a user creates an input aimed at causing the AI model to behave in an unintended way — such as generating offensive content or disclosing confidential information.

In addition, the data LLMs are trained on poses a twofold risk. Firstly a vast amount of this data is collected from the open internet, meaning it can include content that’s inaccurate, controversial, or biased.

Catch up on our conference talks

Watch videos of our past talks for free with TNW All Access →

Secondly, cyber criminals can not only distort the data available for malicious practices (also known as “data poisoning”), but also use it to conceal prompt injection attacks. This way, for example, a bank’s AI-assistant for account holders can be tricked into transferring money to the attackers.

“The emergence of LLMs is undoubtedly a very exciting time in technology – and a lot of people and organisations (including the NCSC) want to explore and benefit from it,” said the authority.

“However, organisations building services that use LLMs need to be careful, in the same way they would be if they were using a product or code library that was in beta,” the NCSC added. That is, with caution.

The UK authority is urging organisations to establish cybersecurity principles and ensure that even the “worst case scenario” of whatever their LLM-powered applications are permitted to do is something they can deal with.

Source

Leave a Reply Cancel reply

Your email address will not be published. Required fields are marked *

Recent Posts

  • Ending graciously
  • How robotics could turn e-waste into a tech goldmine
  • Startup wisdom: 5 prompt engineering tips for vibe coding success
  • How European battery startups can thrive alongside Asian giants
  • The EU’s €2T budget overlooks a key tech pillar: Open source

Recent Comments

    Archives

    • September 2025
    • August 2025
    • July 2025
    • June 2025
    • May 2025
    • April 2025
    • March 2025
    • February 2025
    • January 2025
    • December 2024
    • November 2024
    • October 2024
    • September 2024
    • August 2024
    • July 2024
    • June 2024
    • May 2024
    • April 2024
    • March 2024
    • February 2024
    • January 2024
    • December 2023
    • November 2023
    • October 2023
    • September 2023
    • August 2023
    • July 2023
    • June 2023
    • May 2023
    • April 2023
    • March 2023
    • February 2023
    • January 2023
    • December 2022
    • November 2022
    • October 2022
    • September 2022
    • August 2022
    • July 2022
    • June 2022
    • May 2022
    • April 2022
    • March 2022
    • February 2022
    • January 2022
    • December 2021
    • November 2021
    • October 2021
    • September 2021
    • August 2021
    • July 2021
    • June 2021
    • May 2021
    • April 2021
    • March 2021
    • February 2021
    • January 2021
    • December 2020
    • November 2020
    • October 2020

    Categories

    • Uncategorized

    Meta

    • Log in
    • Entries feed
    • Comments feed
    • WordPress.org
    ©2025 Londonchiropracter.com | Design: Newspaperly WordPress Theme