
British security agency warns about possible dangers associated with AI chatbots

@indiablooms | Mar 17, 2023, at 02:02 pm

London: A leading security agency in the UK has warned people about the potential hazards associated with large language models (LLMs) and AI chatbots.

These LLMs have surged in popularity in recent times and captured the attention of the world.

" It's now one of the fastest growing consumer applications ever, and its popularity is leading many competitors to develop their own services and models, or to rapidly deploy those that they’ve been developing internally.As with any emerging technology, there's always concern around what this means for security," National Cyber Security Centre mentioned in a blog post.

The blog cautioned netizens: "LLMs are undoubtedly impressive for their ability to generate a huge range of convincing content in multiple human and computer languages. However, they’re not magic, they’re not artificial general intelligence, and contain some serious flaws."

The UK security body warned people that the tools can get things wrong and ‘hallucinate’ incorrect facts.

"They can be biased, are often gullible (in responding to leading questions, for example)," mentioned another point of caution.

"They require huge compute resources and vast data to train from scratch," read another instruction.

"They can be coaxed into creating toxic content and are prone to ‘injection attacks’," the security body mentioned.

"A question might be sensitive because of data included in the query, or because who is asking the question (and when). Examples of the latter might be if a CEO is discovered to have asked 'how best to lay off an employee?', or somebody asking revealing health or relationship questions. Also bear in mind aggregation of information across multiple queries using the same login," the security body said.

Warning about possible data leak, the body said, "Another risk, which increases as more organisations produce LLMs, is that queries stored online may be hacked, leaked, or more likely accidentally made publicly accessible. This could include potentially user-identifiable information."

"A further risk is that the operator of the LLM is later acquired by an organisation with a different approach to privacy than was true when data was entered by users," read the post.

The NCSC recommended that netizens follow two precautions:

Do not include sensitive information in queries to public LLMs.

Do not submit queries to public LLMs that would create problems if they were made public.
