BIOENGINEER.ORG

Anonymizing personal data ‘not enough to protect privacy,’ shows new study

By Bioengineer | July 23, 2019 | Science

With the first large fines for breaching the EU General Data Protection Regulation (GDPR) upon us, and the UK government about to review GDPR guidelines, researchers have shown how even anonymised datasets can be traced back to individuals using machine learning.

The researchers say their paper, published today in Nature Communications, demonstrates that allowing data to be used – to train AI algorithms, for example – while preserving people’s privacy, requires much more than simply adding noise, sampling datasets, and other de-identification techniques.

They have also published a demonstration tool that lets people see how likely they are to be traced, even if the dataset they appear in is anonymised and only a small fraction of it is shared.

They say their findings should be a wake-up call for policymakers on the need to tighten the rules for what constitutes truly anonymous data.

Companies and governments both routinely collect and use our personal data. Our data, and the way it is used, are protected under laws like the EU’s GDPR or the US’s California Consumer Privacy Act (CCPA).

Data is ‘sampled’ and anonymised, which includes stripping it of identifying characteristics like names and email addresses, so that individuals cannot, in theory, be identified. After this process, the data is no longer subject to data protection regulations and can be freely used and sold to third parties such as advertising companies and data brokers.
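The de-identification process described above — dropping direct identifiers and releasing only a sample — can be sketched in a few lines of Python. The field names and sampling rate here are illustrative, not taken from the study:

```python
import random

# Toy records: direct identifiers plus quasi-identifiers.
records = [
    {"name": "Alice", "email": "a@x.com", "age": 34, "gender": "F", "zip": "SW7"},
    {"name": "Bob",   "email": "b@x.com", "age": 51, "gender": "M", "zip": "E14"},
    {"name": "Carol", "email": "c@x.com", "age": 34, "gender": "F", "zip": "SW7"},
]

# Fields considered direct identifiers (illustrative choice).
DIRECT_IDENTIFIERS = {"name", "email"}

def deidentify(rows, sample_rate=0.5, seed=0):
    """Strip direct identifiers, then release only a random sample of rows."""
    rng = random.Random(seed)
    stripped = [{k: v for k, v in r.items() if k not in DIRECT_IDENTIFIERS}
                for r in rows]
    return [r for r in stripped if rng.random() < sample_rate]

released = deidentify(records)
# No names or emails survive — but age, gender, and ZIP do,
# and it is exactly these remaining quasi-identifiers the attack exploits.
assert all("name" not in r and "email" not in r for r in released)
```

The point of the article is that this kind of release, despite looking safe, leaves enough quasi-identifiers behind to re-identify people.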

The new research shows that once bought, the data can often be reverse engineered using machine learning to re-identify individuals, despite the anonymisation techniques.

This could expose sensitive information about re-identified individuals, and allow buyers to build increasingly comprehensive personal profiles of them.

The research demonstrates for the first time how easily and accurately this can be done – even with incomplete datasets.

In the research, 99.98 per cent of Americans could be correctly re-identified in any available ‘anonymised’ dataset using just 15 characteristics, including age, gender, and marital status.

First author Dr Luc Rocher of UCLouvain said: “While there might be a lot of people who are in their thirties, male, and living in New York City, far fewer of them were also born on 5 January, are driving a red sports car, and live with two kids (both girls) and one dog.”

To demonstrate this, the researchers developed a machine learning model that evaluates the likelihood that an individual’s characteristics are precise enough to describe only one person in a population of billions.
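The core idea — that a handful of attributes often pins down exactly one record — can be illustrated with a simple uniqueness count. This is a toy stand-in, not the authors’ method: their model estimates this likelihood for the whole population statistically, rather than counting within a dataset as below:

```python
from collections import Counter

def uniqueness(rows, attrs):
    """Fraction of records whose combination of `attrs` values appears exactly once."""
    keys = [tuple(r[a] for a in attrs) for r in rows]
    counts = Counter(keys)
    return sum(1 for k in keys if counts[k] == 1) / len(keys)

# Illustrative mini-population (fields chosen for the example).
population = [
    {"age": 34, "gender": "F", "zip": "SW7", "marital": "single"},
    {"age": 34, "gender": "F", "zip": "SW7", "marital": "married"},
    {"age": 51, "gender": "M", "zip": "E14", "marital": "married"},
]

# With three attributes the two SW7 records collide: only 1 of 3 is unique.
print(uniqueness(population, ["age", "gender", "zip"]))             # 0.333...
# Adding marital status makes every record unique.
print(uniqueness(population, ["age", "gender", "zip", "marital"]))  # 1.0
```

The study’s finding is that for real demographic data, 15 such attributes are enough to make nearly every American unique.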

They also developed an online tool, which doesn’t save data and is for demonstration purposes only, to help people see which characteristics make them unique in datasets.

The tool first asks you to enter the first part of your post (UK) or ZIP (US) code, your gender, and your date of birth, before giving you the probability that your profile could be re-identified in any anonymised dataset.

It then asks for your marital status, number of vehicles, home-ownership status, and employment status, before recalculating. Adding more characteristics dramatically increases the likelihood that a match is correct.
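That recalculation step can be mimicked by counting how many people in a population share the attributes entered so far: the chance that a matching record is really you is one over that count. This is a deliberately naive sketch — the paper’s model estimates this probability without access to the full population:

```python
def match_correct_probability(population, target, attrs):
    """P(a record matching `target` on `attrs` is actually the target):
    1 / (number of people sharing those attribute values), assuming the
    target is in the population and one matching record is drawn at random."""
    matches = sum(all(p[a] == target[a] for a in attrs) for p in population)
    return 1.0 / matches if matches else 0.0

# Illustrative mini-population (fields chosen for the example).
population = [
    {"zip": "SW7", "gender": "F", "dob": "1985-01-05", "marital": "single"},
    {"zip": "SW7", "gender": "F", "dob": "1985-01-05", "marital": "married"},
    {"zip": "E14", "gender": "M", "dob": "1972-06-30", "marital": "married"},
]
me = population[0]

# ZIP + gender + date of birth: two candidates, so a 50% chance.
print(match_correct_probability(population, me, ["zip", "gender", "dob"]))            # 0.5
# Adding marital status narrows it to one person: certainty.
print(match_correct_probability(population, me, ["zip", "gender", "dob", "marital"])) # 1.0
```

Each extra attribute can only shrink the set of matching people, which is why the tool’s estimate climbs as you answer more questions.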

Senior author Dr Yves-Alexandre de Montjoye, of Imperial’s Department of Computing and Data Science Institute, said: “This is pretty standard information for companies to ask for. Although they are bound by GDPR guidelines, they’re free to sell the data to anyone once it’s anonymised. Our research shows just how easily – and how accurately – individuals can be traced once this happens.”

He added: “Companies and governments have downplayed the risk of re-identification by arguing that the datasets they sell are always incomplete.

“Our findings contradict this and demonstrate that an attacker could easily and accurately estimate the likelihood that the record they found belongs to the person they are looking for.”

Re-identifying anonymised data is how journalists exposed Donald Trump’s 1985-94 tax returns in May 2019.

Co-author Dr Julien Hendrickx from UCLouvain said: “We’re often assured that anonymisation will keep our personal information safe. Our paper shows that de-identification is nowhere near enough to protect the privacy of people’s data.”

The researchers say policymakers must do more to protect individuals from such attacks, which could have serious ramifications for careers as well as personal and financial lives.

Dr Hendrickx added: “It is essential for anonymisation standards to be robust and account for new threats like the one demonstrated in this paper.”

Dr de Montjoye said: “The goal of anonymisation is so we can use data to benefit society. This is extremely important but should not and does not have to happen at the expense of people’s privacy.”


Media Contact
Caroline Brogan
[email protected]

Tags: Computer Science, Guidelines/Treaties/Agreements, System Security/Hackers, Technology/Engineering/Computer Science, Theory/Design
Bioengineer.org © Copyright 2023 All Rights Reserved.
