The risk of being discriminated against by the algorithm

By Bioengineer
November 13, 2019
in Science News

KIT study identifies many possibilities of unequal treatment and recommends preventive measures

Image credit: Patrick Langer, KIT


Not only companies but also state institutions increasingly rely on automated decisions made by algorithm-based systems. Their efficiency saves time and money, but it also entails many risks of individuals or population groups being discriminated against. This is the finding of a study conducted by the Institute for Technology Assessment and Systems Analysis (ITAS) of Karlsruhe Institute of Technology (KIT) on behalf of the Federal Anti-Discrimination Agency.

Whether granting a loan, selecting new staff members, or making legal decisions: in an increasing number of sectors, algorithms are applied to prepare human decisions or to make these decisions in place of humans. “Unfortunately, it often is a mistake to think that this inevitably leads to more objective and fairer decisions,” says Carsten Orwat of the Institute for Technology Assessment and Systems Analysis (ITAS) of KIT. “Situations become particularly critical when algorithms work with biased data and rely on criteria that are meant to be protected,” the author says. These criteria include, in particular, age, gender, ethnic origin, religion, sexual orientation, and disability.

On behalf of the Federal Anti-Discrimination Agency, Carsten Orwat studied in detail the causes of discrimination, its impact on society, and options for reducing discrimination risks in the future. His study, entitled “Diskriminierungsrisiken durch Verwendung von Algorithmen” (risks of discrimination through the use of algorithms), lists 47 examples to illustrate how algorithms can discriminate against people in various ways and how this can be detected and proved.

Real Estate, Loans, Judicial Matters, and More: Various Examples of Discrimination Risks

As examples, Orwat describes situations on the real estate and loan markets and in the court system. In the USA, for instance, several cases have been documented in which social media algorithms allowed targeted advertisements to be made invisible to persons protected by the “Fair Housing Act,” such as migrants, people with disabilities, or non-white people, the author says. In Finland, a bank was sentenced to pay a fine because its algorithm for the automatic granting of online loans disadvantaged men compared with women and Finnish native speakers compared with Swedish native speakers. This unequal treatment is forbidden by Finnish anti-discrimination law. When deciding on early releases from prison, US judges use a much-disputed system that calculates risk scores. Journalists and human rights associations criticize the fact that this system systematically overestimates black people’s risk of re-offending.

“Machine learning systems often have problems when they are trained with data reflecting unequal treatment or stereotypes,” Carsten Orwat explains. “In this case, the algorithms generated will reflect them, too. When processing data containing evaluations of people by other people, unequal treatment and discrimination may even spread or be amplified.” This happened in the USA in a system for food-safety and health inspections that was based on discriminatory ratings of restaurants.
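
The feedback mechanism Orwat describes can be illustrated in a few lines of code. The following sketch uses entirely hypothetical data and feature names, not material from the study: a standard classifier is trained on historical decisions that encode unequal treatment, and although the protected group attribute is never used as a direct input, a correlated proxy feature is enough for the model to reproduce the disparity.

```python
# Minimal sketch (hypothetical data): a model trained on biased
# historical decisions reproduces the bias, even without seeing the
# protected attribute directly.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups (0 and 1) with identical true skill distributions.
group = rng.integers(0, 2, n)
skill = rng.normal(0, 1, n)

# Biased historical labels: group 1 needed a higher skill level to be
# hired, so the "ground truth" itself encodes unequal treatment.
hired = (skill > np.where(group == 1, 0.5, -0.5)).astype(int)

# The model never receives the group attribute, but a correlated proxy
# feature (think: postal code) leaks it into the training data.
proxy = group + rng.normal(0, 0.3, n)
X = np.column_stack([skill, proxy])

model = LogisticRegression().fit(X, hired)
pred = model.predict(X)

# Despite identical skill distributions, predicted hiring rates differ:
for g in (0, 1):
    print(f"group {g}: predicted hiring rate = {pred[group == g].mean():.2f}")
```

In this toy setup both groups are equally qualified by construction, yet the trained model's predicted hiring rates differ markedly, purely because the historical labels it learned from were biased.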

Recommendations of Countermeasures

However, society does not have to accept these unequal treatments. The study lists several options for counteracting discrimination by algorithms. “Preventive measures appear to be most reasonable,” Carsten Orwat says. Companies may ask anti-discrimination agencies to train their staff and IT experts and to raise their awareness. These people will then use datasets that do not reflect any discriminatory practices or unequal treatment.

According to Orwat, the goal is to make future algorithms “discrimination-free by design.” This means that programs have to be checked for discrimination already during their development. Ultimately, this is about protecting society’s values, such as equality or the free development of the personality. To guarantee this despite the very rapid development of “big data” and AI, anti-discrimination and data protection legislation needs to be improved in certain respects, Orwat points out.
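
One concrete form such a development-time check could take is a simple disparate-impact screen run over a model's decisions before deployment. The sketch below is an illustrative assumption, not the study's own procedure: it applies the four-fifths rule known from US employment-discrimination practice, flagging a model whenever any group's selection rate falls below 80 percent of the highest group's rate.

```python
# Minimal sketch of a pre-deployment check in the spirit of
# "discrimination-free by design". The 0.8 threshold (four-fifths rule)
# and the audit data are illustrative assumptions.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group_label, was_selected) pairs."""
    selected, total = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        selected[group] += int(ok)
    return {g: selected[g] / total[g] for g in total}

def disparate_impact_ok(decisions, threshold=0.8):
    """Pass only if every group's selection rate is at least
    `threshold` times the highest group's rate."""
    rates = selection_rates(decisions)
    return min(rates.values()) >= threshold * max(rates.values())

# Hypothetical audit data: (group, model said "grant loan").
audit = (
    [("A", True)] * 70 + [("A", False)] * 30
    + [("B", True)] * 45 + [("B", False)] * 55
)

print(selection_rates(audit))      # {'A': 0.7, 'B': 0.45}
print(disparate_impact_ok(audit))  # False -> revisit data and model
```

A failed screen like this does not prove discrimination in a legal sense, but it is the kind of early development-time signal that the preventive approach recommended by the study is meant to produce.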

###

The complete study in PDF format for download (in German):

https://www.antidiskriminierungsstelle.de/SharedDocs/Downloads/DE/publikationen/Expertisen/Studie_Diskriminierungsrisiken_durch_Verwendung_von_Algorithmen.html

Press contact:

Jonas Moosmüller

Public Relations ITAS, KIT

Phone: +49 721 608-26796

[email protected]

Being “The Research University in the Helmholtz Association,” KIT creates and imparts knowledge for society and the environment. Its objective is to make significant contributions to the global challenges in the fields of energy, mobility, and information. For this, about 9,300 employees cooperate in a broad range of disciplines in the natural sciences, engineering sciences, economics, and the humanities and social sciences. KIT prepares its 25,100 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life.

This press release is available on the internet at http://www.sek.kit.edu/english/press_office.php.

Media Contact
Monika Landgraf
[email protected]
+49 721 608-21105

Original Source

https://www.kit.edu/kit/english/pi_2019_135_the-risk-of-being-discriminated-by-the-algorithm.php

Tags: Algorithms/Models, Technology/Engineering/Computer Science