Teaching AI what’s fair

By Bioengineer
March 22, 2021
in Science News

With support from Amazon and the National Science Foundation, Michigan State researchers are helping artificial intelligence understand fairness

Image credit: Creative Commons via Pexels

“What is fair?” feels like a rhetorical question. But for Michigan State University’s Pang-Ning Tan, it’s a question that demands an answer as artificial intelligence systems play a growing role in deciding who gets proper health care, a bank loan or a job.

With funding from Amazon and the National Science Foundation, Tan has been working for the last year to teach artificial intelligence algorithms how to be more fair and recognize when they’re being unfair.

“We’re trying to design AI systems that aren’t just for computer science, but also bring value and benefits to society. So I started thinking about what are the areas that are really challenging to society right now,” said Tan, a professor in MSU’s Department of Computer Science and Engineering.

“Fairness is a very big issue, especially as we become more reliant on AI for everyday needs, like health care, but also things that seem mundane, like spam filtering or putting stories in your news feed.”

As Tan mentioned, people already trust AI in a variety of applications and the consequences of unfair algorithms can be profound.

For example, investigations have revealed that AI systems have made it harder for Black patients to access health care resources. And Amazon scrapped an AI recruiting tool after it was found to penalize female applicants in favor of men.

Tan’s research team is contending with such problems on multiple fronts. The Spartans are looking at how people use data to teach their algorithms. They’re also investigating ways to give algorithms access to more diverse information when making decisions and recommendations. And their work with the NSF and Amazon is attempting to broaden the way fairness has usually been defined for AI systems.

A conventional definition would look at fairness from the perspective of an individual; that is, whether one person would see a particular outcome as fair or unfair. It’s a sensible start, but it also opens the door for conflicting or even contradictory definitions, Tan said. What’s fair to one person can be unfair to another.

So Tan and his research team are borrowing ideas from social science to build a definition that includes perspectives from groups of people.

“We’re trying to make AI aware of fairness and to do that, you need to tell it what is fair. But how do you design a measure of fairness that is acceptable to all?” Tan said. “We’re looking at how a decision affects not only individuals, but their communities and social circles as well.”

Consider this simple example: Three friends with identical credit scores apply for loans worth the same amount of money from the same bank. If the bank approves or denies everyone, the friends would perceive that as more fair than a case where only one person is approved or denied. That could indicate that the bank used extraneous factors that the friends might deem unjust.

Tan’s team is building a way to essentially score or quantify the fairness of different outcomes so AI algorithms can identify the most fair options.
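The article does not describe the team's actual metric, but the loan example suggests what "scoring the fairness of an outcome" can mean in practice. The following is a minimal Python sketch of that idea only: the function name, the consistency rule, and the tolerance parameter are illustrative assumptions, not the MSU researchers' method.

```python
# Toy illustration of scoring an outcome by within-group consistency.
# NOT the MSU team's metric; a minimal sketch of the idea in the article:
# outcomes that treat similarly qualified members of the same social circle
# differently score as less fair.

from itertools import combinations
from typing import Sequence


def group_consistency_score(scores: Sequence[float],
                            decisions: Sequence[int],
                            tolerance: float = 0.0) -> float:
    """Return a fairness score in [0, 1] for one group of applicants.

    scores    : qualification scores (e.g., credit scores), one per applicant
    decisions : 1 = approved, 0 = denied, aligned with `scores`
    tolerance : how close two scores must be to count as "similar"

    Pairs of similarly scored applicants who receive different decisions
    lower the score; identical treatment of similar applicants yields 1.0.
    """
    pairs = combinations(range(len(scores)), 2)
    similar = [(i, j) for i, j in pairs if abs(scores[i] - scores[j]) <= tolerance]
    if not similar:
        return 1.0  # no comparable pairs, nothing to penalize
    inconsistent = sum(1 for i, j in similar if decisions[i] != decisions[j])
    return 1.0 - inconsistent / len(similar)


if __name__ == "__main__":
    credit = [700, 700, 700]                            # three friends, identical credit scores
    print(group_consistency_score(credit, [1, 1, 1]))   # all approved -> 1.0
    print(group_consistency_score(credit, [0, 0, 0]))   # all denied   -> 1.0
    print(group_consistency_score(credit, [1, 0, 0]))   # one approved -> ~0.33
```

In this toy scoring, approving or denying all three friends scores 1.0, while approving only one scores about 0.33, matching the intuition in the example that inconsistent treatment of equally qualified peers reads as less fair.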

Of course, the real world is much more complex than this example, and Tan is the first to admit that defining fairness for AI is easier said than done. But he has help — including from the chair of his department at MSU, Abdol-Hossein Esfahanian.

Esfahanian is an expert in a field known as applied graph theory that helps model connections and relationships. He also loves learning about related fields in computer science and has been known to sit in on classes taught by his colleagues, as long as they’re comfortable having him there.

“Our faculty are fantastic in imparting knowledge,” Esfahanian said. “I needed to learn more about data mining, and so I sat in on one of Dr. Tan’s courses for a semester. From that point on, we started communicating about research problems.”

Now, Esfahanian is a co-investigator on the NSF and Amazon grant.

“Algorithms are created by people and people typically have biases, so those biases seep in,” he said. “We want to have fairness everywhere, and we want to have a better understanding of how to evaluate it.”

The team is making progress on that front. This past November, they presented their work at an online meeting organized by NSF and Amazon as well as at a virtual international conference hosted by the Institute of Electrical and Electronics Engineers.

Both Tan and Esfahanian said the community — and the funders — are excited by the Spartans’ progress. But both researchers also acknowledged that they’re just getting started.

“This is very much ongoing research. There are a lot of issues and challenges. How do you define fairness? How can you help people trust these systems that we use every day?” Tan said. “Our job as researchers is to come up with solutions to these problems.”

###

Media Contact
Caroline Brooks
[email protected]

Original Source

https://msutoday.msu.edu/news/2021/teaching-ai-fairness

Tags: Algorithms/Models, Computer Science, Internet, Research/Development, Robotry/Artificial Intelligence, Software Engineering, Technology/Engineering/Computer Science