
Humans explaining self-explaining machines

Bioengineer by Bioengineer
June 23, 2022
in Science News

The processes behind machine learning are incomprehensible for many. At the Collaborative Research Centre/Transregio (CRC/TRR) 318 “Constructing Explainability” at Bielefeld University and Paderborn University, researchers are working to develop ways to design explanatory processes and to enable users to take control of explanations from artificial intelligence (AI). At the first international conference organised by TRR 318, the focus will be on social-science research on explanatory AI. The conference is called ‘Explaining Machines’ and will take place from June 23–24 in the CITEC Building of Bielefeld University. The conference will be held in English.

Dr. Elena Esposito (left) and Dr. Tobias Matzner (right)

Credit: Photo left: Bielefeld University/M. Adamski; Photo right: Paderborn University

Currently, a key question in AI research is how to arrive at comprehensible explanations of underlying machine processes: Should humans be able to explain how machines work, or should machines learn to explain themselves?

‘The double meaning of the name of our conference, “Explaining Machines,” expresses these various possibilities: machines explaining themselves, humans explaining machines – or maybe both at the same time,’ says Professor Dr. Elena Esposito. The Bielefeld sociologist is heading a subproject at TRR 318 and is organising the conference together with her colleague Professor Dr. Tobias Matzner from Paderborn University. Dr. Matzner is a media studies researcher who is also heading a Transregio subproject. ‘If explanations from machines are to have an impact socially and politically, it’s not enough that explanations are comprehensible to computer scientists,’ says Matzner. ‘Different socially situated people must be included in explanatory processes – from doctors to retirees and schoolchildren.’

The technical and social challenges of AI projects
The organising team emphasises the interdisciplinary focus of the conference. ‘It’s not enough to develop AI systems solely with the expertise of a single discipline. The computer scientists who design the machines must work together with social scientists who study how humans and AI interact and under what conditions this interaction takes place,’ explains Esposito. ‘Today more than ever, the challenge of AI projects is both technical and social. With this conference, we hope to encourage the inclusion of perspectives and insights from the social sciences in the debates surrounding explanatory machines.’

Previously, the ‘explainability’ of artificial intelligence was largely the domain of computer scientists. ‘In this research approach, the main view is that explainability and comprehension arise from transparency – that is, having as much information available as possible. An alternative view to this is that of co-construction,’ says Professor Dr. Katharina Rohlfing, a linguist at Paderborn University and the spokesperson of the Transregio. ‘In our research, we do not consider humans to be passive partners who simply receive explanations. Instead, explanations emerge at the interface between the explainer and the explainee. Both actively shape the process of explanation and work towards achieving agreement on common ideas and conceptions. Cross-disciplinary collaboration is therefore essential to the study of explainability.’

Conference talks from media, philosophy, law, and sociology
The conference includes 10 short talks. Among the renowned international guests in attendance will be Professor Dr. Frank Pasquale, an expert on legal aspects of artificial intelligence, Professor Dr. Mireille Hildebrandt, a jurist and philosopher, and Dr. David Weinberger, a philosopher of the Internet, along with sociologist Dr. Dominique Cardon, legal scholar Dr. Antoinette Rouvroy, and media researcher Dr. Bernhard Rieder. Following the keynote talks, participants will have the opportunity to discuss the conference research articles directly with the presenters. These articles will be emailed to conference attendees after they register for the event.

The conference ‘Explaining Machines’ is Transregio 318’s first major scientific event. Information on the program can be found here. In the next three funding years, further conferences are planned that will focus on the concept of explainability from the perspectives of different scientific disciplines.

Members of the press are welcome to report on the conference: registration is required in advance by sending an email to [email protected] The conference organizers and the Transregio spokesperson will be available during the conference to answer any questions from the press.

Collaborative Research Centre/Transregio (TRR) 318
The strongly interdisciplinary research program entitled ‘Constructing Explainability’ goes beyond the question of algorithmic decision-making as the basis for the explainability of AI, taking an approach that requires the active participation of humans in socio-technical systems. The goal is to enhance human-machine interaction by focusing on the comprehension of algorithms and examining this as the product of a multimodal explanatory process. The German Research Foundation (DFG) is providing approximately 14 million euros in funding for this project through July 2025.


