Robot therapists need rules

By Bioengineer
May 15, 2019
in Health

Use of embodied AI in psychiatry poses ethical questions

Interactions with artificial intelligence (AI) will become an increasingly common aspect of our lives. A team at the Technical University of Munich (TUM) has now completed the first study of how “embodied AI” can help treat mental illness. Their conclusion: important ethical questions surrounding this technology remain unanswered, and there is an urgent need for action on the part of governments, professional associations and researchers.

Robot dolls that teach autistic children to communicate better, computer-generated avatars that help patients cope with hallucinations, and virtual chats offering support with depression: Numerous initiatives using embodied AI for improving mental health already exist. These applications are referred to as embodied because they involve interactions between individuals and an artificial agent, resulting in entirely new dynamics.

The use of AI in psychotherapy is not new in itself. Back in the 1960s, the first chatbots created the illusion of a psychotherapy session. In reality, however, this was little more than a gimmick. With today’s advanced algorithms and higher computing power, much more is possible. “The algorithms behind these new applications have been trained with enormous data sets and can produce genuine therapeutic statements,” explains Alena Buyx, Professor of Ethics in Medicine and Health Technologies at TUM. Together with Dr. Amelia Fiske and Peter Henningsen, Professor of Psychosomatic Medicine and Psychotherapy, she has conducted the first systematic survey of embodied AI applications for mental health and drawn conclusions on the related opportunities and challenges.

Access to treatment for more people

The new applications have enormous potential. They can make treatment accessible to more people because they are not limited to specific times or locations. In addition, some patients find it easier to interact with AI than with a human being. But there are risks, too. “AI methods cannot and must not be used as a cheaper substitute for treatment by human doctors,” says Amelia Fiske.

“Although embodied AI has arrived in the clinical world, there are still very few recommendations from medical associations on how to deal with this issue. Urgent action is needed, however, if the benefits of these technologies are to be exploited while avoiding disadvantages and ensuring that reasonable checks are in place. Young doctors should also be exposed to this topic while still at medical school,” says Peter Henningsen, who is the dean of the TUM School of Medicine.

Ethical rules for artificial intelligence still lacking

At present, there are increasing efforts to draw up guidelines for AI, including the Ethics Guidelines for Trustworthy AI just issued by the EU. However, Buyx, Fiske and Henningsen also see an urgent need to regulate the use of AI in specialized fields. “Therapeutic AI applications are medical products for which we need appropriate approval processes and ethical guidelines,” says Alena Buyx. “For example, if the programs can recognize whether patients are having suicidal thoughts, then they must follow clear warning protocols, just like therapists do, in case of serious concerns.”
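
To make the idea of a warning protocol concrete, here is a minimal, purely hypothetical sketch in Python of how a therapeutic chatbot might screen incoming messages and hand off to a human clinician. The phrase list, the generate_reply model hook and the escalate_to_clinician workflow are all assumptions introduced here for illustration; they are not part of the applications or recommendations discussed in the study.

# Hypothetical sketch of a "clear warning protocol" for a therapeutic chatbot.
# The risk-phrase screen and the escalation hook are invented for illustration;
# they are not taken from the study discussed above.

RISK_PHRASES = {"suicide", "kill myself", "end my life", "self-harm"}  # assumed, far from exhaustive


def message_is_high_risk(text: str) -> bool:
    """Return True if the message contains any of the assumed high-risk phrases."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in RISK_PHRASES)


def respond(text: str, generate_reply, escalate_to_clinician) -> str:
    """Route high-risk messages to a human before the AI replies on its own.

    `generate_reply` and `escalate_to_clinician` stand in for the application's
    language model and its duty-of-care workflow, respectively.
    """
    if message_is_high_risk(text):
        escalate_to_clinician(text)  # e.g. alert an on-call therapist
        return ("I'm concerned about what you've written. A clinician has been "
                "notified and will contact you. If you are in immediate danger, "
                "please contact your local emergency services.")
    return generate_reply(text)

A real system would of course need clinically validated risk-assessment instruments, documented escalation pathways and regulatory approval rather than a simple keyword list, which is precisely the kind of oversight the authors are calling for.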

In addition, intensive study is needed into the social effects of embodied AI. “We have very little information on how we as human beings are affected by contact with therapeutic AI,” says Alena Buyx. “For example, through contact with a robot, a child with a disorder on the autism spectrum might only learn how to interact better with robots – but not with people.”

###

Publication:

Fiske A, Henningsen P, Buyx A. Your robot therapist will see you now: Ethical implications of embodied artificial intelligence in psychiatry, psychology, and psychotherapy. Journal of Medical Internet Research (2019). DOI: 10.2196/13216.

More information:

Embodied artificial intelligence is also an important research area at the Munich School of Robotics and Machine Intelligence (MSRM). At this interdisciplinary center, Prof. Alena Buyx works with scientists from fields such as informatics, electrical and computer engineering, and mechanical engineering, as well as the social sciences and humanities, on topics such as the future of health, work and mobility.

Chair of Ethics in Medicine and Health Technologies:
http://www.get.med.tum.de

Chair of Psychosomatic Medicine and Psychotherapy:
http://www.psychosomatik.mri.tum.de

Munich School of Robotics and Machine Intelligence:
http://www.msrm.tum.de

High-resolution image:

https://mediatum.ub.tum.de/1486120

Contact:

Prof. Dr. med. Alena M. Buyx, M. A. phil., FRSA

Technical University of Munich

Chair of Ethics in Medicine and Health Technologies

Tel: +49 89 4140 4041

[email protected]

Media Contact
Paul Hellmich
[email protected]

Related Journal Article

https://www.tum.de/nc/en/about-tum/news/press-releases/details/article/35442/
http://dx.doi.org/10.2196/13216

Tags: Computer Science, Health Care, Medical/Scientific Ethics, Medicine/Health, Mental Health, Policy/Ethics, Robotry/Artificial Intelligence, Science/Health and the Law