BIOENGINEER.ORG

Empathetic machines favored by skeptics but might creep out believers

Bioengineer by Bioengineer
October 31, 2018
in Science News

UNIVERSITY PARK, Pa. — Most people would appreciate a chatbot that offers sympathetic or empathetic responses, according to a team of researchers, but they added that this reaction may depend on how comfortable the person is with the idea of a feeling machine.

In a study, the researchers reported that people preferred receiving sympathetic and empathetic responses from a chatbot — a machine programmed to simulate a conversation — to receiving a response from a machine without emotions, said S. Shyam Sundar, James P. Jimirro Professor of Media Effects and co-director of the Media Effects Research Laboratory. People express sympathy when they feel compassion for a person, whereas they express empathy when they are actually feeling the same emotions as the other person, said Sundar.

As healthcare providers look for ways to cut costs and improve service, he added that these findings could help developers create conversational technologies that encourage people to share information about their physical and mental health states, for example.

"Increasingly, we have more and more chatbots and AI-driven conversational agents in our midst," said Sundar. "As more people begin to turn to their smart speaker or chatbot on a health forum for advice, or for social and emotional support, the question becomes: To what extent should these chatbots and smart speakers express human-like emotions?"

While machines today cannot truly feel either sympathy or empathy, developers could program these cues into current chatbot and voice assistant technology, according to the researchers, who report their findings in the current issue of Cyberpsychology, Behavior, and Social Networking.

However, chatbots may become too personal for some people, said Bingjie Liu, a doctoral candidate in mass communications, who worked with Sundar on the study. She said that study participants who were leery of conscious machines indicated they were impressed by the chatbots that were programmed to deliver statements of sympathy and empathy.

"The majority of people in our sample did not really believe in machine emotion, so, in our interpretation, they took those expressions of empathy and sympathy as courtesies," said Liu. "When we looked at people who have different beliefs, however, we found that people who think it's possible that machines could have emotions had negative reactions to these expressions of sympathy and empathy from the chatbots."

The researchers recruited 88 volunteers from a university and Amazon Mechanical Turk, an online task platform. The volunteers were asked to interact with one of four online health service chatbots, each programmed to deliver responses specific to one of four conditions set up by the researchers: sympathy, one of two types of empathy — cognitive empathy or affective empathy — or an advice-only control condition.

In the sympathetic version, the chatbot responded with a statement, such as, "I am sorry to hear that." The chatbot programmed for cognitive empathy, which acknowledged the user's feelings, might say, "That issue can be quite disturbing." A chatbot that expressed affective empathy might respond with a sentence that showed the machine understood how and why a user felt the way they did, such as, "I understand your anxiety about the situation."
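The condition-specific responses described above can be sketched as a minimal lookup table. This is an illustrative assumption, not the study's actual implementation: the condition names and the advice-only reply are hypothetical, while the three emotional replies are the examples quoted in the article.

```python
# Illustrative sketch of a condition-keyed chatbot reply table.
# The three emotional replies are quoted from the article; the
# condition keys and the advice-only reply are hypothetical.
RESPONSES = {
    "sympathy": "I am sorry to hear that.",
    "cognitive_empathy": "That issue can be quite disturbing.",
    "affective_empathy": "I understand your anxiety about the situation.",
    "advice_only": "You may want to consult a specialist.",  # hypothetical control reply
}

def respond(condition: str) -> str:
    """Return the reply for the given experimental condition."""
    try:
        return RESPONSES[condition]
    except KeyError:
        raise ValueError(f"unknown condition: {condition}")

print(respond("sympathy"))  # -> I am sorry to hear that.
```

In a design like this, each participant would be assigned one condition for the whole session, so the chatbot's emotional register stays constant across the conversation.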

The researchers said that affective empathy and sympathy worked the best.

"We found that the cognitive empathy — where the response is somewhat detached and it's approaching the problem from a thoughtful, but almost antiseptic way — did not quite work," said Sundar. "Of course, chatbots and robots do that quite well, but that is also the stereotype of machines. And it doesn't seem to be as effective. What seems to work best is affective empathy, or an expression of sympathy."

In a previous study, the researchers asked participants to simply read the script of a conversation between a human subject and a machine. They found similar effects for the use of sympathy and empathy in messages.

The researchers said that future research could examine how the sympathetic and empathetic interactions work for different issues beyond health and sexuality, as well as investigate how people feel if humanlike machines and robots deliver those types of responses.

"We want to see if this is a consistent pattern in how humans react to machine emotions," said Liu.

###

Media Contact

Matt Swayne
[email protected]
@penn_state

http://live.psu.edu

https://news.psu.edu/story/544732/2018/10/31/research/empathetic-machines-favored-skeptics-might-creep-out-believers

Related Journal Article

http://dx.doi.org/10.1089/cyber.2018.0110



Bioengineer.org © Copyright 2023 All Rights Reserved.
