
The relationship between looking/listening and human emotions

By Bioengineer | Bioengineer.org | June 19, 2020 | Health

Revealing the relationship between attentional states and emotions from human pupillary reactions

[Image credit: Toyohashi University of Technology]

Overview

A research team from the Department of Computer Science and Engineering and the Electronics-Inspired Interdisciplinary Research Institute at Toyohashi University of Technology has shown that the relationship between attentional state and elicited emotion may differ between visual and auditory perception. By measuring pupillary reactions, which reflect human emotional responses, the team found that pictures elicited emotional responses in every attentional state, whereas sounds elicited them only when attention was directed to the sounds. This reveals a difference in how attentional state relates to emotion for visual versus auditory stimuli.

Details

In our daily lives, our emotions are often elicited by the information we receive through visual and auditory perception. Accordingly, many studies have investigated human emotional processing using emotional stimuli such as pictures and sounds. However, it was not clear whether this emotional processing differs between visual and auditory perception.

Our research team asked participants to perform four tasks designed to induce different attentional states while emotionally arousing pictures and sounds were presented, in order to investigate how emotional responses differ between visual and auditory perception. Pupillary responses, obtained through eye-tracking measurements, served as a physiological indicator of emotional response. Pictures elicited emotional responses during all tasks, whereas sounds did so only during tasks in which attention was directed to the sounds. These results suggest that the relationship between attentional state and emotional response differs between visual and auditory stimuli.
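A common way to quantify the kind of pupillary response described above is to subtract a pre-stimulus baseline from the pupil trace and summarize the post-stimulus dilation. The sketch below is purely illustrative and is not the authors' analysis pipeline; the function name, sampling rate, window lengths, and toy traces are all invented for demonstration.

```python
import numpy as np

def baseline_corrected_dilation(trace, fs=60, baseline_s=0.5):
    """Subtract the mean pupil diameter in the pre-stimulus baseline
    window from the rest of the trace, then return the mean
    post-stimulus dilation (a common pupillometry summary statistic)."""
    n_base = int(fs * baseline_s)
    baseline = trace[:n_base].mean()
    return (trace[n_base:] - baseline).mean()

# Toy pupil-diameter traces (mm) at 60 Hz over 3 s: one stimulus that
# dilates the pupil after onset, and one that evokes no response.
fs = 60
t = np.arange(0, 3, 1 / fs)
attended = 3.0 + 0.4 * np.clip(t - 0.5, 0, None)   # ramps up after 0.5 s
unattended = np.full_like(t, 3.0)                  # flat, no response

print(baseline_corrected_dilation(attended, fs))   # positive: arousal-like dilation
print(baseline_corrected_dilation(unattended, fs)) # near zero: no measurable response
```

In a real experiment such per-trial dilation values would be averaged within each task condition and compared statistically across the attentional states.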

“Traditionally, subjective questionnaires have been the most common method for assessing emotional states. In this study, however, we wanted to extract emotional states while a task was being performed. We therefore focused on pupillary response, which is attracting attention as a biological signal that reflects cognitive states. Although many studies have examined attentional states during emotional arousal elicited by visual and auditory perception, none had compared these states across the senses, and this is the first such attempt,” explains the lead author, Satoshi Nakakoga, a Ph.D. student.

In addition, Professor Tetsuto Minami, the leader of the research team, said, “There are ever more opportunities to encounter visual media via smartphones and other devices, and to have emotions evoked by that visual and auditory information. We will continue investigating the sensory perception that elicits emotions, including the effects of elicited emotions on human behavior.”

Future Outlook

Based on these results, our research team points to the possibility of a new method of emotion regulation in which the emotional responses elicited through one sense are promoted or suppressed by stimuli from another sense. Ultimately, we hope to establish this method of emotion regulation to help treat psychiatric conditions such as panic and mood disorders.


Reference

Nakakoga, S., Higashi, H., Muramatsu, J., Nakauchi, S., & Minami, T. (2020). Asymmetrical characteristics of emotional responses to pictures and sounds: Evidence from pupillometry. PLoS ONE, 15(4), e0230775. doi: 10.1371/journal.pone.0230775

Media Contact
Yuko Ito
[email protected]


Tags: Behavior, Mental Health, Physiology, Technology/Engineering/Computer Science

Bioengineer.org © Copyright 2023 All Rights Reserved.
