Monday, December 11, 2023
BIOENGINEER.ORG

Do I look mad? Reading facial cues with the touch-screen generation

Bioengineer by Bioengineer
May 7, 2020
in Science News

New UCLA study may give parents some peace of mind about their kids’ screen time

Image credit: Stephen Nowicki

Are today’s children, who grew up with mobile technology from birth, worse at reading emotions and picking up cues from people’s faces than children who didn’t grow up with tablets and smartphones? A new UCLA psychology study suggests today’s kids are all right.

Infancy and early childhood are critical developmental phases during which children learn to interpret important non-verbal cues such as facial expressions, tone of voice and gestures. Traditionally, this happens through direct face-to-face communication. But with the ubiquitous use of tablets and other devices today — among toddlers, as well as their caregivers — the psychologists wanted to know: Have younger children missed the opportunity to understand these cues?

The study tested the ability of more than 50 sixth graders in 2017, and more than 50 sixth graders in 2012 — both male and female, from the same Southern California public school — to correctly identify emotions in photographs and videos. Most children in the 2012 sixth-grade class were born in 2001, before the first iPhone (2007) and first iPad (2010) were released; those devices arrived when the sixth graders in the 2017 class were infants and toddlers.

The psychologists found that the 2017 sixth graders scored 40% higher than the 2012 class at correctly identifying emotions in photographs, making significantly fewer errors than the 2012 students. The 2017 students were also slightly better at identifying emotions in a series of videos, but that difference was not statistically significant. The psychologists did not examine face-to-face communication.

The study is published in the journal Cyberpsychology, Behavior, and Social Networking.

“At a time when so many people are communicating through screens, I hope our findings give parents some peace of mind that kids seem to be able to learn to read social cues in photos,” said lead author Yalda T. Uhls, a UCLA adjunct assistant professor of psychology and founder and executive director of the UCLA-based Center for Scholars and Storytellers.

In today’s world, young people use photos and, increasingly, video to communicate. In 2018, for instance, 69% of teens reported they used Snapchat and 72% used Instagram, both of which incorporate pictures and text messages, Uhls said.

A 2017 study with 500 participants found that nearly half of children between the ages of 6 and 12 regularly used a social media app or website, with 29% of those aged 6 to 8 reporting they used Snapchat. Another study, from 2016, found that 50% of children had a social media account before age 12, with 11% getting their account before age 10.

Uhls noted that even text-based communication can convey emotion through capitalization, emoticons and repetition.

“Perhaps our 2017 participants had more opportunities to see, communicate and learn nonverbal emotion expressed in photographs of faces than those from 2012 because of the time spent taking and reviewing photos of themselves and others,” she said.

Uhls strongly recommends that families have face-to-face conversations around the dinner table and at other times of the day. She also encourages parents to put their devices away when talking with other people, especially their children.

She noted that while the 2017 participants were better at reading emotional cues in photos, she is not certain whether this ability transfers to assessing emotions in person; she thinks it may.

“With so many of our kids on screens so frequently, it is important to know that good things can come from their interactions with photos,” Uhls said. “I would expect that, with the recent increase in video communication, they may now be learning these cues from video chat too.”

She said that by evaluating the nuances of screen time, researchers can learn which practices have educational value and which do not.

“Technology is always evolving, and I expect that researchers will seek to understand how increased exposure to pictures, videos, live chats, games, virtual reality and other emerging platforms for communication impact our youth,” Uhls said.

Citing other research, she said that even 18-month-old babies can learn from video chat, and that another study found time on screens did not seem to affect kids’ social skills.

###

Media Contact
Stuart Wolpert
[email protected]

Tags: Audiovisual Media, Behavior, Computer Science, Depression/Anger, Mass Media, Parenting/Child Care/Family, Personality/Attitude, Social/Behavioral Science

Bioengineer.org © Copyright 2023 All Rights Reserved.

No Result
View All Result
  • Homepages
    • Home Page 1
    • Home Page 2
  • News
  • National
  • Business
  • Health
  • Lifestyle
  • Science

Bioengineer.org © Copyright 2023 All Rights Reserved.

Welcome Back!

Login to your account below

Forgotten Password?

Retrieve your password

Please enter your username or email address to reset your password.

Log In