
Simulated human eye movement aims to train metaverse platforms

By Bioengineer | March 7, 2022 | Science News


[Image: video game eyes animation. Credit: Maria Gorlatova, Duke University]

DURHAM, N.C. – Computer engineers at Duke University have developed virtual eyes that simulate how humans look at the world accurately enough for companies to train virtual reality and augmented reality programs. Called EyeSyn for short, the program will help developers create applications for the rapidly expanding metaverse while protecting user data.

The results have been accepted and will be presented at the International Conference on Information Processing in Sensor Networks (IPSN), May 4-6, 2022, a leading annual forum on research in networked sensing and control.

“If you’re interested in detecting whether a person is reading a comic book or advanced literature by looking at their eyes alone, you can do that,” said Maria Gorlatova, the Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke.

“But training that kind of algorithm requires data from hundreds of people wearing headsets for hours at a time,” Gorlatova added. “We wanted to develop software that not only reduces the privacy concerns that come with gathering this sort of data, but also allows smaller companies who don’t have those levels of resources to get into the metaverse game.”

The poetic insight describing eyes as the windows to the soul has been repeated since at least Biblical times for good reason: the tiny movements of our eyes and the dilation of our pupils provide a surprising amount of information. Human eyes can reveal whether we're bored or excited, where our concentration is focused, whether we're an expert or a novice at a given task, or even whether we're fluent in a specific language.

“Where you’re prioritizing your vision says a lot about you as a person, too,” Gorlatova said. “It can inadvertently reveal sexual and racial biases, interests that we don’t want others to know about, and information that we may not even know about ourselves.”

Eye movement data is invaluable to companies building platforms and software in the metaverse. For example, reading a user’s eyes allows developers to tailor content to engagement responses or reduce resolution in their peripheral vision to save computational power.
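To make the peripheral-resolution idea concrete, here is a minimal sketch (in Python with NumPy) that keeps full detail near a gaze point and blends toward a coarser image farther away. The foveate function, the 4x downsampling factor, and the linear falloff are illustrative assumptions, not part of EyeSyn or any particular rendering engine.

import numpy as np

def foveate(img, gaze_rc, radius):
    """Blend a full-resolution frame toward a 4x-coarser version as
    distance from the gaze point (row, col) grows past radius pixels."""
    h, w = img.shape[:2]
    # Coarse version: subsample every 4th pixel, then expand back to full size.
    low = img[::4, ::4].repeat(4, axis=0).repeat(4, axis=1)[:h, :w]
    rows, cols = np.ogrid[:h, :w]
    dist = np.hypot(rows - gaze_rc[0], cols - gaze_rc[1])
    alpha = np.clip(dist / radius, 0.0, 1.0)[..., None]  # 0 at the fovea, 1 far away
    return (1 - alpha) * img + alpha * low

frame = np.random.default_rng(2).random((240, 320, 3))  # stand-in video frame
out = foveate(frame, gaze_rc=(120, 160), radius=80.0)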

Given this wide range of complexity, creating virtual eyes that mimic how an average human responds to a wide variety of stimuli sounds like a tall task. To climb the mountain, Gorlatova and her team — including former postdoctoral associate Guohao Lan, who is now an assistant professor at the Delft University of Technology in the Netherlands, and current PhD student Tim Scargill — dove into the cognitive science literature that explores how humans see the world and process visual information.

For example, when a person is watching someone talk, their eyes alternate between the person’s eyes, nose and mouth for various amounts of time. When developing EyeSyn, the researchers created a model that extracts where those features are on a speaker and programmed their virtual eyes to statistically emulate the time spent focusing on each region.
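In miniature, that statistical emulation might look like the following sketch: pick which facial region the virtual eyes fixate next, then draw how long they dwell there from a per-region distribution. The region weights and dwell-time means below are invented for illustration; EyeSyn's actual parameters come from the cognitive science findings the team drew on.

import random

REGIONS = ["eyes", "nose", "mouth"]
REGION_WEIGHT = [0.55, 0.15, 0.30]                          # assumed choice mix
DWELL_MEAN_S = {"eyes": 0.40, "nose": 0.20, "mouth": 0.30}  # assumed mean dwell (s)

def synth_fixations(total_seconds, rng):
    """Yield (region, duration_s) fixations until total_seconds is filled."""
    t = 0.0
    while t < total_seconds:
        region = rng.choices(REGIONS, weights=REGION_WEIGHT)[0]
        # Exponential dwell times are a simple, common modeling choice here.
        dwell = min(rng.expovariate(1.0 / DWELL_MEAN_S[region]), total_seconds - t)
        yield region, dwell
        t += dwell

fixations = list(synth_fixations(10.0, random.Random(0)))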

“If you give EyeSyn a lot of different inputs and run it enough times, you’ll create a data set of synthetic eye movements that is large enough to train a (machine learning) classifier for a new program,” Gorlatova said.
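A hedged sketch of that workflow: generate labeled synthetic gaze features for two hypothetical activities, then fit an off-the-shelf classifier. The two activities, their feature statistics, and the scikit-learn model are all stand-ins chosen for illustration, not EyeSyn's actual pipeline.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synth_features(activity, n):
    """Per-sample features: mean fixation duration (s), saccades per second."""
    if activity == "conversation":   # assumed: longer fixations, fewer saccades
        return rng.normal([0.40, 2.0], [0.08, 0.4], size=(n, 2))
    return rng.normal([0.25, 3.5], [0.05, 0.5], size=(n, 2))  # assumed "reading"

X = np.vstack([synth_features("conversation", 500), synth_features("reading", 500)])
y = np.array([0] * 500 + [1] * 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy on synthetic data: {clf.score(X_te, y_te):.2f}")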

To test the accuracy of their synthetic eyes, the researchers turned to publicly available data. They first had the eyes “watch” videos of Dr. Anthony Fauci addressing the media during press conferences and compared the resulting gaze data to the eye movements of actual viewers. They also compared a virtual dataset of their synthetic eyes looking at art with actual datasets collected from people browsing a virtual art museum. The results showed that EyeSyn closely matched the distinct patterns of actual gaze signals and simulated the different ways different people's eyes react.
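One simple way to quantify how “closely” two gaze signals match, sketched below: bin gaze points into normalized spatial histograms and score their overlap. The histogram-intersection metric and the Gaussian stand-in data are illustrative assumptions; the paper's evaluation uses the real recorded datasets described above.

import numpy as np

def gaze_histogram(xy, bins=16):
    """Normalized 2-D histogram of gaze points in the unit square."""
    h, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=bins, range=[[0, 1], [0, 1]])
    return h / h.sum()

def histogram_intersection(p, q):
    """1.0 means identical distributions, 0.0 means fully disjoint."""
    return float(np.minimum(p, q).sum())

rng = np.random.default_rng(1)
real = rng.normal(0.5, 0.10, size=(2000, 2)).clip(0, 1)       # stand-in real gaze
synthetic = rng.normal(0.5, 0.12, size=(2000, 2)).clip(0, 1)  # stand-in synthetic
score = histogram_intersection(gaze_histogram(real), gaze_histogram(synthetic))
print(f"gaze-distribution similarity: {score:.2f}")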

According to Gorlatova, this level of performance is good enough for companies to use it as a baseline to train new metaverse platforms and software. With a basic level of competency, commercial software can then achieve even better results by personalizing its algorithms after interacting with specific users.

“The synthetic data alone isn’t perfect, but it’s a good starting point,” Gorlatova said. “Smaller companies can use it rather than spending the time and money of trying to build their own real-world datasets (with human subjects). And because the personalization of the algorithms can be done on local systems, people don’t have to worry about their private eye movement data becoming part of a large database.”

This research was funded by the National Science Foundation (CSR-1903136, CNS-1908051, IIS-2046072) and an IBM Faculty Award.

CITATION: “EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition,” Guohao Lan, Timothy Scargill, Maria Gorlatova. ACM/IEEE IPSN 2022.

# # #



Method of Research: Experimental study

Subject of Research: Not applicable

Article Title: EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition

Article Publication Date: 4-Mar-2022
