
Remote Detection of Dizziness and Balance Disorders: AI’s New Role in Health Monitoring

By Bioengineer
June 4, 2025 | Technology | Reading Time: 5 mins read

[Image: Eye-tracking Tool]

Artificial intelligence (AI) is making rapid inroads into medical diagnostics, particularly the interpretation of complex medical images, helping clinicians assess disease severity, formulate treatment plans, and monitor patient progress over time. Yet many existing models rely predominantly on static datasets, which limits their usefulness in real-time applications: because they are built on historical data, they cannot adapt quickly to new inputs, a critical requirement in urgent medical scenarios where timely decisions can save lives.

In a groundbreaking development, researchers from Florida Atlantic University (FAU) have introduced an innovative deep learning model designed to leverage real-time data for diagnosing nystagmus. Nystagmus is characterized by involuntary, oscillatory eye movements, often linked to serious vestibular or neurological conditions. This proof-of-concept project not only sets a new benchmark for the identification and management of nystagmus but also highlights the potential of AI in telehealth applications, particularly in rural and underserved communities.

Traditional diagnostic methods for nystagmus, such as videonystagmography (VNG) and electronystagmography, have long been considered the gold standard. However, these established diagnostic tools are not without their limitations. VNG equipment can cost upwards of $100,000, making it prohibitively expensive for many healthcare providers. Furthermore, the bulky and intricate setups can cause discomfort for patients, thus dissuading them from seeking timely care. Recognizing these hurdles, the FAU team has developed an AI-driven system offering a more accessible, patient-centric alternative that facilitates swift, reliable screenings for balance disorders and irregular eye movements using technology many people already possess: their smartphones.

The AI technology enables patients to record their eye movements with a smartphone camera, upload the video securely to a cloud-based platform, and then receive an analytical report from specialists in vestibular and balance disorders—all from the comfort of home. This innovation demonstrates a significant leap in reducing barriers to care, especially for patients in remote or rural areas where specialized medical resources are scarce.
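
The client-side workflow is simple to sketch. The article does not describe the platform's API, so the endpoint URL, authentication scheme, and field names below are hypothetical placeholders; the sketch is only meant to show the shape of the process (record locally, send over an encrypted connection, receive a report identifier for later review).

```python
# Illustrative sketch only: the endpoint, token, and field names are hypothetical,
# not part of the FAU system described in the article.
import requests

def upload_eye_movement_video(path: str, patient_id: str, token: str) -> dict:
    """Send a smartphone-recorded eye-movement video to a cloud analysis service."""
    with open(path, "rb") as video:
        response = requests.post(
            "https://example.org/api/v1/nystagmus-screenings",   # hypothetical endpoint
            headers={"Authorization": f"Bearer {token}"},         # HTTPS + token for secure transfer
            files={"video": ("recording.mp4", video, "video/mp4")},
            data={"patient_id": patient_id},
            timeout=60,
        )
    response.raise_for_status()
    return response.json()  # e.g. a report ID the clinician later reviews
```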

The deep learning framework underpinning this system employs real-time tracking of facial landmarks to analyze eye movements accurately. By leveraging advanced algorithms, the AI can automatically identify and assess 468 facial landmarks—crucial for measuring slow-phase velocity, duration, and direction of eye movements. This data can be visualized in user-friendly graphs and reports, simplifying the consultation process for audiologists and clinicians during virtual appointments. Such features improve the interpretability of the findings, thereby streamlining decision-making.
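
The article does not name the landmark model, but MediaPipe Face Mesh is a widely used option that produces exactly 468 facial landmarks (plus refined iris points when enabled), so a minimal sketch of the tracking step might look like the following. It extracts one iris landmark per frame and differentiates its horizontal position into a crude velocity trace; separating slow drifts from fast corrective beats, and converting pixels to degrees, would be further steps.

```python
# A minimal sketch, assuming MediaPipe Face Mesh for landmark tracking; this is
# not the FAU implementation, which is not published in the article.
import cv2
import mediapipe as mp
import numpy as np

IRIS_CENTER = 468  # first of the refined iris landmarks (468-477) added by refine_landmarks=True

def horizontal_eye_velocity(video_path: str) -> np.ndarray:
    """Return a frame-to-frame horizontal iris displacement trace (pixels/second)."""
    face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True, max_num_faces=1)
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    xs = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_face_landmarks:
            lm = result.multi_face_landmarks[0].landmark[IRIS_CENTER]
            xs.append(lm.x * frame.shape[1])  # normalized x -> pixel coordinate
        else:
            xs.append(np.nan)                 # no face found in this frame
    cap.release()
    face_mesh.close()
    return np.diff(np.asarray(xs)) * fps      # sign of the slope gives beat direction
```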

A pilot study involving 20 participants has shown promising results, indicating a strong correlation between the AI system’s assessments and those obtained through conventional medical devices. This early success underscores not only the model’s accuracy but also its potential clinical reliability, serving as a foundation for future research and refinement of the technology in more diverse patient populations.
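
The article does not say which agreement statistic was used in the pilot; one conventional way to quantify a "strong correlation" between the AI readings and a reference device is a Pearson correlation on paired measurements, sketched below with invented numbers.

```python
# Hypothetical paired slow-phase velocity estimates (deg/s); the values are
# made up for illustration and are not the pilot study's data.
from scipy.stats import pearsonr

ai_estimates  = [4.1, 2.8, 6.3, 3.5, 5.0]
vng_reference = [4.4, 2.6, 6.0, 3.9, 5.2]

r, p_value = pearsonr(ai_estimates, vng_reference)
print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
```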

“Our AI model offers a promising tool that can either supplement or, in some cases, replace conventional diagnostic methods, particularly beneficial in telehealth settings where patient access to specialized care may be limited,” says Ali Danesh, Ph.D., the principal investigator of this study. He emphasizes that integrating deep learning algorithms with cloud computing and telemedicine frameworks can lead to a medical environment that is more flexible, efficient, and accessible to those who need it most—specifically low-income or rural populations.

The research team meticulously trained their AI algorithm on over 15,000 video frames, employing a structured split of data to enhance its adaptability to diverse patient scenarios. This rigorous method not only helps fortify the model’s robustness but also ensures that its assessments remain reliable across different demographics. Moreover, the algorithm includes intelligent filtering capabilities that eliminate visual artifacts like eye blinks, enhancing measurement accuracy and consistency.
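
Neither the split proportions nor the blink criterion are given in the article, but the two ideas it describes, holding out unseen frames for evaluation and discarding blink frames before measurement, can be sketched roughly as follows.

```python
# A rough sketch under assumed proportions (70/15/15) and an assumed eye-openness
# threshold; the actual values used by the FAU team are not published here.
import numpy as np
from sklearn.model_selection import train_test_split

def split_frames(frames: np.ndarray, labels: np.ndarray, seed: int = 0):
    """Split labeled video frames into train/validation/test sets."""
    x_train, x_rest, y_train, y_rest = train_test_split(
        frames, labels, test_size=0.30, random_state=seed, stratify=labels)
    x_val, x_test, y_val, y_test = train_test_split(
        x_rest, y_rest, test_size=0.50, random_state=seed, stratify=y_rest)
    return (x_train, y_train), (x_val, y_val), (x_test, y_test)

def keep_non_blink_frames(eye_openness: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Boolean mask that drops frames whose eye-openness ratio indicates a blink."""
    return eye_openness >= threshold
```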

Beyond its diagnostic capabilities, the AI system is designed to enhance clinical workflow efficiency. Physicians and audiologists can access the AI-generated reports via telemedicine platforms, directly comparing these insights with electronic health records to create tailored treatment plans. Patients benefit by significantly reducing travel expenses and time, gaining the ability to conduct follow-up assessments by merely uploading new videos from their homes. This continuous, real-time monitoring enables clinicians to observe progression in disorders over time and to adjust care strategies accordingly.

In parallel to this groundbreaking work, FAU researchers are also investigating the potential of a wearable headset endowed with deep learning functions to detect nystagmus in real-time environments. Initial tests in controlled settings have yielded optimistic results, indicating the feasibility of this approach. However, the research team acknowledges that challenges remain, particularly concerning sensor noise and variability among users.

As Harshal Sanghvi, Ph.D., the first author and postdoctoral fellow at FAU, states, “While still in its nascent stages, our technology holds transformative potential for patients suffering from vestibular and neurological disorders. By offering non-invasive, real-time analytical capabilities, our platform could see application in various settings ranging from clinics, emergency rooms, and audiology centers to home care environments.”

This interdisciplinary initiative at FAU spans collaborations among several of the university's colleges, reflecting a collective effort to refine the AI model, expand testing across broader patient demographics, and work toward FDA approval for wider clinical adoption.

The inclusion of AI-driven diagnostic innovations in healthcare emphasizes a shift toward patient-centered care. This is increasingly relevant as telemedicine becomes an integral component of healthcare delivery. Such advancements not only facilitate early detection of conditions but also streamline specialist referral processes, thus alleviating the burden on healthcare providers. Ultimately, these technologies promise improved outcomes for patients, irrespective of their geographical location.

As AI continues to evolve, its adoption in medicine suggests that a future of universally accessible, accurate, and timely diagnostics may not be far off. Healthcare organizations are increasingly compelled to integrate these technologies, which promise to reshape conventional practice. AI's journey into healthcare represents not just technological advancement but a redefined landscape that prioritizes patient welfare and accessibility.

As AI takes hold in medicine, institutions like FAU are pioneering innovations that could meaningfully improve the quality of care for vestibular and neurological disorders, marking a milestone in AI's capacity to influence lives for the better.

Subject of Research: People
Article Title: Artificial Intelligence-Driven Telehealth Framework for Detecting Nystagmus
News Publication Date: 13-May-2025
Web References: www.fau.edu
References: Cureus, DOI: 10.7759/cureus.84036
Image Credits: Florida Atlantic University

Keywords

Health and medicine
Neurological disorders
Telehealth
Deep learning
Nystagmus
AI in healthcare
Cost-effective diagnostics
Patient-centered care
Real-time monitoring
Cloud computing
Telemedicine
Interdisciplinary research

Tags: advancements in telemedicine technology, AI in healthcare monitoring, artificial intelligence in medical imaging, balance disorders diagnostics, challenges in medical AI models, deep learning for health diagnostics, healthcare accessibility through AI solutions, innovative approaches to nystagmus management, nystagmus identification techniques, real-time data analysis in medicine, remote detection of dizziness, telehealth applications for rural communities
