In a new proof-of-concept study led by Dr. Mark Walker at the University of Ottawa’s Faculty of Medicine, researchers are pioneering the use of a unique Artificial Intelligence-based deep learning model as an assistive tool for the rapid and accurate reading of ultrasound images.
The goal of the team’s study was to demonstrate the potential for a deep-learning architecture to support early and reliable identification of cystic hygroma from first-trimester ultrasound scans. Cystic hygroma is a congenital condition in which the lymphatic vascular system develops abnormally. It is a rare and potentially life-threatening disorder that leads to fluid-filled swelling around the head and neck.
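The published article describes the model architecture and training in detail; as a rough illustration only, the sketch below shows one common way a binary ultrasound-image classifier of this kind could be set up with a pretrained convolutional network. PyTorch, torchvision, and the DenseNet backbone are assumptions for the sake of the example, not a statement of the authors' implementation.

```python
# Illustrative sketch only: a generic binary image classifier of the kind
# described in the article (normal vs. cystic hygroma ultrasound frames).
# The framework and backbone here are assumptions, not the study's published setup.
import torch
import torch.nn as nn
from torchvision import models, transforms

# Preprocessing typical for CNNs pretrained on ImageNet; grayscale ultrasound
# frames are replicated to three channels.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def build_classifier(num_classes: int = 2) -> nn.Module:
    """Pretrained backbone with a new two-class head (normal / cystic hygroma)."""
    # Downloads ImageNet weights on first use; pass weights=None to skip.
    model = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
    model.classifier = nn.Linear(model.classifier.in_features, num_classes)
    return model

if __name__ == "__main__":
    model = build_classifier()
    model.eval()
    dummy = torch.randn(1, 3, 224, 224)   # stands in for a preprocessed scan
    with torch.no_grad():
        logits = model(dummy)
    print(torch.softmax(logits, dim=1))   # per-class probabilities
```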
The birth defect can usually be diagnosed prenatally during a routine ultrasound appointment, but Dr. Walker – co-founder of the OMNI Research Group (Obstetrics, Maternal and Newborn Investigations) at The Ottawa Hospital – and his research group wanted to test how well AI-driven pattern recognition could do the job.
“What we demonstrated was, in the field of ultrasound, we’re able to use the same tools for image classification and identification with a high sensitivity and specificity,” says Dr. Walker, who believes their approach might be applied to other fetal anomalies generally identified by ultrasonography.
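Sensitivity and specificity are the standard ways of summarizing such a classifier's performance: sensitivity is the fraction of true cystic hygroma cases the model flags, and specificity is the fraction of normal scans it correctly clears. The short Python sketch below shows how the two figures are computed; the labels and predictions are made-up toy data, not results from the study.

```python
# Minimal sketch of the metrics mentioned in the quote: sensitivity (true
# positive rate) and specificity (true negative rate) for a binary classifier.
# The arrays below are toy data for illustration only.
import numpy as np

def sensitivity_specificity(y_true: np.ndarray, y_pred: np.ndarray):
    """Compute sensitivity and specificity; 1 = cystic hygroma, 0 = normal."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    sensitivity = tp / (tp + fn)  # proportion of affected cases correctly flagged
    specificity = tn / (tn + fp)  # proportion of normal cases correctly cleared
    return sensitivity, specificity

if __name__ == "__main__":
    y_true = np.array([1, 1, 1, 0, 0, 0, 0, 1])
    y_pred = np.array([1, 1, 0, 0, 0, 1, 0, 1])
    sens, spec = sensitivity_specificity(y_true, y_pred)
    print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```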
The findings were recently published in PLOS ONE, a peer-reviewed open-access journal.
Journal: PLoS ONE
DOI: 10.1371/journal.pone.0269323
Method of Research: Imaging analysis
Subject of Research: People
Article Title: Using deep-learning in fetal ultrasound analysis for diagnosis of cystic hygroma in the first trimester
Article Publication Date: 22-Jun-2022