A University of Texas assistant professor is using virtual reality technology to develop a bilingual (English/Spanish) immersive learning environment for students to better understand the field of additive manufacturing.
Shuchisnigdha Deb, assistant professor in the Department of Industrial, Manufacturing, and Systems Engineering (IMSE), will lead the project, “Enhancing Active Learning in Additive Manufacturing Using a Bilingual, Assisted Virtual-Reality Platform.” The National Science Foundation is funding the research with an $837,000 grant. The team includes fellow IMSE faculty member Emma Yang and Amanda Olsen from the Department of Curriculum and Instruction in the College of Education.
Deb said training students in additive manufacturing technology is important to the development of augmented reality, robotics and other technologies.
“However, this manufacturing area comes with risks and safety concerns,” Deb said. “Workers who directly or indirectly work with these technologies need to be trained with hands-on experience around expensive and sophisticated machines.”
Virtual reality can address most of these problems, Deb said. In a realistic and immersive platform, students can gain confidence as they repeat a task many times without causing any damage to costly machinery.
“Virtual reality can be incorporated as a part of the curriculum, as an instructional delivery system, an instrument to enhance the learning process and a tool for evaluation,” Deb said. “All students, including students with disabilities, can be given access to the cutting-edge learning modules within virtual environments.”
After the learning platform and course modules are developed, a pilot study will collect real-time data on students’ interactions with them, including facial expressions and eye movements. Using computer vision and deep learning, the team will develop a model to design real-time assistive functionalities within the virtual platform.
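The article does not describe the model itself, but the general idea of turning tracked behavior into real-time help can be illustrated with a minimal, hypothetical sketch: summarized gaze and facial features feed a simple classifier whose prediction decides whether the platform should offer assistance. All feature names, data, and thresholds below are placeholder assumptions, not the project’s actual design.

```python
# Hypothetical sketch: decide when to trigger in-VR assistance from
# summarized gaze/facial features. Not the project's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Assumed per-interval features: [fixation_duration_s, gaze_dispersion,
# blink_rate_hz, brow_furrow_score]. Label 1 = learner appeared to struggle.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0.8).astype(int)

# Train a simple classifier on the synthetic data.
model = LogisticRegression().fit(X, y)

def should_assist(features, threshold=0.7):
    """Return True if the predicted probability of struggle exceeds the threshold."""
    prob = model.predict_proba(np.asarray(features).reshape(1, -1))[0, 1]
    return prob >= threshold

# Example: one new window of summarized tracking data.
print(should_assist([1.2, 0.3, 0.4, 0.9]))
```

In practice, a system like the one described would replace the synthetic features with computer-vision outputs and likely use a deep learning model rather than logistic regression; the sketch only shows the trigger logic.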
Common applications of virtual reality in education and human-technology interaction can be found in aviation, transportation, construction, manufacturing and health care. As a human factors researcher, Deb wants to use virtual reality to study human behavior and performance under high-risk scenarios and improve the learning experience.
Paul Componation, professor and IMSE Department chair, said Deb’s project is integral to better understanding students’ interactions with sophisticated technologies like additive manufacturing. He noted that learning environments must be built to teach students how to understand and thrive in industries that use these emerging technologies.