School-age children could one day take a class on the moon — with the help of virtual reality, that is. With a $1 million National Science Foundation grant, a multi-university team of researchers will work to expand the possibilities of VR-based education over wireless networks.
Led by Bin Li, Penn State associate professor of electrical engineering, the researchers will develop and implement a personalized, collaborative VR platform for use in the classroom. Students will wear a VR headset, an Oculus Quest, and use handheld controllers while learning in the classroom. Though this particular grant is geared toward immersive learning for students sharing the same classroom, in future studies Li hopes to extend the technology to remote learning, where students join a lesson from multiple locations.
“The students and teacher will all be able to see the same view, using the VR equipment in the classroom,” Li said. “It is similar to what we have today with screen sharing on Zoom, but it will be three-dimensional and 360 degrees.”
Though VR technology is already available for personal use, it is not currently possible for a group of people in the same location to share a single VR experience. To address this problem, the researchers will create a wireless networking and computing platform that expands on existing technology, with a focus on wireless networking algorithms that enable an immersive experience for multiple users at once.
“With the VR technology, students can visually see abstract concepts, like cell structure in a biology class, and interact with corresponding 3D models,” Li said. “They also can conduct relatively risky scientific experiments in the 3D environment, like exploding bubbles of hydrogen and oxygen in a chemistry class.”
Led by co-principal investigator Atilla Eryilmaz, professor of electrical and computer engineering, researchers from Ohio State University will work on simultaneously supporting multiple users in one central classroom. Researchers from the University of Minnesota Twin Cities will handle system implementation, led by co-PI Feng Qian, associate professor of computer science and engineering. Finally, researchers at the University of Illinois Urbana-Champaign, led by co-PI Rayadurgam Srikant, Fredric G. and Elizabeth H. Nearing Endowed Professor of Electrical and Computer Engineering, will study human movements, such as head motion, to further enhance the learning experience.
The new VR algorithms will seek to reduce the latency, or delays, common in two-dimensional learning tools like Zoom, according to Li. They also will focus on delivering consistent transmission and resolution. Video conferencing uses 25 frames per second, while VR requires at least 60. VR also demands four to six times the bandwidth of video conferencing, another challenge Li and his team will address.
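For a rough sense of scale, here is a back-of-the-envelope sketch in Python using only the figures quoted above (25 versus at least 60 frames per second, and a four-to-six-times bandwidth increase). The baseline video-conferencing bitrate is an illustrative assumption, not a figure from the researchers.

```python
# Back-of-the-envelope comparison of video conferencing vs. VR streaming,
# based on the ratios cited in the article. The baseline bitrate is an
# assumed, hypothetical value for illustration only.

VIDEO_CONF_FPS = 25              # frame rate cited for video conferencing
VR_MIN_FPS = 60                  # minimum frame rate cited for VR
BANDWIDTH_MULTIPLIER = (4, 6)    # VR needs 4-6x the bandwidth, per the article

ASSUMED_VIDEO_CONF_MBPS = 3.0    # hypothetical baseline bitrate for a video call

frame_rate_ratio = VR_MIN_FPS / VIDEO_CONF_FPS
low, high = (ASSUMED_VIDEO_CONF_MBPS * m for m in BANDWIDTH_MULTIPLIER)

print(f"VR renders at least {frame_rate_ratio:.1f}x as many frames per second")
print(f"Estimated VR bitrate: {low:.0f}-{high:.0f} Mbps "
      f"(vs. an assumed {ASSUMED_VIDEO_CONF_MBPS:.0f} Mbps for video conferencing)")
```

Under these assumed numbers, a single VR stream would need roughly 12 to 18 Mbps, which illustrates why serving a full classroom of simultaneous users requires new wireless networking algorithms rather than off-the-shelf video-conferencing infrastructure.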
“There are so many possibilities for this new technology,” Li said. “Students could see a three-dimensional view of the human body when learning anatomy, travel underwater when learning about marine animals or visit art museums or monuments when learning about art or history.”
Li said the technology could be especially valuable for rural and under-resourced school districts that are far from cities or lack the funds to take students on field trips.
“I am excited to help develop this technology in the classroom and promote experiential learning, particularly in situations where traveling for field trips is not possible or is dangerous,” he said.