
Driving in the snow is a team effort for AI sensors

Bioengineer by Bioengineer
May 27, 2021
in Science News

Image credit: Sarah Atkinson/Michigan Tech

Nobody likes driving in a blizzard, including autonomous vehicles. To make self-driving cars safer on snowy roads, engineers look at the problem from the car’s point of view.

A major challenge for fully autonomous vehicles is navigating bad weather. Snow especially confounds crucial sensor data that helps a vehicle gauge depth, find obstacles and keep on the correct side of the yellow line, assuming it is visible. Averaging more than 200 inches of snow every winter, Michigan’s Keweenaw Peninsula is the perfect place to push autonomous vehicle tech to its limits. In two papers presented at SPIE Defense + Commercial Sensing 2021, researchers from Michigan Technological University discuss solutions for snowy driving scenarios that could help bring self-driving options to snowy cities like Chicago, Detroit, Minneapolis and Toronto.

Just like the weather at times, autonomy is not a sunny or snowy yes-no designation. Autonomous vehicles cover a spectrum of levels, from cars already on the market with blind spot warnings or braking assistance, to vehicles that can switch in and out of self-driving modes, to others that can navigate entirely on their own. Major automakers and research universities are still tweaking self-driving technology and algorithms. Occasionally accidents occur, either due to a misjudgment by the car’s artificial intelligence (AI) or a human driver’s misuse of self-driving features.

Humans have sensors, too: our scanning eyes, our sense of balance and movement, and the processing power of our brain help us understand our environment. These seemingly basic inputs allow us to drive in virtually every scenario, even if it is new to us, because human brains are good at generalizing novel experiences. In autonomous vehicles, two cameras mounted on gimbals scan and perceive depth using stereo vision to mimic human vision, while balance and motion can be gauged using an inertial measurement unit. But, computers can only react to scenarios they have encountered before or been programmed to recognize.
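The stereo-vision idea mentioned above can be sketched in a few lines. This is a toy illustration of the standard pinhole-camera relationship (depth = focal length × baseline / disparity), not code from the Michigan Tech project; the focal length and baseline values are made up for the example.

```python
def stereo_depth(disparity_px: float, focal_px: float = 700.0,
                 baseline_m: float = 0.12) -> float:
    """Depth in meters for one matched pixel pair from a calibrated
    stereo camera rig (pinhole model): depth = f * B / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A feature whose images are 10 px apart is farther away than one 40 px apart.
far = stereo_depth(10.0)   # 8.4 m
near = stereo_depth(40.0)  # 2.1 m
```

Small disparities mean distant objects, which is also why stereo depth estimates degrade with range: a one-pixel matching error matters far more at 10 px of disparity than at 40 px.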

Since artificial brains aren’t around yet, task-specific artificial intelligence (AI) algorithms must take the wheel — which means autonomous vehicles must rely on multiple sensors. Fisheye cameras widen the view while other cameras act much like the human eye. Infrared picks up heat signatures. Radar can see through the fog and rain. Light detection and ranging (lidar) pierces through the dark and weaves a neon tapestry of laser beam threads.

“Every sensor has limitations, and every sensor covers another one’s back,” said Nathir Rawashdeh, assistant professor of computing in Michigan Tech’s College of Computing and one of the study’s lead researchers. He works on bringing the sensors’ data together through an AI process called sensor fusion.

“Sensor fusion uses multiple sensors of different modalities to understand a scene,” he said. “You cannot exhaustively program for every detail when the inputs have difficult patterns. That’s why we need AI.”

Rawashdeh’s Michigan Tech collaborators include Nader Abu-Alrub, his doctoral student in electrical and computer engineering, and Jeremy Bos, assistant professor of electrical and computer engineering, along with master’s degree students and graduates from Bos’ lab: Akhil Kurup, Derek Chopp and Zach Jeffries. Bos explains that lidar, infrared and other sensors on their own are like the hammer in an old adage. “‘To a hammer, everything looks like a nail,'” quoted Bos. “Well, if you have a screwdriver and a rivet gun, then you have more options.”

Most autonomous sensors and self-driving algorithms are being developed in sunny, clear landscapes. Knowing that the rest of the world is not like Arizona or southern California, Bos’s lab began collecting local data in a Michigan Tech autonomous vehicle (safely driven by a human) during heavy snowfall. Rawashdeh’s team, notably Abu-Alrub, pored over more than 1,000 frames of lidar, radar and image data from snowy roads in Germany and Norway to start teaching their AI program what snow looks like and how to see past it.

“All snow is not created equal,” Bos said, pointing out that the variety of snow makes sensor detection a challenge. Rawashdeh added that pre-processing the data and ensuring accurate labeling is an important step to ensure accuracy and safety: “AI is like a chef — if you have good ingredients, there will be an excellent meal,” he said. “Give the AI learning network dirty sensor data and you’ll get a bad result.”
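One concrete way to "clean the ingredients" in a snowy lidar scan is to drop isolated returns, since airborne snowflakes tend to produce lone points while real obstacles produce dense clusters. The brute-force filter below is a hypothetical illustration of that idea (the radius and neighbor-count thresholds are invented), not the preprocessing the researchers used.

```python
def filter_sparse_points(points, radius=0.3, min_neighbors=2):
    """Keep only 3D points with enough neighbors within `radius` meters.
    Isolated returns (likely snowflakes or noise) are discarded."""
    def close(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) <= radius ** 2
    kept = []
    for i, p in enumerate(points):
        n = sum(1 for j, q in enumerate(points) if i != j and close(p, q))
        if n >= min_neighbors:
            kept.append(p)
    return kept

# A tight cluster (a real obstacle) survives; the lone far-away point does not.
cloud = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (5, 5, 1)]
clean = filter_sparse_points(cloud)  # (5, 5, 1) is removed as an outlier
```

Production point-cloud libraries provide tuned versions of this kind of radius-outlier removal; the brute-force loop here is only meant to make the labeling-and-cleaning point tangible.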

Low-quality data is one problem; actual dirt is another. Much like road grime, snow buildup on the sensors is a solvable but bothersome issue. And even once the view is clear, an autonomous vehicle’s sensors do not always agree about obstacles. Bos recalled a telling example from cleaning up locally gathered data: the sensors spotted a deer but could not agree on what they saw. Lidar said the blob was nothing (a 30% chance of an obstacle), the camera saw it like a sleepy human at the wheel (a 50% chance), and the infrared sensor shouted WHOA (90% sure that is a deer).

Getting the sensors and their risk assessments to talk and learn from each other is like the Indian parable of three blind men who find an elephant: each touches a different part of the elephant — the creature’s ear, trunk and leg — and comes to a different conclusion about what kind of animal it is. Using sensor fusion, Rawashdeh and Bos want autonomous sensors to collectively figure out the answer — be it elephant, deer or snowbank. As Bos puts it, “Rather than strictly voting, by using sensor fusion we will come up with a new estimate.”
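Bos's "new estimate rather than strictly voting" remark can be made concrete with a classic late-fusion trick: combine independent detector confidences in log-odds space. This is an illustrative sketch of one common fusion scheme, not the researchers' actual algorithm; the three probabilities are the ones from the deer anecdote above.

```python
import math

def fuse_log_odds(probs):
    """Fuse independent detection probabilities into a single estimate
    by summing their log-odds, then mapping back through a sigmoid."""
    logit = sum(math.log(p / (1.0 - p)) for p in probs)
    return 1.0 / (1.0 + math.exp(-logit))

# Lidar 30%, camera 50%, infrared 90% -> fused estimate of about 0.79:
# no single sensor dictates the answer, but together they lean "obstacle".
fused = fuse_log_odds([0.30, 0.50, 0.90])
```

Note the fused value lands below the infrared sensor's 90% because the skeptical lidar pulls it down; a 50% sensor contributes nothing either way, which matches the intuition that an uncertain witness should not sway the verdict.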

While navigating a Keweenaw blizzard is a ways out for autonomous vehicles, their sensors can get better at learning about bad weather and, with advances like sensor fusion, will be able to drive safely on snowy roads one day.

###

Media Contact
Stefanie Sidortsova
[email protected]

Original Source

https://www.mtu.edu/news/stories/2021/may/driving-in-the-snow-is-a-team-effort-for-ai-sensors.html

Tags: Computer Science, Electrical Engineering/Electronics, Multimedia/Networking/Interface Design, Software Engineering, Technology/Engineering/Computer Science, Theory/Design, Vehicles

Bioengineer.org © Copyright 2023 All Rights Reserved.
