Your credit card company contacts you asking if you've purchased something from a retailer you don't normally patronize or spent more than usual. A human didn't identify the atypical transaction. A computer — equipped with advanced algorithms — tagged the potentially fraudulent purchase and triggered the inquiry.
Researchers at NASA's Goddard Space Flight Center in Greenbelt, Maryland, think scientists and engineers could benefit from the same technology, often referred to as machine learning or neural networks.
Considered a subset of artificial intelligence, machine learning and neural networks actually sit at the field's leading edge. Instead of programming a computer to carry out every task it needs to do, the philosophy behind machine learning is to equip ground- or space-based computer processors with algorithms that, like humans, learn from data, finding and recognizing patterns and trends, but faster, more accurately, and more consistently.
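To make that philosophy concrete, here is a minimal, hypothetical sketch in Python, written for this article rather than drawn from any NASA system: instead of hand-coding a rule, a model is fitted to labeled examples and then judged on data it has never seen. The data, features, and library choice (scikit-learn) are all illustrative assumptions.

```python
# Minimal sketch of the "learn from data" idea using scikit-learn.
# The data here are synthetic stand-ins, not NASA measurements.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical feature vectors (e.g., pixel statistics) and labels (0 or 1).
X = rng.normal(size=(1000, 8))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Instead of hand-coding a rule, let the model infer one from examples.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```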
Wide-Ranging Applications
"The benefits are many and the applications are wide ranging," said Goddard Senior Fellow and Assistant Chief for Technology Jacqueline Le Moigne, who has been working in artificial intelligence since her graduate school days in France several years ago.
"Scientists could use machine learning to analyze the petabytes of data NASA has already collected over the years, extracting new patterns and new correlations and eventually leading to new science discoveries," she said. "It could also help us monitor the health of a spacecraft, avoid and recover from catastrophic failures, and prevent collisions. It could even assist engineers, providing a wide range of knowledge about past missions — information they would need in designing new missions."
With funding from several NASA research programs, including the Earth Science Technology Office, or ESTO, Goddard engineers and scientists are researching some of those applications individually or in partnership with academia and private industry. Their projects run the gamut, from using machine learning to make real-time crop forecasts or locate wildfires and floods, to identifying instrument anomalies and even suitable landing sites for a robotic craft.
"People hear artificial intelligence and their minds instantly go to science fiction with machines taking over, but really it's just another tool in our data-analysis toolbox and definitely one we shouldn't neglect because of preconceived notions," said James MacKinnon, a Goddard computer engineer who is involved in several projects involving artificial intelligence.
Finding Fires
Since joining Goddard a couple of years ago, MacKinnon has emerged as one of the technology's most fervent champions. One of the first projects he tackled involved teaching algorithms to identify wildfires using remote-sensing images collected by the Terra spacecraft's Moderate Resolution Imaging Spectroradiometer instrument. His neural network accurately detected fires 99 percent of the time. He has since expanded the research to include data gathered by the Joint Polar Satellite System's Visible Infrared Imaging Radiometer Suite.
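The article does not describe MacKinnon's actual network, so the following is only a hedged sketch of the general approach: a small convolutional classifier that labels multispectral image patches as "fire" or "no fire." The band count, patch size, and PyTorch implementation are assumptions made for illustration.

```python
# Hedged sketch of a patch classifier for "fire" vs. "no fire", assuming
# pre-labeled multispectral image patches; not MacKinnon's actual network.
import torch
import torch.nn as nn

class FirePatchNet(nn.Module):
    def __init__(self, in_bands: int = 7):  # hypothetical number of spectral bands
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # logits for [no fire, fire]

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Example forward pass on a batch of 16 hypothetical 32x32-pixel patches.
model = FirePatchNet()
logits = model(torch.randn(16, 7, 32, 32))
print(logits.shape)  # torch.Size([16, 2])
```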
His dream is to ultimately deploy a constellation of CubeSats, all equipped with machine-learning algorithms embedded within sensors. With such a capability, the sensors could identify wildfires and send the data back to Earth in real time, providing firefighters and others with up-to-date information that could dramatically improve firefighting efforts. "The key here is processing the data onboard, not only for wildfires but for floods. There are a lot of things you could do with this capability," he said.
He is also developing machine-learning techniques to identify single-event upsets in spaceborne electronic devices, which can result in data anomalies, and compiling a library of machine-learning computer models, dataset-generation tools, and visualization aids to make it easier for others to use machine-learning techniques for their missions, he said.
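The specific technique behind the single-event-upset work isn't detailed here; one common, generic way to flag data anomalies is unsupervised outlier detection, sketched below on synthetic telemetry with scikit-learn's IsolationForest. The data, thresholds, and method are illustrative assumptions, not the project's actual approach.

```python
# Hedged sketch: flag anomalous telemetry samples with an unsupervised
# outlier detector. Synthetic data; not the actual SEU-detection method.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
nominal = rng.normal(0.0, 1.0, size=(2000, 4))   # nominal sensor readings
upsets = rng.normal(6.0, 1.0, size=(10, 4))      # hypothetical corrupted samples
telemetry = np.vstack([nominal, upsets])

detector = IsolationForest(contamination=0.01, random_state=0)
labels = detector.fit_predict(telemetry)          # -1 = outlier, 1 = inlier
print("flagged samples:", np.flatnonzero(labels == -1))
```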
"A huge chunk of my time has been spent convincing scientists that these are valid methods for analyzing the massive amounts of data we generate," he said.
Cutting Through the Noise
Goddard scientist Matt McGill doesn't need convincing. An expert in lidar techniques to measure clouds and the tiny particles that make up haze, dust, air pollutants and smoke, McGill is partnering with Slingshot Aerospace. This California-based company is developing platforms that pull data from many types of sensors and use machine-learning algorithms to extract information.
Under the ESTO-funded effort, McGill is providing Slingshot with data he gathered with the Cloud-Aerosol Transport System, or CATS, instrument, which retired late last year after spending 33 months aboard the International Space Station. There, CATS measured the vertical structure of clouds and aerosols, which occur naturally during volcanic eruptions and dust storms or anthropogenically through the burning of oil, coal, and wood. A Slingshot-developed machine-learning algorithm is ingesting that data so that it can learn and ultimately begin to recognize patterns, trends, and occurrences that are difficult to capture with standardized processing algorithms.
McGill is particularly interested in seeing whether machine-learning techniques can filter out the noise that is common in lidar measurements. Although humans already cull noise from data, current techniques are time-consuming and can take days to complete — antithetical to the goal of distributing intelligence in real time. "The idea is that algorithms, once trained, can recognize signals in hours rather than days," McGill said.
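As a rough illustration of that idea, and not a description of the CATS processing chain, the sketch below trains a tiny one-dimensional network on synthetic noisy and clean profile pairs so that, once trained, it can denoise new profiles in a single fast pass. The profile length, noise level, and PyTorch model are all assumptions.

```python
# Hedged sketch: a tiny 1-D denoising network for lidar-like profiles,
# trained on synthetic noisy/clean pairs; not the CATS processing chain.
import torch
import torch.nn as nn

denoiser = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
    nn.Conv1d(16, 1, kernel_size=5, padding=2),
)

# Synthetic "clean" backscatter profiles and noisy observations of them.
clean = torch.relu(torch.randn(64, 1, 256).cumsum(dim=-1) * 0.01)
noisy = clean + 0.05 * torch.randn_like(clean)

opt = torch.optim.Adam(denoiser.parameters(), lr=1e-3)
for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(denoiser(noisy), clean)
    loss.backward()
    opt.step()
print("final training loss:", float(loss))
```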
Just as important, at least to McGill, is the need to miniaturize CATS-like lidar systems. While CATS was roughly the size of a refrigerator, future systems must be much smaller, capable of flying on a constellation of SmallSats to collect simultaneous, multipoint measurements. However, as instruments get smaller, the data can potentially be noisier due to smaller collection apertures, McGill explained. "We have to get smarter in how we analyze our data and we need to develop the capability to generate true real-time data products."
Dolphin Strandings
Getting smarter in data analysis is also driving Goddard heliophysicist Antti Pulkkinen and engineer Ron Zellar.
A couple of years ago, Pulkkinen began investigating whether solar storms were causing otherwise healthy whales, dolphins, and porpoises — collectively known as cetaceans — to strand along coastal areas worldwide. While he and his team found no correlation, they did find a link between stranding events in Cape Cod, Massachusetts, and wind strength.
Is it possible that strong winds, which occur during the winter months when dolphins are more likely to beach, stir ocean phytoplankton and other nutrients that feed fish? Are the dolphins simply following their food source? "We can't assume a causal relationship," said Zellar, who, when not working on this project, serves as a mission-systems engineer on the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer, or OSIRIS-REx, mission. "That's what we're trying to find."
With funding from the Goddard Fellows Innovation Challenge, a program that funds the development of potentially revolutionary technologies, the team is applying machine-learning techniques to delve more deeply into environmental data to see if they can prove a cause.
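One generic way such an analysis might begin, shown purely as an illustration with synthetic data rather than the team's actual method, is to rank candidate environmental drivers of stranding counts by how much predictive weight a model assigns them; as Zellar notes above, such rankings still would not establish causation.

```python
# Hedged sketch: rank hypothetical environmental drivers of stranding counts
# by feature importance. Data are synthetic; correlation is not causation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
features = ["wind_strength", "sea_surface_temp", "chlorophyll", "tide_range"]
X = rng.normal(size=(500, len(features)))
# Synthetic target loosely tied to wind strength, for illustration only.
y = 3.0 * X[:, 0] + rng.normal(scale=0.5, size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, score in sorted(zip(features, model.feature_importances_), key=lambda p: -p[1]):
    print(f"{name}: {score:.2f}")
```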
Cutting the Umbilical Cord
In November, the OSIRIS-REx mission is scheduled to begin a series of complex maneuvers that will take the craft closer to asteroid Bennu so that it can begin characterizing the body and snapping images to help identify the best location for collecting a sample to return to Earth for analysis. This will require thousands of high-resolution images taken from different angles and then processed manually by a team of experts on the ground.
Scientists want to simplify and hasten the processing time. Under a NASA-funded research effort involving Goddard scientists, Dante Lauretta, a University of Arizona professor and OSIRIS-REx principal investigator, and Chris Adami, a machine-learning expert at Michigan State University, a team is investigating the potential of neural-network algorithms. The goal is to teach onboard sensors to process images and determine an asteroid's shape and features — information needed to autonomously navigate in and around an asteroid and make decisions about where to safely acquire samples.
"The point is to cut the computational umbilical cord back to Earth," said Bill Cutlip, a Goddard senior business development manager and team member. "What we're trying to do is train an algorithm to understand what it's seeing, mimicking how the human brain processes information."
Such a capability would benefit not only future missions to asteroids, but also those to Mars and the icy moons of Jupiter and Saturn, he said. With advances in field-programmable gate arrays, or circuits that can be programmed to perform a specific task, and in graphics-processing units, the potential is staggering, he added.
###
For more Goddard technology news, go to: https://www.nasa.gov/sites/default/files/atoms/files/fall_2018_final_web_version.pdf
Lori Keesey, NASA's Goddard Space Flight Center
Media Contact
Lori Keesey
[email protected]
@NASAGoddard
http://www.nasa.gov/goddard
Original Source
https://www.nasa.gov/feature/goddard/2018/nasa-researchers-teach-machines-to-see