For NASA, Artificial Intelligence Is Key to Navigating the Cosmos

NASA’s projects have become increasingly ambitious in terms of robotics and data processing in recent years, most notably through its long-range probes and Mars rovers. However, as Shreyansh Daftry (a NASA AI research scientist) noted in a recent lecture for the 2021 AI Hardware Summit, there is still a long way to go – and much of the remaining progress must be driven by AI.

“[This is] what I always wanted to do: become a robotics engineer that could build intelligent machines in space,” Daftry said. “But as I started to learn more, I understood that while NASA has been building these really capable machines, they’re not really what I, as a trained roboticist, would call intelligent.” Daftry compared the 1969 lunar lander to the Curiosity rover, pointing out that both were operated by humans during landing as well as operation.

The Human-Led Status Quo Must Be Changed

But, he argued, that worldview would have to shift for two reasons. “First,” Daftry said, “is the difficulty with deep-space communications,” which are both extremely bandwidth-constrained and subject to considerable – and rising – latency with distance. With NASA contemplating trips to destinations like Europa, where a signal round trip would take hours, human control would be prohibitively inefficient.
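To put the latency problem in rough numbers (the figures below are approximate average distances, not values from the talk), a back-of-the-envelope calculation at the speed of light shows why remote human control breaks down beyond the Moon:

```python
# Rough one-way signal delay at the speed of light.
# Distances are approximate averages; actual values vary with orbital positions.
C_KM_PER_S = 299_792.458  # speed of light in km/s

distances_km = {
    "Moon": 384_400,                      # ~1.3 light-seconds
    "Mars (average)": 225_000_000,
    "Jupiter/Europa (average)": 778_000_000,
}

for body, d in distances_km.items():
    one_way_min = d / C_KM_PER_S / 60
    print(f"{body}: one-way {one_way_min:.1f} min, round trip {2 * one_way_min:.1f} min")
```

At Mars, a command-and-response cycle already takes on the order of 25 minutes; at Europa's distance the round trip stretches well past an hour, consistent with the "hours" figure Daftry cites.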

“The second major reason,” he continued, “is scalability.” He displayed a map of the Earth’s satellites and a layout of a potential Mars colony, asking the audience to picture each piece of equipment being managed by its own group of scientists and engineers. “Sounds crazy, right?” he said. “The only way we can scale up the space economy is if we can make our space assets to be self-sustainable, and artificial intelligence is going to be a key ingredient in making that happen.”

Daftry said that AI may make a significant difference in a variety of fields, including autonomous science as well as planning, precise landing, dexterous manipulation, human-robot collaborations, and others. But, he added, for the sake of his lecture, he would concentrate on only one: autonomous navigation.

Leading the Way On Mars (Safely)

“We landed another rover on Mars. Woohoo!” he said. “The Perseverance rover has the most sophisticated autonomous navigation system, ENav, that has ever driven on any extraterrestrial surface.” He displayed a video of the system navigating a rough terrain obstacle course. “However, if you carefully look at the video captions, you’ll notice that the video was sped up fifty times.” He noted that 50 years earlier, the lunar buggy had covered similar ground at 100 times the speed. “The key difference between what Perseverance does and what was happening here,” he said, “is [that] the intelligence of the rover was powered by the human brain.”

As a result, Daftry and his team have been working on building autonomous systems that allow rovers to drive much like humans do. “The current onboard autonomous navigation system running on Curiosity and also on Perseverance uses only geometric information,” he said, noting that such algorithms fail to capture textural distinctions – which can be the difference between traveling safely and becoming trapped in something like sand. “So we created SPOC, a deep learning-based terrain-aware classifier that can help Mars rovers navigate better.”

“Just like humans, SPOC can tell the terrain types based on textural features,” Daftry continued. The researchers created two versions: one that works onboard a rover and classifies ground texture into six categories, and another that works on orbital images and uses 17. SPOC, according to Daftry, was inspired by the “ongoing revolution” in computer vision and deep learning. However, the deep learning algorithms used were designed for Earth, not Mars.
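SPOC itself is a deep convolutional network, and the talk doesn't detail its architecture. As a toy illustration of the underlying idea – that texture statistics, not just geometry, separate terrain types – here is a minimal sketch that distinguishes smooth, sand-like patches from rough, rock-like ones using local variance (the feature choice and threshold are illustrative assumptions, not SPOC's method):

```python
import numpy as np

def texture_variance(patch: np.ndarray) -> float:
    """Mean local variance over 4x4 tiles -- a crude texture statistic."""
    h, w = patch.shape
    tiles = patch[:h - h % 4, :w - w % 4].reshape(h // 4, 4, w // 4, 4)
    return float(tiles.var(axis=(1, 3)).mean())

def classify_terrain(patch: np.ndarray, sand_threshold: float = 5.0) -> str:
    """Low local contrast reads as sand-like; high contrast as rock-like.
    The threshold is an arbitrary illustration, not a calibrated value."""
    return "sand" if texture_variance(patch) < sand_threshold else "rock"

rng = np.random.default_rng(0)
smooth = rng.normal(100.0, 1.0, size=(32, 32))   # low texture contrast
rough = rng.normal(100.0, 10.0, size=(32, 32))   # high texture contrast
print(classify_terrain(smooth), classify_terrain(rough))  # sand rock
```

A geometry-only planner would treat both patches as equally flat and drivable; a texture-aware classifier is what lets the rover tell them apart.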

Stumbling Blocks

“Adapting these models to work reliably in a Martian environment,” Daftry said, “is not trivial.” The team used transfer learning to close the gap, but there were a few significant obstacles. The first, he stated, was data availability.

“We did not have any labeled data to start with,” he explained. “And the fact that we have only about two dozen geologists in the world who have the knowledge to create these labels – as you can imagine, it is really hard to get time from these people to actually do manual labeling.” Even if they did, the expense would have been too high.

Instead, the team launched AI4Mars, a citizen science initiative that tasked volunteers with categorizing photos from the rovers, successfully labeling over 200,000 images in just a few months. However, the data they began with was also highly noisy, so the researchers resorted to synthetic data and simulation.
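The talk doesn't specify how AI4Mars reconciled conflicting volunteer annotations, but a standard way to tame noisy crowdsourced labels is per-image majority voting – keep a label only when enough annotators agree, discard the rest. A minimal sketch of that idea (the function name and agreement threshold are assumptions for illustration):

```python
from collections import Counter

def majority_label(votes, min_agreement=0.5):
    """Return the crowdsourced label only when a strict majority of
    volunteers agree; otherwise return None so the noisy image is
    dropped from the training set."""
    if not votes:
        return None
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) > min_agreement else None

print(majority_label(["sand", "sand", "rock"]))          # sand
print(majority_label(["sand", "rock", "soil", "rock"]))  # None (2/4 is not a strict majority)
```

Filtering like this trades dataset size for label quality, which is why the simulation-based synthetic data described next was a useful complement.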

“We worked with a startup called Bifrost AI who excels at generating perfectly labeled data of complex unstructured environments using 3D graphics,” Daftry said. “The data generated is highly photorealistic and looks just like Mars. In addition, the labels are perfect and noise-free.”

The second constraint is hardware restrictions, with Daftry equating Perseverance’s onboard computer to an iMac G3 from the late 1990s. He stated that NASA was looking at a variety of options to enable deep learning models to run onboard rovers, ranging from designing in-house hardware to modifying common CPUs like Qualcomm’s Snapdragon to function on another planet.

The next step, according to Daftry, was system integration and verification. “Once the rover is launched, we do not have the ability to fix things if something goes wrong, so things have to work in one shot.” This was an issue for deep learning. “Deep [convolutional neural network] models, while they work great, are inherently black-box in nature,” he said, “presenting a major bottleneck for system-level validation and proving guaranteed performance.” This was mainly solved by “testing the heck out of our system” on Earth in analogous settings, he noted.

The Next Step

“Overall, our long-term vision is to create something like Google Maps on Mars for our rovers,” Daftry said, “so that the human operator can only specify the destination they need the rover to go and the rover uses the software to find its path.” Beyond that, NASA was investigating autonomous navigation for a variety of other robots, from deep-sea rovers to cliff-scaling rovers.
