New Automated Technology Allows Unparalleled Space Exploration

When landing Apollo 11 in 1969, astronauts looked out the window for distinguishing features they recognized from maps of the Moon, ready to steer the lander away from a disastrous touchdown on a rocky area. Now, 50 years later, the process is often automated. Distinguishing features such as known craters, boulders, and other unique surface characteristics reveal surface hazards and help landers avoid them during descent.

NASA scientists and engineers are maturing technology for navigating and landing on planetary bodies by analyzing images during descent, a process called Terrain Relative Navigation (TRN). This optical navigation technology is included on NASA’s newest Mars rover, Perseverance, which will test TRN when it lands on Mars in 2021, paving the way for future crewed missions to the Moon and beyond. TRN was also used during the Touch-And-Go (TAG) event of NASA’s recent Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) mission, which gathered samples from the asteroid Bennu to better understand the characteristics and movement of asteroids.

Since reaching Bennu in 2018, the OSIRIS-REx spacecraft has mapped and studied its surface, including its topography and lighting conditions, in preparation for TAG. Nightingale crater was chosen from four candidate sites based on its large amount of sampleable material and its accessibility for the spacecraft.

Engineers routinely use ground-based optical navigation methods to steer the OSIRIS-REx spacecraft near Bennu, comparing new images taken by the spacecraft to 3D topographic maps. During TAG, OSIRIS-REx performed a similar optical navigation process onboard in real time, using a TRN system called Natural Feature Tracking. Images of the sample site taken during the TAG descent were compared with onboard topographic maps, and the spacecraft’s trajectory was adjusted to home in on the landing site. Optical navigation could also be used in the longer term to reduce the risks of landing in other unfamiliar environments in our solar system.
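To illustrate the matching step at the heart of that process, here is a minimal sketch in Python. It assumes the descent image and a catalogued landmark template are plain 2D grayscale arrays; the function names are hypothetical, and the real Natural Feature Tracking system works with rendered 3D feature models and flight-qualified software, not this toy normalized cross-correlation search.

```python
import numpy as np

def find_landmark(image, template):
    """Locate a known landmark template in a descent image using
    normalized cross-correlation; returns (row, col) of the best match."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            if denom == 0:
                continue  # flat window, no texture to correlate
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

def trajectory_offset(image, template, predicted_pos):
    """Pixel offset between where the landmark was predicted to appear
    and where it was actually found -- the input to a course correction."""
    found = find_landmark(image, template)
    return (found[0] - predicted_pos[0], found[1] - predicted_pos[1])
```

If the landmark shows up two pixels away from where the planned trajectory predicted, the offset tells the guidance loop how far the spacecraft has drifted from its intended path.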

NASA’s Lunar Reconnaissance Orbiter (LRO) has acquired images from orbit since 2009. LRO Project Scientist Noah Petro said one challenge in preparing for landed missions is the lack of high-resolution, narrow-angle camera images at every lighting condition for any given landing site. Such images would be useful for automated landing systems, which require illumination data for a selected time of day. However, NASA has been able to collect high-resolution topographic data using LRO’s Lunar Orbiter Laser Altimeter (LOLA).

“LOLA data and other topographic data allow us to take the shape of the Moon and shine a light on it for any time in the future or past, and from that we can predict what the surface will look like,” Petro said.

Using LOLA data, sun angles are overlaid on a 3D elevation map to model the shadows of surface features at specific dates and times. NASA scientists know the position and orientation of the Moon and LRO in space, having taken billions of lunar laser measurements. Over time, these measurements are compiled into a grid map of the lunar surface. Images taken during landing are compared to this master map so that landers flown as part of the Artemis program have another tool to safely navigate the lunar terrain.
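That shadow modeling can be approximated with a basic Lambertian hillshade. The sketch below is only an illustration of the overlay idea, with hypothetical names and a plain NumPy elevation grid standing in for LOLA data: surface normals derived from the elevation map are dotted with a sun direction vector for a chosen azimuth and elevation.

```python
import numpy as np

def hillshade(elevation, sun_azimuth_deg, sun_elevation_deg, cell_size=1.0):
    """Lambertian illumination of an elevation grid for a given sun
    position. Returns values in [0, 1]; 0 means the slope faces away
    from the sun (in shadow), 1 means it faces the sun directly."""
    az = np.radians(sun_azimuth_deg)   # measured clockwise from north
    el = np.radians(sun_elevation_deg)
    # Unit vector pointing toward the sun: x = east, y = north, z = up
    sun = np.array([np.cos(el) * np.sin(az),
                    np.cos(el) * np.cos(az),
                    np.sin(el)])
    # Surface gradients from the elevation grid (rows = north-south axis)
    dz_dy, dz_dx = np.gradient(elevation, cell_size)
    # Un-normalized surface normals: (-dz/dx, -dz/dy, 1)
    nx, ny, nz = -dz_dx, -dz_dy, np.ones_like(elevation)
    norm = np.sqrt(nx ** 2 + ny ** 2 + nz ** 2)
    shade = (nx * sun[0] + ny * sun[1] + nz * sun[2]) / norm
    return np.clip(shade, 0.0, 1.0)
```

Running this over the same terrain with different sun elevations shows why the South Pole is hard: with the sun near the horizon, even gentle east-facing slopes drop to zero illumination when lit from the east.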

The lunar surface is like a fingerprint, Petro said, where no two landscapes are identical. Topography can be used to determine a spacecraft’s exact location above the Moon by comparing images, much as a forensic scientist compares fingerprints from a crime scene against a known person, matching a spot on the map to where the spacecraft is in its flight.

After landing, TRN can also be used on the ground to help astronauts navigate crewed rovers. As part of NASA’s lunar surface sustainability concept, the agency is considering a habitable mobility platform, similar to an RV, as well as a Lunar Terrain Vehicle (LTV) to help crews travel across the lunar surface.

Astronauts can typically travel short distances of a couple of miles in an unpressurized rover like the LTV, so long as they have landmarks to guide them. Traveling greater distances is far tougher, however, not to mention that the Sun at the lunar South Pole is usually low on the horizon, adding to the visibility challenges. Driving across the South Pole would be like driving a car straight east first thing in the morning: the light is often blinding, and landmarks can appear distorted. With TRN, astronauts may be better able to navigate the South Pole despite the lighting conditions, because the computer can better detect hazards.

Speed is the key difference between using TRN to land a spacecraft and using it to navigate a crewed rover. Landing requires capturing and processing images faster, with intervals as short as one second between images. To bridge the gap between images, onboard processors keep the spacecraft on track to land safely.

“When you move slower, like with rovers or OSIRIS-REx orbiting around the asteroid, you have more time to process the images,” said Carolina Restrepo, an engineer at NASA Goddard in Maryland working to enhance current data products for the lunar surface. “When you’re moving very fast, like during descent and landing, there’s no time for that. You have to be taking images and processing them as fast as possible aboard the spacecraft, and it has to be fully autonomous.”

Automated TRN solutions can address the needs of human and robotic explorers as they navigate unique locations in our solar system, such as the optical navigation challenges OSIRIS-REx faced in sampling Bennu’s rocky surface. Thanks to missions like LRO, Artemis astronauts can use TRN algorithms and lunar topography data to supplement images of the surface in order to land on and safely explore the Moon’s South Pole.

“What we’re trying to do is anticipate the needs of future terrain relative navigation systems by combining existing data types, to make sure we can build the highest-resolution maps for key locations along future trajectories and landing sites,” Restrepo said. “In other words, we need high-resolution maps both for scientific purposes and for navigation.”