NASA – Artificial Intelligence is Changing How We Explore Mars & Beyond



A glimpse of how we’ll explore the cosmos in the future is here: AI software on NASA’s Curiosity Mars rover has helped it zap dozens of laser targets on the Red Planet this past year, becoming a frequent science tool when the ground team was out of contact with the spacecraft. The same software has proven useful enough that it’s already scheduled for NASA’s upcoming Mars 2020 mission. Adding intelligence to robotic probes could enable missions to targets as far away as Proxima b in the Alpha Centauri star system. Many of the world’s leading astrophysicists believe that our first discovery of an advanced alien civilization will be of machine intelligence, not biological life.

“My guess is that if we do detect an alien intelligence, it will be nothing like us. It will be some sort of electronic entity,” says astrophysicist Martin Rees.

“If we look at our history on Earth, it has taken about 4 billion years to get from the first protozoa to our current, technological civilisation. But if we look into the future, then it’s quite likely that within a few centuries, machines will have taken over – and they will then have billions of years ahead of them.”

“In other words,” concludes Rees, “the period of time occupied by organic intelligence is just a thin sliver between early life and the long era of the machines. Because such civilisations would develop at different rates, it’s extremely unlikely that we will find intelligent life at the same stage of development as us. More likely, that life will still be either far simpler, or an already fully electronic intelligence.”

Back to the present and NASA’s use of AI to explore Mars: a new paper in Science Robotics examines how the software has performed since it was rolled out to Curiosity’s science team in May 2016. The software, called AEGIS (Autonomous Exploration for Gathering Increased Science), has directed the Curiosity rover’s ChemCam instrument 54 times since then. It is used on almost every drive when power resources allow, according to the paper’s authors.

The vast majority of those uses involved selecting targets to zap with ChemCam’s laser, which vaporizes small amounts of rock or soil and studies the gas that burns off. Spectrographic analysis of this gas can reveal the elements that make up each laser target.

AEGIS allows the rover to get more science done while Curiosity’s human controllers are out of contact. Each day, they program a list of commands for it to execute based on the previous day’s images and data. If those commands include a drive, the rover may reach new surroundings several hours before it is able to receive new instructions. AEGIS allows it to autonomously zap rocks that scientists may want to investigate later.
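The daily cycle described above can be sketched in a few lines of Python. This is purely illustrative, not flight software: the names `Command`, `run_sol`, and `select_targets_autonomously`-style steps are assumptions introduced for the example, and the real planning process is far more involved.

```python
from dataclasses import dataclass

@dataclass
class Command:
    name: str
    is_drive: bool = False

def run_sol(commands, uplink_available):
    """Execute a day's command list; if the plan ends with a drive and no
    new instructions have arrived, fill the idle hours with autonomous science."""
    log = [f"executed {cmd.name}" for cmd in commands]
    # After a drive, the rover sits in terrain its operators have never seen,
    # possibly for hours, before the next uplink arrives.
    if commands and commands[-1].is_drive and not uplink_available:
        log.append("AEGIS: autonomously selecting laser targets")
    return log

plan = [Command("take_images"), Command("drive_50m", is_drive=True)]
print(run_sol(plan, uplink_available=False))
```

The point of the sketch is the final branch: AEGIS only adds value in the window between the end of a drive and the arrival of the next command load.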

“Time is precious on Mars,” said lead author Raymond Francis of NASA’s Jet Propulsion Laboratory in Pasadena, California. Francis is the lead system engineer for AEGIS’ deployment on the Curiosity rover. “AEGIS allows us to make use of time that otherwise wasn’t available because we were waiting for someone on Earth to make a decision.”

AEGIS has helped the science team discover a number of interesting minerals. On separate occasions, higher quantities of chlorine and silica were discovered in nearby rocks — information that helped direct science planning the following day.

“The goal is to provide more information for the science team,” said Tara Estlin of JPL, co-author and team lead for AEGIS. “AEGIS has increased the total data coming from ChemCam by operating during times when the rover would otherwise just be waiting for a command.”

Before AEGIS was implemented, this downtime was valuable enough that the rover was instructed to carry out “blind” targeting with ChemCam: while working through its commands, it would fire the laser at a pre-programmed angle, just to see whether it gathered interesting data, since there was no onboard ability to search for a target.

“Half the time it would just hit soil — which was also useful, but rock measurements are much more interesting to our scientists,” Francis said.

With the intelligent targeting AEGIS affords, Curiosity can be given parameters for very specific kinds of rocks, defined by color, shape and size. The software uses computer vision to search out edges in the landscape; if it detects enough edges, there’s a good chance it has found a distinct object, Francis said.

Then the software can rank, filter and prioritize those objects based on the characteristics the science team is looking for.
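The filter-and-rank step can be illustrated with a minimal sketch. This is not AEGIS code: the `Candidate` fields, thresholds, and scoring weights are all assumptions standing in for the real target profiles (color, shape, size) that the science team specifies.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    edge_count: int    # edges detected along the object's outline
    size_px: int       # apparent size in pixels
    brightness: float  # 0.0 (dark soil) .. 1.0 (bright rock)

def rank_targets(candidates, min_edges=10, min_size=20, max_size=500):
    # Filter: enough edges to be a distinct object, and within the size window.
    viable = [c for c in candidates
              if c.edge_count >= min_edges and min_size <= c.size_px <= max_size]
    # Prioritize: in this toy profile, brighter (rock-like) and larger
    # objects rank higher; the real criteria come from the science team.
    return sorted(viable, key=lambda c: (c.brightness, c.size_px), reverse=True)

field = [
    Candidate(edge_count=4,  size_px=30,  brightness=0.2),  # too few edges: soil
    Candidate(edge_count=25, size_px=120, brightness=0.8),  # bright rock
    Candidate(edge_count=15, size_px=60,  brightness=0.5),
]
best = rank_targets(field)
print(best[0].brightness)  # the bright rock ranks first
```

The soil-like patch is filtered out before ranking ever happens, which is exactly the improvement over the old blind-targeting approach: the laser's limited shots go to distinct objects rather than whatever lies at a fixed angle.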

AEGIS can also be used for fine-scale pointing — what Francis calls “pointing insurance.” When Curiosity’s operators aren’t quite confident they’ll hit a very narrow vein in a rock on the first try, they sometimes use this ability to fine-tune the pointing, though it only came up twice in the past year.

The upcoming Mars 2020 rover will also carry AEGIS, integrated into the next-generation version of ChemCam, called SuperCam. In addition to laser targeting, SuperCam will be able to use AEGIS with a remote Raman spectrometer, which can study the crystal structures of rocks, as well as a visible and infrared spectrometer.

The U.S. Department of Energy’s Los Alamos National Laboratory in New Mexico leads the U.S. and French team that jointly developed and operates ChemCam. IRAP, the Research Institute in Astrophysics and Planetology in Toulouse, France, is a co-developer and shares operation of the instrument with France’s national space agency (CNES), NASA and Los Alamos. JPL, a division of Caltech in Pasadena, California, manages the Curiosity mission for NASA.

The Daily Galaxy via NASA
