The Mars Autonomy Project

Overview
To achieve the ambitious science goals of future Mars missions, the accompanying rovers must be highly capable and autonomous. They must be able to navigate, especially between sites, with minimal human intervention. They must be able to detect anomalies and deal with them effectively. They must manage their limited resources, including power and computation, and use them efficiently. Finally, all of these capabilities must be integrated into a working, reliable system.

Our project, part of the NASA Intelligent Robotics Program, focuses on autonomous navigation. We are integrating previously developed local obstacle avoidance and global path planning algorithms and adapting them to a Mars-relevant rover, in order to demonstrate reliable long-distance navigation (100-200 meters without human intervention) in Mars-like terrain.

The Mars Autonomy program will demonstrate navigation on a vehicle of the same scale as the FIDO rover baselined for a flight mission in 2005. Using a stereo vision algorithm developed at JPL, we will demonstrate collision avoidance and route planning in Mars-like terrain. Future work includes long-range route planning under position uncertainty, efficient search and exploration, rover localization with computer vision, and effective human-robot interfaces.

Autonomy Packages

[Image: Obstacle Avoidance Screen Shot]

Local Obstacle Avoidance
The obstacle avoidance module considers the set of arcs along which the rover could move for the next few meters. By integrating the terrain roughness along each arc, it decides which paths are safe and eliminates others from consideration. Because this is a local algorithm, the maps it uses can have high resolution, and it can run more often than global algorithms.
More: [ Morphin Map | Steering Evaluation ]
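
As a rough illustration of this kind of arc evaluation, the Python sketch below integrates roughness along a fan of candidate arcs and turns the result into per-arc safety votes. It is a simplified stand-in, not the Morphin implementation, and roughness_at(x, y) is a hypothetical query into the local high-resolution map.

    import math

    def evaluate_arcs(roughness_at, curvatures, arc_length=3.0, step=0.25,
                      roughness_limit=1.0):
        """Score candidate steering arcs by integrating terrain roughness
        along each one. Returns a dict mapping curvature -> vote in [0, 1],
        where 0 means the arc is vetoed as unsafe."""
        votes = {}
        for k in curvatures:                  # curvature k = 1 / turning radius
            total, count, safe = 0.0, 0, True
            s = step
            while s <= arc_length:
                if abs(k) < 1e-6:             # straight-ahead arc
                    x, y = s, 0.0
                else:                         # point at arc length s on a circle of radius 1/k
                    x = math.sin(k * s) / k
                    y = (1.0 - math.cos(k * s)) / k
                r = roughness_at(x, y)
                if r >= roughness_limit:      # untraversable terrain: veto this arc
                    safe = False
                    break
                total += r
                count += 1
                s += step
            votes[k] = 0.0 if not safe else 1.0 - (total / count) / roughness_limit
        return votes

An arc that crosses untraversable terrain is vetoed outright; among the rest, smoother terrain earns a higher vote, and the votes are handed to the arbiter described under Integration below.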

[Image: Path Planner Screen Shot]

Global Path Planning
We use D*, a heuristic path planning algorithm based on A*. Its principal innovation is that it can partially reuse old plans when presented incrementally with new information. This makes it ideal for real-time sensor-based planning on a mobile platform.
More: [ D* | Uncertainty | Sample Run | Future Work ]
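
D* itself is described in the pages linked above; the toy Python sketch below only illustrates why plan reuse pays off. It maintains a cost-to-goal field over a grid with a backward Dijkstra search and, when newly sensed obstacles appear, invalidates and recomputes only the cells whose stored costs depended on them. This is not the D* algorithm (which repairs the field far more efficiently, handling cost increases and decreases with a single priority queue); it is a deliberately naive version of the same idea.

    import heapq

    INF = float("inf")

    def neighbors(cell, grid):
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]):
                yield (nr, nc)

    def costs_to_goal(grid, goal, g=None, seeds=None):
        """Backward Dijkstra on a 4-connected grid (grid[r][c] == 1 is blocked).
        g[cell] = cost of the cheapest path from cell to the goal. If a partial
        field g and seed entries are given, expansion continues from them."""
        if g is None:
            g, seeds = {goal: 0.0}, [(0.0, goal)]
        pq = list(seeds)
        heapq.heapify(pq)
        while pq:
            cost, cell = heapq.heappop(pq)
            if cost > g.get(cell, INF):
                continue
            for nb in neighbors(cell, grid):
                if grid[nb[0]][nb[1]] == 1:
                    continue
                if cost + 1.0 < g.get(nb, INF):
                    g[nb] = cost + 1.0
                    heapq.heappush(pq, (cost + 1.0, nb))
        return g

    def repair(grid, goal, g, new_obstacles):
        """Newly sensed obstacles (already marked blocked in grid) invalidate
        every cell whose stored cost depended on them; only those cells are
        recomputed, seeded from the cells that survive."""
        stale, changed = set(new_obstacles), True
        while changed:                        # propagate staleness outward
            changed = False
            for cell in list(g):
                if cell in stale or cell == goal:
                    continue
                supported = any(nb not in stale and g.get(nb, INF) == g[cell] - 1.0
                                for nb in neighbors(cell, grid))
                if not supported:
                    stale.add(cell)
                    changed = True
        for cell in stale:
            g.pop(cell, None)
        return costs_to_goal(grid, goal, g, [(g[c], c) for c in g])

Once the field is repaired, the rover simply follows decreasing cost-to-goal values from its current cell toward the goal.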

[Image: Mars Autonomy System Architecture]

Integration
The major challenge of the Mars Autonomy Project is integrating the various modules making up our autonomy system. For instance, the steering votes of our independent obstacle avoidance and path planning modules must be combined in a module we call the arbiter.

More: [ Architecture | Arbiter | Networking ]
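
As a hypothetical sketch of this kind of vote arbitration (the weighting scheme and names are illustrative, not our actual arbiter), each module submits a vote per candidate steering arc, unsafe arcs are vetoed, and the weighted winner becomes the steering command:

    def arbitrate(vote_sets, weights, veto_threshold=0.1):
        """Combine per-arc votes from several modules into one steering choice.
        vote_sets is a list of dicts mapping curvature -> vote in [0, 1], e.g.
        one from obstacle avoidance and one from the path planner; weights sets
        each module's influence. Any arc scored below veto_threshold by any
        module is rejected outright."""
        shared = set.intersection(*(set(v) for v in vote_sets))
        best_k, best_score = None, -1.0
        for k in shared:
            votes = [v[k] for v in vote_sets]
            if min(votes) < veto_threshold:       # e.g. an obstacle-avoidance veto
                continue
            score = sum(w * v for w, v in zip(weights, votes)) / sum(weights)
            if score > best_score:
                best_k, best_score = k, score
        return best_k                             # None means every arc was vetoed

    # Example: obstacle avoidance favors veering left and vetoes the right arc,
    # while the planner favors going straight; the weighted sum picks -0.2.
    obstacle_votes = {-0.2: 0.9, 0.0: 0.4, 0.2: 0.05}
    planner_votes  = {-0.2: 0.5, 0.0: 1.0, 0.2: 1.0}
    print(arbitrate([obstacle_votes, planner_votes], weights=[1.0, 0.5]))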

Perception and Mapping

[Image: Stereo Disparity Map]

We're using a stereo algorithm developed at JPL to convert stereo pairs from our cameras to a cloud of (x,y,z) points on the surface of the terrain ahead. Our traversability mapper reduces this surface to a map we can use for obstacle avoidance and path planning.
More: [ Calibration | Matching | Traversability ]
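
A rough sketch of that pipeline, using generic pinhole stereo geometry and a simple height-spread roughness measure (not the JPL stereo code or our actual mapper):

    import numpy as np

    def disparity_to_points(disparity, focal_px, baseline_m, cx, cy):
        """Triangulate a dense disparity image into (x, y, z) camera-frame
        points using standard stereo geometry (z = f * B / d). The calibration
        values focal_px, baseline_m, cx, cy are assumed to be known."""
        rows, cols = np.indices(disparity.shape)
        valid = disparity > 0                     # zero disparity = no match found
        z = focal_px * baseline_m / disparity[valid]
        x = (cols[valid] - cx) * z / focal_px     # lateral offset
        y = (rows[valid] - cy) * z / focal_px     # height (down in the camera frame)
        return np.column_stack((x, y, z))

    def traversability_map(points, cell_size=0.2, max_spread=0.15):
        """Reduce the point cloud to a grid of roughness scores in [0, 1]: each
        cell's score is the spread of point heights inside it, clipped at
        max_spread. Flat ground scores near 0; rocks and steps approach 1."""
        cells = {}
        for x, y, z in points:                    # z is range ahead, y is height
            key = (int(x // cell_size), int(z // cell_size))
            cells.setdefault(key, []).append(y)
        return {key: min(1.0, (max(h) - min(h)) / max_spread)
                for key, h in cells.items()}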

The Rovers

[Image: The Bullwinkle Rover]

The Bullwinkle Test Bed
Our test rover, Bullwinkle, is an ATRV-II from RWI Inc. It was selected because its size is close to that of the next-generation Mars rover, so it poses similar mobility and vision problems, although its rigid suspension and four skid-steered wheels are quite different. Like the Mars rover, it uses two forward-pointing cameras for obstacle detection. We have achieved a 100-meter autonomous traverse at 15 cm/s.
Visit RWI for more information

[Image: Next-Generation Mars Rover]

Next Generation Mars Rover
Technology for a next-generation Mars rover is currently under development at JPL. Its rocker-bogie suspension gives it an appearance similar to that of the Mars Pathfinder Sojourner rover, but it extends Sojourner's capabilities with a mast for long-range sensing, a manipulator arm for sample collection, faster on-board computing, and a larger chassis capable of longer traverses.
Visit JPL for more information

The Simulator

[Image: Simulator Screen Shot]

We have developed a simulator for testing robot planning and architectural issues. It models terrain, robot mechanisms, and range sensors such as stereo vision, which lets us test algorithms with repeatable terrain and vehicle configurations, including environments that would be difficult to create physically. The system that drives our robot testbed can be attached to the simulator through an identical interface.
Download movies generated by the simulator.
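
The point of the identical interface is that the navigation software cannot tell which backend it is driving. A hypothetical Python sketch of that arrangement (the class and method names are illustrative, not our actual API):

    from abc import ABC, abstractmethod

    class RoverInterface(ABC):
        """What the navigation system expects of a rover, real or simulated."""

        @abstractmethod
        def get_stereo_pair(self):
            """Return the current (left, right) camera images."""

        @abstractmethod
        def get_pose(self):
            """Return the rover's estimated (x, y, heading)."""

        @abstractmethod
        def drive(self, speed, curvature):
            """Command a forward speed (m/s) along an arc of the given curvature."""

    class SimulatedRover(RoverInterface):
        """Backed by synthetic terrain, mechanism, and sensor models."""

        def __init__(self, world):
            self.world = world                    # hypothetical simulation model

        def get_stereo_pair(self):
            return self.world.render_stereo()     # rendered views of the terrain

        def get_pose(self):
            return self.world.rover_pose()

        def drive(self, speed, curvature):
            self.world.step_rover(speed, curvature)

    # A RealRover class wrapping the physical testbed implements the same three
    # methods, so a navigation loop like this runs unchanged against either one:
    def navigation_step(rover, choose_arc):
        left, right = rover.get_stereo_pair()
        pose = rover.get_pose()
        rover.drive(0.15, choose_arc(left, right, pose))   # 15 cm/s, as on Bullwinkle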

Last modified: 28 June 1999