The Mars Autonomy Project unifies a stereo vision module, a mapping
module, local obstacle avoidance, path planning, and low-level control.
Each of these modules is a major piece of software in its own right, and
they were developed in the context of several projects over the past
decade. Designing and implementing a system that includes them all is
the major challenge of the project.
The diagram above shows our system architecture when we run on the
Bullwinkle rover test bed. The red ovals are hardware, and the green
rectangles are modules. Each module runs as a separate process (on a
separate machine if necessary), and the links shown between modules
represent network messages.
We can also run in two other modes, neither of which requires the rover
hardware:
- Simulator mode. In this mode, our simulator accepts
commands from the controller and returns pose. It also sends
range points to the Stereo Map traversability mapper
(we bypass the stereo algorithm in order to avoid
the overhead of generating simulated stereo image pairs).
- Series-reader mode. In this mode, we replay a previously
recorded series of stereo pairs to the system. Each pair is
stamped with the pose at the time it was recorded. Low-level
commands from the controller are ignored (in this case there
is no way to simulate their effects).
The arbiter, developed over the past several years, is responsible for
collecting asynchronous votes from the different modules which have a
say in high-level control of the rover, then choosing the best action
based on those votes.
In our case, the possible actions are forward moves with different
curvatures to the left and right. We also have the ability to turn in
place, either left or right. The two voting modules are the local
obstacle avoidance and global path planning systems.
Each voting module assigns either a veto or a vote between 0 and 1 to
each command. This vote is multiplied by the module's weight in the
arbiter. The unvetoed command with the highest weighted sum is sent to
the controller.
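The voting scheme above can be sketched as follows. This is a minimal illustration, not the arbiter's actual code; the command names, module names, and the convention of representing a veto as `None` are assumptions for the example.

```python
# Sketch of the arbitration step: each module either vetoes a command or
# votes for it in [0, 1]; votes are scaled by per-module weights, and the
# unvetoed command with the highest weighted sum wins.
VETO = None  # hypothetical convention: a module vetoes by voting None

def arbitrate(command_votes, weights):
    """command_votes: {command: {module: vote}}, vote in [0, 1] or VETO.
    weights: {module: weight}. Returns the winning command, or None if
    every command was vetoed."""
    best_cmd, best_score = None, float("-inf")
    for cmd, votes in command_votes.items():
        if any(v is VETO for v in votes.values()):
            continue  # a single veto removes the command from consideration
        score = sum(weights[m] * v for m, v in votes.items())
        if score > best_score:
            best_cmd, best_score = cmd, score
    return best_cmd
```

For example, with an obstacle-avoidance weight of 2.0 and a planner weight of 1.0, a safe straight-ahead move can outscore a planner-preferred arc that the obstacle avoider dislikes or vetoes.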
Many of the arbiter's capabilities, such as smoothing in the time domain
and interpolating in between the discrete voting bins to achieve finer
control, were developed for other projects and go unused in our system.
The most useful of its advanced features is a GUI that lets us
visualize the votes arriving from each module for debugging.
We use the Real-Time Communications (RTC) package developed at
Carnegie Mellon's National Robotics Engineering Consortium (NREC).
In RTC, the locations of individual modules are tracked by a central
server. Modules can register and unregister on the fly. Once the
location of the recipient is known to the sender, messages are passed
point-to-point. The system also supports a limited form of
publish-subscribe messaging.
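The register/lookup pattern RTC uses can be sketched as below. This is not RTC's API, only an illustration of the idea: modules register their locations with a central server, a sender resolves a recipient once (caching the address), and messages then flow point-to-point without involving the server. All names here are hypothetical.

```python
class NameServer:
    """Central server that tracks where each module can be reached."""
    def __init__(self):
        self._locations = {}  # module name -> (host, port)

    def register(self, name, host, port):
        self._locations[name] = (host, port)

    def unregister(self, name):
        # Modules may come and go on the fly.
        self._locations.pop(name, None)

    def lookup(self, name):
        return self._locations.get(name)

def send(cache, server, recipient, payload, transport):
    """Resolve the recipient via the server only once, then talk directly."""
    if recipient not in cache:
        cache[recipient] = server.lookup(recipient)
    host, port = cache[recipient]
    transport(host, port, payload)  # point-to-point; server not involved
```

The central server is thus only a directory: after the first lookup, module-to-module traffic bypasses it entirely.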