High Resolution Maps from Wide Angle Sonar

 Hans P. Moravec
 Alberto Elfes

 The Robotics Institute
 Carnegie-Mellon University

 July 1984


@begin(ResearchCredit)

This work has been supported in part by Denning Mobile Robotics, Inc.,
by the Western Pennsylvania Advanced Technology Center and by the
Office of Naval Research under contract number N00014-81-K-0503.  The
second author is supported in part by the Conselho Nacional de
Desenvolvimento Cientifico e Tecnologico - CNPq, Brazil, under
Grant 200.986-80; in part by the Instituto Tecnologico de
Aeronautica - ITA, Brazil; and in part by The Robotics Institute,
Carnegie-Mellon University.

The views and conclusions contained in this document are those of the
authors and should not be interpreted as representing the official
policies, either expressed or implied, of the funding agencies.
@end(ResearchCredit)
@end(titlepage)

@subheading(Abstract)

We describe the use of multiple wide-angle sonar range measurements to
map the surroundings of an autonomous mobile robot. A sonar range
reading provides information concerning @r(empty) and @r(occupied)
volumes in a cone (subtending 30 degrees in our case) in front of the
sensor. The reading is modelled as probability profiles projected onto
a rasterized map, where somewhere occupied and everywhere empty areas
are represented. Range measurements from multiple points of view
(taken from multiple sensors on the robot, and from the same sensors
after robot moves) are systematically integrated in the map.
Overlapping empty volumes reinforce each other and serve to condense
the range of occupied volumes. The map definition improves as more
readings are added. The final map shows regions probably occupied,
probably unoccupied, and unknown areas. The method deals effectively
with clutter, and can be used for motion planning and for extended
landmark recognition.  This system has been tested on the @r(Neptune)
mobile robot at CMU.



@section(Introduction)

This paper describes a sonar-based mapping system developed for mobile
robot navigation. It was tested in cluttered environments using the
@i(Neptune) mobile robot @cite(neptune), developed at the Mobile Robot
Laboratory of the Robotics Institute, CMU. The Neptune system has been
used successfully in several areas of research, including stereo
vision navigation @cite(matthies84, thorpe83) and path planning
@cite(thorpe84). Other research in the laboratory includes the
investigation of adequate high-level robot control structures, the use
of distributed and parallel processing methods to improve the
real-time response of the system, navigation in outdoor environments
and the design and construction of more advanced robots with higher
mobility.

Primarily because of computational expense, practical real-world
stereo vision navigation systems @cite(moravec80, thorpe83) build very
sparse depth maps of their surroundings.  Even with this economy our
fastest system, described in @cite(matthies84), takes 30 to 60 seconds
per one-meter step on a 1 MIPS machine.

Direct sonar range measurements promised to provide basic navigation
and denser maps with considerably less computation.  The readily
available Polaroid ultrasonic range transducer @cite(polaroid) was
selected and a ring of 24 of these sensors was mounted on Neptune.

We also find sonar sensors interesting because we would like to
investigate how qualitatively different sensors, such as a sonar array
and a pair of cameras, could cooperate in building up a more complex
and rich description of the robot's environment.

@subsection(Goals)

We expected sonar measurements to provide maps of the robot's
environment with regions classified as empty, occupied or unknown, and
to allow matching of new maps against old ones, both for landmark
classification and to obtain or correct global position and
orientation information.

@subsection(Approach)

Our method starts with a number of range measurements obtained from
sonar units whose position with respect to one another is known.  Each
measurement provides information about empty and possibly occupied
volumes in the space subtended by the beam (a thirty degree cone for
the present sensors).  This occupancy information is projected onto a
rasterized two-dimensional horizontal map. Sets of readings taken both
from different sensors and from different positions of the robot are
progressively incorporated into the sonar map. As more readings are
added the area deduced to be empty expands, and the expanding empty
area encroaches on and sharpens the possibly occupied region.  The map
becomes gradually more detailed.

For navigation and recognition we developed a way of convolving two
sonar maps, giving the displacement and rotation that best brings one
map into registration with the other, along with a measure of the
goodness of the match.

The sonar maps happen to be very useful for motion planning. They are
denser than those made by our stereo vision programs, and
computationally about an order of magnitude faster to produce.  We
presently use them with the Path Relaxation method @cite(thorpe84) to
plan local paths for our robot.

@subsection(Related Work)

Sonar is a well-developed technology, but until recently few
applications involved detailed map building.  Traditional marine applications,
camera autofocus systems, and some simple robot navigation schemes
@cite(chattergy, miller) rely on sparse proximity measurements to
accomplish their narrow goals.

The most advanced sonar systems used in marine intelligence operations
locate sound sources passively @cite(baggeroer).  Ultrasound systems
used in medicine are typically active and build maps for human
perusal, but depend on accurate physical models of the environments
that the sound traverses @cite(hussey), and work with very small beam
widths, about 1@+(o) - 3@+(o).  Narrow beam widths, formed by phased
array techniques, are also used in advanced side-looking mapping sonar
systems for submersibles. An independent CMU sonar mapping effort
@cite(crowley) also used a narrow beam, formed by a parabolic
reflector, in its attempts to build a line-based description.

In contrast, the sonar sensors that we chose to employ have a wide
beam, with an effective angle of about 30@+(o).

@section(The Sonar System)

@subsection(The Sensor)

The sonar devices being used are Polaroid laboratory grade ultrasonic
transducers @cite(polaroid). These sonar elements have a useful
measuring range of 0.9 to 35.0 ft.  The main lobe of the sensitivity
function corresponds to a beam angle of 30@+(o) at @m{-38} dB.
Experimental results showed that the range accuracy of the sensors is
on the order of 1%.  We are using the control circuitry provided with
the unit, which is optimized for giving the range of the nearest sound
reflector in its field of view, and works well for this purpose.

@subsection(The Array)

The sonar array, built at Denning Mobile Robotics and mounted on the
@i(Neptune) mobile robot, is composed of:

@begin(itemize)
A ring of 24 Polaroid sonar elements, spaced 15@+(o) apart, and
mounted at a height of 31 inches above the ground (see Fig. 
@ref(Neptunerobot)).

A controlling Z80 microprocessor, which selects and fires
the sensors, times the returns and provides a range value.

A serial line over which range information is sent to a VAX
mainframe that presently interprets the sonar data and performs
the higher level mapping and navigation functions.
@end(itemize)

@begin(fullpagefigure)
@blankspace(8 in)


@caption[The @i(Neptune) mobile robot, with a pair of cameras and the
sonar ring.]

@tag(Neptunerobot)
@end(fullpagefigure)

@section(Sonar Mapping)

@subsection(Obtaining Reliable Range Data from the Sonar Sensor)

We begin our map building by preprocessing the incoming readings to
remove chronic errors. The following steps are used:

@begin(itemize)

@b(Thresholding): Range readings above a certain maximum @i(R@-(u))
are discarded. We observe that sonar readings caused by specular
reflections are often near the maximum range of the device
(@i(R@-(max))). With @i(R@-(u)) slightly below @i(R@-(max)), many of
these readings are discarded.  The system becomes slightly myopic, but
the overall quality of the map improves.  Very large open spaces are
detected by analyzing the set of distance values obtained from the
sonar, and in this case the filtering is not done.  A similar
heuristic is applied for small readings: values below the minimum
sensor range @i(R@-(min)) are usually glitches and are discarded.

@b(Averaging): Several independent readings from the same sensor at
the same position are averaged.  The sonar readings are subject to
error not only from reflections but also from other causes such as
fluctuations in the effective sensitivity of the transducer. As a
result readings show a certain dispersion.  Averaging narrows the
spread.

@b(Clustering:) A set of readings from one sensor at a given position
sometimes shows a clustering of the data around two different mean
values.  This happens when the readings originate from objects at
staggered distances. We apply a simple clustering analysis
to the data, and extract a mean value for each cluster for use in
subsequent processing.

@end(itemize)
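As an illustration, the three preprocessing steps might be sketched as
follows; this is a minimal sketch, not our implementation, and the
threshold value R_U and the cluster-splitting gap are assumed
placeholders.

```python
# Hypothetical sketch of the preprocessing pipeline.  R_MIN and R_MAX
# are the sensor limits quoted in the text; R_U and cluster_gap are
# assumed values for illustration only.
R_MIN = 0.9        # minimum sensor range (ft)
R_MAX = 35.0       # maximum sensor range (ft)
R_U = 34.0         # threshold slightly below R_MAX (assumed value)

def preprocess(readings, cluster_gap=1.0):
    """Threshold, cluster, and average a set of readings from one
    sensor at one position; returns one mean range per cluster."""
    # Thresholding: drop probable specular echoes (near R_MAX) and
    # glitches (below R_MIN).
    valid = [r for r in readings if R_MIN <= r <= R_U]
    if not valid:
        return []
    # Clustering: split the sorted readings wherever consecutive
    # values differ by more than cluster_gap (a simple 1-D analysis).
    valid.sort()
    clusters = [[valid[0]]]
    for r in valid[1:]:
        if r - clusters[-1][-1] > cluster_gap:
            clusters.append([])
        clusters[-1].append(r)
    # Averaging: one mean value per cluster narrows the dispersion.
    return [sum(c) / len(c) for c in clusters]
```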


@subsection(Representing the Sonar Beam)

Because of the wide beam angle the filtered data from the above
methods provides only indirect information about the location of the
detected objects.  We combine the constraints from individual readings
to reduce the uncertainty.  Our inferences are represented as
probabilities in a discrete grid.

A range reading is interpreted as providing information about space
volumes that are probably @c(EMPTY) and somewhere @c(OCCUPIED).  We
model the sonar beam by probability distribution
functions. Informally, these functions model our confidence that the
various points inside the cone of the beam are empty and our
uncertainty about the location of the point, somewhere on the range
surface of the cone, that caused the echo. The functions are based on
the range reading and on the spatial sensitivity pattern of the sonar.

Consider a
position @b(P) = (x,y,z) belonging to the volume swept by the sonar beam.
Let:

@begin(format)
	@i(R) be the range measurement returned by the sonar sensor, 
	@g(e) be the mean sonar deviation error, 
	@g(w) be the beam aperture, 
	@b(S) = (x@-(s), y@-(s), z@-(s)) be the position of the sonar sensor,
	@g(d) be the distance from @b(P) to @b(S), 
	@g(q) be the angle between the main axis of the beam and @b(SP).
@end(format)

We now identify two regions in the sonar beam:

@begin(itemize)
@b(Empty Region:) Points inside the sonar beam (
@m{@g(d) @lt R - @g(e)} and
@m{@g(q) @lte @g(w)/2} ) that have a probability @m{p@-(E) =
 f@-(E)(@g(d),@g(q))} of being empty. 

@b(Somewhere Occupied Region:) Points on the sonar beam front (
@m{@g(d) @in() [ R - @g(e) , R + @g(e) ]} and @m{@g(q) @lte @g(w)/2}
) that have a probability @m{p@-(O) = f@-(O)(@g(d),@g(q))} of being
occupied.

@end(itemize)

Fig. @ref(sonarprofiles) shows the probability profiles for a sonar
beam that returned a range reading @i(R).  The horizontal cross-section
of the beam is associated with two probability distributions
corresponding to the empty and the occupied probabilities.

@begin(figure)

@presspicture(file = "sonarprofr.press", height = 5 in)

@caption[The Probability Profiles corresponding to the probably Empty
and somewhere Occupied regions in the sonar beam. The profiles
correspond to a horizontal cross-section of the beam.]
@tag(sonarprofiles)
@end(figure)

The empty probability density function for a point @b(P) inside the sonar
beam is given by:

@begin(equation)
@tag(e1)@m{p@-(E)(x,y,z) =
  p[ @wr{position} (x,y,z) @wr{is} @wr{empty} ] =
  E@-(r)(@g(d)) : E@-(a)(@g(q))}
@end(equation)

where:

@begin(equation)
@tag(e2)@m{E@-(r)(@g(d)) =
  1 - ((@g(d) - R@-(min))/(R - @g(e) - R@-(min)))@up(2)}
    for @m{@g(d) @in() [R@-(min), R - @g(e)]}, and
@m{E@-(r)(@g(d)) = 0}   otherwise.
@end(equation)

And:

@begin(equation)
@tag(e3)@m{E@-(a)(@g(q)) =
 1 - (2@g(q)/@g(w))@up(2)} for @m{@g(q) @in() [-@g(w) / 2, @g(w) / 2]}
@end(equation)

The occupied probability density function for a point @b(P) on the
beam front is given by:

@begin(equation)
@tag(e4)@m{p@-(O)(x,y,z) =
  p[ @wr{position} (x,y,z) @wr{is} @wr{occupied} ] =
  O@-(r)(@g(d)) : O@-(a)(@g(q))}
@end(equation)

where:


@begin(equation)
@tag(e5)@m{O@-(r)(@g(d)) =
  1 - ((@g(d) - R)/@g(e))@up(2)}
      for @m{@g(d) @in() [R - @g(e), R + @g(e)]}, and
@m{O@-(r)(@g(d)) = 0}   otherwise.
@end(equation)

And:

@begin(equation)
@tag(e6)@m{O@-(a)(@g(q)) =
   1 - (2@g(q)/@g(w))@up(2)} for @m{@g(q) @in() [-@g(w) / 2, @g(w) / 2]}
@end(equation)

These probability density functions are now projected onto a horizontal
plane to generate map information.  We use the profiles that
correspond to a horizontal section of the sonar beam (@m{z = z@-(s)}).
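Equations (1) through (6) can be transcribed directly; the sketch
below is illustrative only, and the value chosen for the deviation
error EPS is an assumed placeholder.

```python
import math

# Direct transcription of the beam model: delta is the distance from
# the sensor to the point, theta the angle off the main beam axis,
# and R the range returned by the sensor.
R_MIN = 0.9                   # minimum sensor range (ft)
EPS = 0.1                     # mean sonar deviation error (assumed)
OMEGA = math.radians(30.0)    # beam aperture

def p_empty(delta, theta, R):
    """p_E = E_r(delta) * E_a(theta), nonzero inside the cone and
    short of the range surface (equations 1-3)."""
    if not (R_MIN <= delta <= R - EPS) or abs(theta) > OMEGA / 2:
        return 0.0
    e_r = 1.0 - ((delta - R_MIN) / (R - EPS - R_MIN)) ** 2
    e_a = 1.0 - (2.0 * theta / OMEGA) ** 2
    return e_r * e_a

def p_occupied(delta, theta, R):
    """p_O = O_r(delta) * O_a(theta), nonzero on the range surface of
    the cone (equations 4-6)."""
    if not (R - EPS <= delta <= R + EPS) or abs(theta) > OMEGA / 2:
        return 0.0
    o_r = 1.0 - ((delta - R) / EPS) ** 2
    o_a = 1.0 - (2.0 * theta / OMEGA) ** 2
    return o_r * o_a
```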

@subsection(Representing Maps)

Sonar Maps are two-dimensional arrays of cells corresponding to a
horizontal grid imposed on the area to be mapped. The grid has @m{M *
N} cells, each of size @m{@g(D) * @g(D)}.  The final map has cell
values in the range (@m(-)1,1), where values less than 0 represent
probably empty regions, exactly zero represents unknown occupancy, and
greater than 0 implies a probably occupied space.  This map is
computed in a final step from two separate arrays analogous to the
empty and occupied probability distributions introduced above.

A cell is considered @c(UNKNOWN) if no information concerning it is
available.  Cells can be @c(EMPTY) with a degree of certainty or
confidence
 @m{Emp(@b(X), @b(Y))}, and @c(OCCUPIED) with a degree of certainty
 @m{Occ(@b(X), @b(Y))}, both values ranging from 0 to 1.

The @i(a priori) empty and occupied certainty values for a given grid
cell @m{(@b(X), @b(Y))} and reading are determined by taking the
minimum of the reading's @m{p@-(E)} and the maximum of @m{p@-(O)},
respectively, over the cell, evaluated on a horizontal slice through
the beam center.
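One way to compute these per-reading cell values is to sample the
beam functions over the cell area; the sketch below is a hypothetical
illustration (the sampling density, the cell size, and the assumption
that the beam axis lies along +x are all placeholders).

```python
import math

def cell_certainties(cx, cy, sx, sy, R, p_empty, p_occupied,
                     d=0.25, k=3):
    """Evaluate the beam functions p_empty/p_occupied at k*k sample
    points across a cell of side d centered at (cx, cy); the sensor
    sits at (sx, sy) with its beam axis along +x (an assumption), and
    R is the range reading.  Returns (min p_E, max p_O) per the text."""
    emps, occs = [], []
    for i in range(k):
        for j in range(k):
            x = cx - d / 2 + d * (i + 0.5) / k
            y = cy - d / 2 + d * (j + 0.5) / k
            delta = math.hypot(x - sx, y - sy)       # distance to sensor
            theta = math.atan2(y - sy, x - sx)       # angle off beam axis
            emps.append(p_empty(delta, theta, R))
            occs.append(p_occupied(delta, theta, R))
    return min(emps), max(occs)
```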

@subsection(Composing Information from Several Readings)

The map is built by projecting the beam probabilities onto the
discrete cells of the sonar map and there combining them with
information from other beams.  The position and orientation of the
sonar sensor are used to register each beam with the map.

Different readings asserting that a cell is @c(EMPTY) will enhance
each other, as will readings implying that the cell may be
@c(OCCUPIED); evidence that the cell is @c(EMPTY), however, will
weaken the certainty of it being @c(OCCUPIED), and vice-versa.

The operations performed on the empty and occupied probabilities are
not symmetrical. The probability distribution for @i(empty) areas
represents a solid volume whose totality is probably empty, but the
@i(occupied) probability distribution for a single reading represents
our lack of knowledge about the location of a single reflecting point
somewhere in the range of the distribution. Empty regions are simply
added using a probabilistic addition formula. The @i(occupied)
probabilities for a single reading, on the other hand, are reduced in
the areas that the other data suggest are empty, then normalized to
make their sum unity. Only after this narrowing process are the
@i(occupied) probabilities from each reading combined using the
addition formula.

One range measurement contains only a small amount of information.  By
combining the evidence from many readings as the robot moves in its
environment, the area known to be empty is expanded. The number of
regions somewhere containing an occupied cell increases, while the
range of uncertainty in each such region decreases.  The overall
effect as more readings are added is a gradually increasing coverage
along with an increasing precision in the object locations.  Typically
after a few hundred readings (and less than a second of computer time)
our process is able to "condense out" a comprehensive map covering a
thousand square feet with better than one foot position accuracy of
the objects detected.  Note that such a result does not violate
information theoretic or degree of freedom constraints, since the
detected boundaries of objects are linear, not quadratic in the
dimensions of the map.  A thousand square foot map may contain only a
hundred linear feet of boundary.

Formally the evidence combination process proceeds along the following
steps:

@begin(enumerate)

@c(Reset:) Initially, the whole map is reset to @c(UNKNOWN) by setting 
@m{Emp(@b(X), @b(Y)) @r(:)= 0} and @m{Occ(@b(X), @b(Y)) @r(:)= 0}.

@begin(multiple)
@c(Superposition of empty areas:)
For every sonar reading @i(k), modify the emptiness information by:

@c(ENHANCE): @m{Emp(@b(X), @b(Y)) @r(:)=
  Emp(@b(X), @b(Y)) + Emp@-(k)(@b(X), @b(Y)) -
  Emp(@b(X), @b(Y)) * Emp@-(k)(@b(X), @b(Y))}
@end(multiple)

@begin(multiple)
@c(Superposition of occupied areas:) For each reading @i(k), shift the
occupied probabilities around in response to the combined emptiness map
using:

@c(CANCEL): @m{Occ@-(k)(@b(X), @b(Y)) @r(:)=
    Occ@-(k)(@b(X), @b(Y)) #:# (1 - Emp(@b(X), @b(Y)))}

@c(NORMALIZE): @m{Occ@-(k)(@b(X), @b(Y)) @r(:)=
    Occ@-(k)(@b(X), @b(Y)) / @sum() Occ@-(k)(@b(X), @b(Y))}

@c(ENHANCE): @m{Occ(@b(X), @b(Y)) @r(:)=
    Occ(@b(X), @b(Y)) + Occ@-(k)(@b(X), @b(Y)) -
    Occ(@b(X), @b(Y)) * Occ@-(k)(@b(X), @b(Y))}
@end(multiple)

@begin(multiple)
@c(Thresholding:) The final occupation value attributed to a cell is
given by a thresholding method:

@c(THRESHOLD: )@ Map (@b(X), @b(Y)) @r(:)=
  @brace(top=<Occ(@b(X), @b(Y)) @wr(if) Occ(@b(X), @b(Y)) @gte Emp(@b(X),
  @b(Y))>,
  bot=<- Emp(@b(X), @b(Y)) @wr(if) Occ(@b(X), @b(Y)) @lt Emp(@b(X),
  @b(Y))>)
@end(multiple)

@end(enumerate)
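The steps above can be sketched as follows; this is a minimal
illustration on a dict-based grid, where the per-reading arrays
@i(Emp@-(k)) and @i(Occ@-(k)) are supplied directly rather than
projected from beams.

```python
def combine(readings, m, n):
    """readings: list of (emp_k, occ_k) pairs, each a dict mapping
    cell (x, y) -> certainty, for an m-by-n grid."""
    # RESET: the whole map starts UNKNOWN.
    emp = {(x, y): 0.0 for x in range(m) for y in range(n)}
    occ = {(x, y): 0.0 for x in range(m) for y in range(n)}
    # Superposition of empty areas (probabilistic addition).
    for emp_k, _ in readings:
        for cell, v in emp_k.items():
            emp[cell] = emp[cell] + v - emp[cell] * v          # ENHANCE
    # Superposition of occupied areas, narrowed by the empty map.
    for _, occ_k in readings:
        occ_k = {c: v * (1.0 - emp[c]) for c, v in occ_k.items()}  # CANCEL
        total = sum(occ_k.values())
        if total > 0.0:
            occ_k = {c: v / total for c, v in occ_k.items()}   # NORMALIZE
        for cell, v in occ_k.items():
            occ[cell] = occ[cell] + v - occ[cell] * v          # ENHANCE
    # THRESHOLD: occupied wins where Occ >= Emp; otherwise the cell
    # carries the negated empty certainty.
    return {c: (occ[c] if occ[c] >= emp[c] else -emp[c]) for c in emp}
```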


@subsection(Maps)

A typical map obtained through this method is shown in
Fig. @ref(2DMap), and the corresponding certainty factor distributions
are shown in Figs.  @ref(3DOccupied) and @ref(3DEmpty).  These are the
maps obtained before the thresholding step.

@comment[Fig. 1: A 2-D map]
@begin(fullpagefigure)  
@comment{With this ridiculous mathlm font, bar() doesn't work

@presspicture(file = "map2d.press",
		height = 3.5 in)

@caption[A Two-Dimensional Sonar Map. @i(Empty) areas with a high
certainty factor are represented by white areas; lower certainty
factors by "@m{@add}" symbols of increasing thickness. @i(Occupied)
areas are represented by "@m{@mult}" symbols, and @i(Unknown) areas by
"@m{:}" . The position of the robot is shown by a circle and the
outline of the room and of the major objects by a solid line. ]

@tag(2DMap)
@comment{

@end(fullpagefigure)

@begin(fullpagefigure)
@comment[Fig. 2: Occupied areas (3D)]
@comment{

@presspicture(
	file = "occ3d.press",	height = 4.6 in)

@caption[The Occupied Areas in the Sonar Map. This 3-D view shows the
Certainty Factors @i(Occ)(@i(@b(X) ,@b(Y))).]
@tag(3DOccupied)
@comment{

@end(fullpagefigure)

@begin(fullpagefigure)
@comment[Fig. 3: Empty areas (3D)]
@comment{

@presspicture(
	file = "emp3drev.press",height = 5.7 in)

@caption[The Empty Areas in the Sonar Map. This 3-D view shows the 
Certainty Factors @i(Emp)(@i(@b(X),@b(Y))).]
@tag(3DEmpty)
@comment{

@end(fullpagefigure)

The final maps obtained after thresholding are shown in Figs.
@ref(2DMapThreshold), @ref(3DOccupiedThreshold) and
@ref(3DEmptyThreshold).

@comment[Fig. : A 2-D thresholded map]
@begin(fullpagefigure)  
@comment{With this ridiculous mathlm font, bar() doesn't work

@presspicture(file = "map2dthr.press",height = 3.5 in)

@caption(The Two-Dimensional Sonar Map After Thresholding.)

@tag(2DMapThreshold)
@comment{

@end(fullpagefigure)

@begin(fullpagefigure)
@comment[Fig. 2: Thresholded Occupied areas (3D)]
@comment{

@presspicture(
	file = "occ3dthr.press",height = 4.6 in)

@caption(The Occupied Areas in the Sonar Map After Thresholding.)
@tag(3DOccupiedThreshold)
@comment{

@end(fullpagefigure)

@begin(fullpagefigure)
@comment[Fig. 3: Empty thresholded areas (3D)]
@comment{

@presspicture(
	file = "emp3drevthr.press",	height = 5.7 in)

@caption(The Empty Areas in the Sonar Map After Thresholding.)
@tag(3DEmptyThreshold)
@comment{

@end(fullpagefigure)



@section(Matching)

	Sonar navigation would benefit from a procedure that can match
two maps and report the displacement and rotation that best takes one
into the other.

	Our most successful programs begin with the thresholded maps
described above, with cell values that are negative if the cell is
empty, positive if occupied and zero if unknown.

	A measure of the goodness of the match between two maps at a
trial displacement and rotation is found by computing the sum of
products of corresponding cells in the two maps.  An occupied cell
falling on an occupied cell contributes a positive increment to the
sum, as does an empty cell falling on an empty cell (the product of
two negatives). An empty cell falling on an occupied one reduces the
sum, and any comparison involving an unknown value causes neither an
increase nor a decrease.  This naive approach is very slow. Applied
to maps with a linear dimension of @i(n), each trial position requires
@p(O)(@i(n@+(2))) multiplications. Each search dimension (two axes of
displacement and one of rotation) requires @p(O)(@i(n)) trial
positions. The total cost of the approach thus grows as
@p(O)(@i(n)@+(5)). With a typical @i(n) of 50 this approach can burn
up a good fraction of an hour of Vax time.

Considerable savings come from the observation that most of the
information in the maps is in the occupied cells alone.  Typically
only @p(O)(@i(n)) cells in the map, corresponding to wall and object
boundaries, are labelled occupied. A revised matching procedure
compares maps A and B through trial transformation @i(T) (represented
by a 2x2 rotation matrix and a 2 element displacement vector) by
enumerating the occupied cells of A and transforming the co-ordinates
of each such cell through @i(T) to find the corresponding cell in B. The
[A, B] pairs obtained this way are multiplied and summed, as in the
original procedure.  The occupied cells in B are enumerated and
multiplied with corresponding cells in A, found by transforming the B
co-ordinates through @i(T)@+(@ @m(-1)) (the inverse function of
@i(T)), and these products are also added to the sum. The result is
normalized by dividing by the total number of terms.  This procedure
is implemented efficiently by preprocessing each sonar map to give
both a raster representation and a linear list of the co-ordinates of
occupied cells. The cost grows as @p(O)(@i(n)@+(4)), and the typical
Vax running time is down to a few minutes.
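The revised procedure can be sketched as follows; this is an
illustrative fragment, and the dict-based map representation and the
rounding of transformed co-ordinates to the nearest cell are
assumptions of the sketch.

```python
import math

# Maps are dicts (x, y) -> value: negative for empty, positive for
# occupied, absent (treated as zero) for unknown.  occ_a and occ_b
# are the precomputed lists of occupied-cell co-ordinates.
def match(map_a, occ_a, map_b, occ_b, dx, dy, angle):
    """Goodness of the match between map_a and map_b under the trial
    transformation T = rotation by angle followed by (dx, dy)."""
    c, s = math.cos(angle), math.sin(angle)
    def fwd(x, y):          # T: A co-ordinates -> B co-ordinates
        return round(c * x - s * y + dx), round(s * x + c * y + dy)
    def inv(x, y):          # T^-1: B co-ordinates -> A co-ordinates
        x, y = x - dx, y - dy
        return round(c * x + s * y), round(-s * x + c * y)
    total, terms = 0.0, 0
    # Enumerate occupied cells of A against transformed cells of B...
    for (x, y) in occ_a:
        total += map_a[(x, y)] * map_b.get(fwd(x, y), 0.0)
        terms += 1
    # ...and occupied cells of B against cells of A through T^-1.
    for (x, y) in occ_b:
        total += map_b[(x, y)] * map_a.get(inv(x, y), 0.0)
        terms += 1
    # Normalize by the total number of terms.
    return total / terms if terms else 0.0
```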

	A further speedup is achieved by generating a hierarchy of
reduced resolution versions of each map.  A coarser map is produced
from a finer one by converting two by two subarrays of cells in the
original into single cells of the reduction. Our existing programs
assign the maximum value found in the subarray as the value of the
result cell, thus preserving occupied cells.  If the original array
has dimension @i(n), the first reduction is of size @i(n)/2, the
second of @i(n)/4 and so on. A list of occupied cell locations is
produced for each reduction level so that the matching method of the
previous paragraph can be applied. The maximum number of reduction
levels is @b(log)@-(2)@i(n). A match found at one level can be refined
at the next finer level by trying only about three values of each of
the two translational and one rotational parameters, in the vicinity
of the values found at the coarser level, for a total of 27
trials. With a moderate @i(a priori) constraint on the transformation this
amount of search is adequate even at the first (coarsest) level.
Since the cost of a trial evaluation is proportional to the dimension
of the map, the coarse matches are inexpensive in any case.  Applied
to its fullest, this method brings the matching cost down to slightly
larger than @p(O)(@i(n)), and typical Vax times to under a second.
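The reduction step can be sketched as follows, assuming square maps
whose dimension is a power of two; the list-of-lists representation is
an illustrative choice.

```python
def reduce_map(grid):
    """Halve the resolution by converting each two-by-two subarray
    into one cell holding the subarray maximum, so that occupied
    (positive) cells survive the reduction."""
    half = len(grid) // 2
    return [[max(grid[2 * i][2 * j], grid[2 * i][2 * j + 1],
                 grid[2 * i + 1][2 * j], grid[2 * i + 1][2 * j + 1])
             for j in range(half)] for i in range(half)]

def pyramid(grid):
    """Full hierarchy from dimension n down to 1, giving the
    log2(n) reduction levels used for coarse-to-fine matching."""
    levels = [grid]
    while len(levels[-1]) > 1:
        levels.append(reduce_map(levels[-1]))
    return levels
```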

	We found that one further preprocessing step is required to
make the matching process work in practice. Raw maps at standard
resolutions (6 inch cells) produced from moderate numbers (about 100)
of sonar measurements have narrow bands of cells labelled
@i(occupied).  In separately generated maps of the same area the
relative positions of these narrow bands shift by as much as several
pixels, making good registration of the occupied areas of the two maps
impossible.  This can be explained by saying that the high spatial
frequency component of the position of the bands is noise; only the
lower frequencies carry information.  The problem is fixed by
filtering (blurring) the occupied cells to remove the high frequency
noise.  Experiment suggests that a map made from 100 readings should
be blurred with a spread of about two feet, while for a map made from
200 readings a one foot smear is adequate.  Blurring increases the
number of cells labelled @i(occupied). So as not to increase the
computational cost from this effect, only the final raster version of
the map is blurred. The occupied cell list used in the matching
process is still made from the unfiltered raster.
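The blurring step might be sketched as follows; the square
neighborhood and the maximum-preserving smear are illustrative
assumptions rather than the exact filter we use (a two-foot spread
would correspond to a radius of about four six-inch cells).

```python
def blur_occupied(grid, occ_cells, radius):
    """Return a copy of the raster with each occupied cell smeared
    over a (2*radius+1)-square neighborhood, keeping the maximum
    value; the input raster and occupied-cell list are untouched."""
    n = len(grid)
    out = [row[:] for row in grid]
    for (i, j) in occ_cells:
        v = grid[i][j]
        for di in range(-radius, radius + 1):
            for dj in range(-radius, radius + 1):
                if 0 <= i + di < n and 0 <= j + dj < n:
                    out[i + di][j + dj] = max(out[i + di][j + dj], v)
    return out
```

Because only the raster copy is blurred, the unfiltered occupied-cell list can still drive the matching process at no extra cost.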

With the process outlined here, maps with about 3000 six inch cells
made from 200 well spaced readings can be matched with an accuracy of
about six inches displacement and three degrees rotation in one second
of Vax time.

@section(Results)

We incorporated the sonar map builder into a system that successfully
navigates the Neptune robot through cluttered obstacle courses.  The
existing program incrementally builds a single sonar map by combining
the readings from successive vehicle stops made about one meter apart.
Navigation is by dead reckoning; we do not yet use the sonar map
matching code. The next move is planned in the most recent version of
the map by a path-planning method based on path relaxation
@cite(thorpe84).  Since this method can cope with a probabilistic
representation of occupied and empty areas and does path-planning in a
grid, it fits naturally into our present framework. The system has
successfully driven Neptune the length of our cluttered 30 by 15 foot
laboratory using less than one minute of computer time.

@section(Conclusions)

We have described a program that builds moderately high resolution
spatial maps of a mobile robot's surroundings by combining several
hundred range readings from unadorned Polaroid ultrasonic units.  The
main innovation is an efficient mathematical method that reduces the
position uncertainty of objects detected by wide angle sonar beams by
combining interlocking constraints in a raster occupation probability
map. We have also developed a fast algorithm for relating two maps of
the same area to derive relative displacement, angle and goodness of
match.

We have used this mapping method in a system that navigates a mobile
robot to a desired destination through obstacles and clutter, and are
preparing a more elaborate navigation system that depends on matching
of the sonar maps to recognize key locations and on higher-level
representations to navigate over long distances.

@section(Acknowledgments)

The authors would like to thank Gregg W. Podnar for his help in using
the Neptune robot and performing some of the experiments, Michael
Fuhrman for valuable advice on the use of a 3D drawing package, and
Larry Matthies for providing the means to print the corresponding
drawings.