Expedition Demonstration: Robotic Classification of Rocks

The agenda for the December 1998 Antarctic field season is to test the meteorite classifier under true field conditions: that is, using data acquired from new rocks by the robot-mounted sensors, with limited human input in the data-collection process. This will establish conclusively whether the current system (sensors and classifier) is practical for field robotic deployment and suitable for robotic meteorite search.
The Nomad robot will be driven through the areas of Independence Moraine where data and rock samples were acquired last season. Images and spectral data from new samples will be collected and fed to the classifier for assessment. A sample will be taken from each rock so examined, and the classifier output compared with the assessment of a field geologist accompanying the expedition.

In its meteorite search mode, Nomad will be given way-points or allowed to wander freely through an area. The user will monitor the panoramic images on the operator interface and, if a certain area looks interesting, can direct the robot to go there. Once closer to the region of interest, the operator will select a particular rock and enter it into the database. If necessary, the operator can command the robot to close further with the object, either through strict teleoperation or through commands to the maneuver planner. Once in position, the user can command a high-resolution image of the object. Using a radio, the remote user can communicate with the sensor operator to place the spectrometer or magnetometer on the rock of interest. The user then issues a command to the appropriate sensor to take a sample. When finished, the robot can resume wandering or go to another way-point at the user's discretion. Data processing and classification analysis will be performed in real time on board Nomad.
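The search-mode workflow described above can be viewed as a small state machine driven by operator commands. The following sketch illustrates the idea; the mode names and command strings are assumptions for illustration, not the actual operator-interface protocol.

```python
from enum import Enum, auto

class Mode(Enum):
    WANDER = auto()           # free roaming through an area
    GOTO_WAYPOINT = auto()    # heading to an operator-specified way-point
    APPROACH_TARGET = auto()  # closing with a selected rock
    SENSE = auto()            # in position; taking sensor measurements

def next_mode(mode, command):
    """Advance the search-mode state machine on a (hypothetical) operator command."""
    transitions = {
        (Mode.WANDER, "goto"): Mode.GOTO_WAYPOINT,
        (Mode.WANDER, "select_target"): Mode.APPROACH_TARGET,
        (Mode.GOTO_WAYPOINT, "select_target"): Mode.APPROACH_TARGET,
        (Mode.APPROACH_TARGET, "sample"): Mode.SENSE,
        (Mode.SENSE, "resume"): Mode.WANDER,       # resume wandering when finished
        (Mode.SENSE, "goto"): Mode.GOTO_WAYPOINT,  # or go to another way-point
    }
    # Unrecognized commands leave the mode unchanged.
    return transitions.get((mode, command), mode)
```

Keeping mode changes in one transition table makes it easy to audit which operator commands are valid in each mode.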
A Bayes-network-based rock/meteorite classifier will be implemented using the visual imagery and reflectance spectra of rocks obtained in the field. The classifier is capable of identifying and discriminating among the various local rocks (mostly marbles, limestones, granites, quartzites, and some igneous rocks) as well as meteorites and subclasses thereof (achondrites, chondrites, and iron meteorites).
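The actual classifier is a full Bayes network over many image and spectral features. As a much-simplified illustration of the underlying probabilistic idea, the sketch below applies Bayes' rule to a single hypothetical feature (albedo) with made-up class-conditional Gaussian parameters and priors; none of these numbers come from the real system.

```python
import math

# Hypothetical (mean, std) of albedo per class -- illustration only.
CLASSES = {
    "meteorite": (0.25, 0.05),  # dark fusion crust: low albedo
    "marble":    (0.70, 0.10),
    "granite":   (0.50, 0.08),
}
# Hypothetical priors reflecting that meteorites are rare among local rocks.
PRIORS = {"meteorite": 0.01, "marble": 0.60, "granite": 0.39}

def gaussian(x, mu, sigma):
    """Gaussian probability density at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior(albedo):
    """P(class | albedo) via Bayes' rule: prior times likelihood, normalized."""
    joint = {c: PRIORS[c] * gaussian(albedo, mu, s) for c, (mu, s) in CLASSES.items()}
    total = sum(joint.values())
    return {c: p / total for c, p in joint.items()}
```

Even with a tiny prior, a sufficiently distinctive measurement (a very dark object) can make "meteorite" the most probable class, which is the behavior a Bayes-network classifier exploits across many features at once.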

Thus far, the best results have come from spectral data spanning the visible to the infrared portions of the spectrum (350 nm - 2500 nm), in spite of significant variations due to cloud cover and other atmospheric conditions that affect the quality of the spectra. When confined to the visible wavelengths only (400 nm - 900 nm), approximately 90% of rock samples and 90% of meteorites were correctly classified as either rocks or meteorites in last expedition's field results. For the task of identifying rock types, approximately 5% of rocks are misclassified using spectral data, with limestones and marbles being the most likely to be confused. Since both consist of the same minerals, this is not surprising.
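Figures like the ones above amount to per-class accuracies tallied from (true label, predicted label) pairs. A minimal sketch, using hypothetical results chosen to show a limestone/marble confusion:

```python
from collections import Counter

def per_class_accuracy(pairs):
    """Fraction of correctly classified samples for each true class.

    pairs: iterable of (true_label, predicted_label) tuples.
    """
    correct, total = Counter(), Counter()
    for truth, pred in pairs:
        total[truth] += 1
        if truth == pred:
            correct[truth] += 1
    return {c: correct[c] / total[c] for c in total}

# Hypothetical field results: 10 marbles, 10 limestones, with the two
# classes occasionally confused for each other.
results = ([("marble", "marble")] * 9 + [("marble", "limestone")] +
           [("limestone", "limestone")] * 8 + [("limestone", "marble")] * 2)
```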

The Patriot and Independence Hills regions of Antarctica are not believed to harbor meteorites. Therefore, to test the classifier's utility for meteorite search, an area will have to be seeded with meteorites. A geologist examining high-resolution images returned by the rover will designate possible candidates, which the robot's classifier will then tackle. Success will be the robot correctly classifying 90% of the meteorites and rejecting 95% of all rocks. In addition, the robot will be used to gather more data for further refining the classifier. An expedition further inland from Patriot Hills to the polar plateau to search for meteorites will be an opportunity to acquire data from different geographical areas, particularly ones where meteorites may be found. A hand-carried version of Nomad's spectrometer and a digital camera will be used for this.

This season, a human operator will control the prototype meteorite search system. Therefore, a user interface has been created that allows control and testing of nearly every component in the science system. Although it can track the status of most science-system components, the final implementation of the user interface deals most directly with three systems.

First, the user can control the mission planner by specifying a robot task mode: go to a way-point, execute a coverage pattern, or take sensor measurements. The user can also control the details of how the mission planner coordinates the science system, such as setting way-point tolerance, robot field of view, and coverage-pattern type.

Second, the user interface allows access to the database. While the interface does not need the ability to update records in the database, the user must be able to look up information such as target images or classifier probabilities.

Finally, the user must control the pan/tilt camera to select new rock targets. This is one of the most critical functions of the interface. It can pan, tilt, and focus the camera until an acceptable initial, or "template", image is taken. The user clicks on the target in the template image, and the pan/tilt software inserts this data into the database. The user is then prompted to select a high-resolution image of the same target, zooming in to obtain a close-up view. This image, along with other information, is then inserted into the database for the classifier to use.
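The target-selection step ends with a database insert: the camera pose, the operator's click in the template image, and the close-up image all go into one record. The sketch below shows the shape of such an insert using an in-memory SQLite database; the table layout and column names are assumptions, not the actual schema.

```python
import sqlite3

# Minimal stand-in for the target database (schema is hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE targets (
        id INTEGER PRIMARY KEY,
        pan REAL, tilt REAL,               -- camera pose for the template image
        click_x INTEGER, click_y INTEGER,  -- operator's click in the template
        hires_image BLOB                   -- close-up image for the classifier
    )
""")

def insert_target(pan, tilt, click_x, click_y, hires_image):
    """Record a newly selected rock target and return its database id."""
    cur = conn.execute(
        "INSERT INTO targets (pan, tilt, click_x, click_y, hires_image) "
        "VALUES (?, ?, ?, ?, ?)",
        (pan, tilt, click_x, click_y, hires_image),
    )
    conn.commit()
    return cur.lastrowid
```

Returning the row id lets later steps (classification results, sensor readings) be attached to the same target record.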


Robotic Search for Antarctic Meteorites 1998
All material on this page is property of NASA and Carnegie Mellon University. Any image or text
taken from this site and incorporated into another document without consent violates the Copyright
Law of the United States and the Berne International Copyright Agreement.
Send comments, questions, or suggestions to Dimitrios Apostolopoulos.
This document prepared by Michael Wagner.