NASA's two Mars Exploration Rovers (MER) have successfully demonstrated a robotic Visual Odometry capability on another world for the first time. This provides each rover with accurate knowledge of its position, allowing it to autonomously detect and compensate for any unforeseen slip encountered during a drive. It has enabled the rovers to drive safely and more effectively in highly sloped and sandy terrains and has resulted in increased mission science return by reducing the number of days required to drive into interesting areas. The MER Visual Odometry system comprises onboard software for comparing stereo pairs taken by the pointable mast-mounted 45 deg FOV Navigation cameras (NAVCAMs). The system computes an update to the 6 degree of freedom rover pose (x, y, z, roll, pitch, yaw) by tracking the motion of autonomously selected terrain features between two pairs of 256 × 256 stereo images. It has demonstrated good performance with high rates of successful convergence (97% on Spirit, 95% on Opportunity), successfully detected slip ratios as high as 125%, and measured changes as small as 2 mm, even while driving on slopes as high as 31 deg. Visual Odometry was used over 14% of the first 10.7 km driven by both rovers. During the first 2 years of operations, Visual Odometry evolved from an "extra credit" capability into a critical vehicle safety system. In this paper we describe our Visual Odometry algorithm, discuss several driving strategies that rely on it (including Slip Checks, Keep-out Zones, and Wheel Dragging), and summarize its results from the first 2 years of operations on Mars.
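The core geometric step of a visual odometry update of this kind is estimating the rigid motion that best maps a set of tracked 3D terrain features (triangulated from the first stereo pair) onto the same features seen in the second pair. The sketch below illustrates that step with the standard SVD-based least-squares alignment (Procrustes/Kabsch); it is an illustration under that assumption, not the flight software's actual estimator, and the function name is hypothetical.

```python
import numpy as np

def estimate_rigid_transform(p_prev, p_curr):
    """Least-squares rigid motion (R, t) mapping p_prev onto p_curr.

    p_prev, p_curr: (N, 3) arrays of matched 3D feature positions,
    e.g. triangulated from stereo pairs before and after a motion step.
    """
    c_prev = p_prev.mean(axis=0)          # centroids of each point set
    c_curr = p_curr.mean(axis=0)
    H = (p_prev - c_prev).T @ (p_curr - c_curr)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_curr - R @ c_prev
    return R, t
```

The recovered rotation and translation give the incremental pose change; roll, pitch, and yaw can then be extracted from `R`, and accumulating these increments yields the full 6-DOF pose track. A robust system would additionally reject outlier feature matches before solving.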
NASA's Mars Exploration Rover (MER) missions will land twin rovers on the surface of Mars in 2004. These rovers will have the ability to navigate safely through unknown and potentially hazardous terrain, using autonomous passive stereo vision to detect potential terrain hazards before driving into them. Unfortunately, the computational power of currently available radiation-hardened processors limits the distance (and therefore the science) that can be safely achieved by any rover in a given time frame.
NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair, each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 × 1024 pixel detectors.
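As a rough cross-check, the per-pixel angular resolution (instantaneous FOV) can be approximated by dividing the full FOV by the detector width. The helper below is a hypothetical illustration, not a flight calibration: the flat division reproduces the Hazcam figure (~2.1 mrad/pixel) but gives ~0.77 mrad/pixel for the Navcam, slightly below the quoted 0.82 mrad/pixel, since the true per-pixel scale depends on focal length and lens distortion rather than a uniform division.

```python
import math

def ifov_mrad(fov_deg, pixels):
    """Approximate instantaneous FOV per pixel, in milliradians,
    assuming a uniform angular scale across the detector."""
    return math.radians(fov_deg) / pixels * 1000.0

navcam_ifov = ifov_mrad(45, 1024)   # ~0.77 mrad/pixel (quoted: 0.82)
hazcam_ifov = ifov_mrad(124, 1024)  # ~2.1 mrad/pixel (matches quoted value)
```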
The rover traveled across regolith-covered, rock-strewn plains that transitioned into terrains that have been variably eroded, with valleys partially filled with windblown sands, and intervening plateaus capped by well-cemented sandstones that have been fractured and shaped by wind into outcrops with numerous sharp rock surfaces. Wheel punctures and tears caused by sharp rocks while traversing the plateaus led to directing the rover to traverse in valleys where sands would cushion wheel loads. This required driving across a megaripple (windblown, sand-sized deposit covered by coarser grains) that straddles a narrow gap and several extensive megaripple deposits that accumulated in low portions of valleys. Traverses across megaripple deposits led to mobility difficulties, with sinkage values up to approximately 30% of the 0.50 m wheel diameter, resultant high compaction resistances, and rover-based slip up to 77%. Analysis of imaging and engineering data collected during traverses across megaripples for the first 710 sols (Mars days) of the mission, laboratory-based single-wheel soil experiments, full-scale rover tests at the Dumont Dunes, Mojave Desert, California, and numerical simulations show that a combination of material properties and megaripple geometries explain the high wheel sinkage and slip events. Extensive megaripple deposits have subsequently been avoided and instead traverses have been implemented across terrains covered with regolith or thin windblown sand covers and megaripples separated by bedrock exposures. © 2016 Wiley Periodicals, Inc.
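The slip figures quoted in these abstracts follow the usual rover-literature convention: slip is the fraction of commanded travel that was not actually achieved, so values above 100% mean the vehicle slid backward. A minimal sketch of that arithmetic, with a hypothetical helper name:

```python
def slip_ratio_pct(commanded_m, actual_m):
    """Percent slip: fraction of commanded travel not achieved.
    Values over 100% mean the vehicle moved backward relative to the
    commanded direction."""
    return 100.0 * (commanded_m - actual_m) / commanded_m

worst_slip = slip_ratio_pct(1.0, 0.23)      # 77.0 -> the 77% slip reported here
backslide = slip_ratio_pct(1.0, -0.25)      # 125.0 -> net backward motion
sinkage_m = 0.30 * 0.50                     # 30% of a 0.50 m wheel = 0.15 m
```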
Increasing the level of spacecraft autonomy is essential for broadening the reach of solar system exploration. Computer vision has played and will continue to play an important role in increasing the autonomy of both spacecraft and Earth-based robotic vehicles. This article addresses progress on computer vision for planetary rovers and landers and has four main parts. First, we review major milestones in the development of computer vision for robotic vehicles over the last four decades. Since research on applications for Earth and space has often been closely intertwined, the review includes elements of both. Second, we summarize the design and performance of computer vision algorithms used on Mars in the NASA/JPL Mars Exploration Rover (MER) mission, which was a major step forward in the use of computer vision in space. These algorithms performed stereo vision and visual odometry for rover navigation, and feature tracking for horizontal velocity estimation for the landers. Third, we summarize ongoing research to improve vision systems for planetary rovers, which includes various aspects of noise reduction, FPGA implementation, and vision-based slip perception. Finally, we briefly survey other opportunities for computer vision to impact rovers, landers, and orbiters in future solar system exploration missions.
This paper presents the initial results of lander and rover localization and topographic mapping for the MER 2003 mission (through Sol 225 for Spirit and Sol 206 for Opportunity). The Spirit rover has traversed a distance of 3.2 km (actual distance traveled rather than odometry) and Opportunity 1.2 km. We localized the landers in Gusev Crater and on Meridiani Planum using two-way Doppler radio positioning technology and cartographic triangulation through landmarks visible in both orbital and ground images. Additional high-resolution orbital images were taken to verify the determined lander positions. Visual odometry and bundle-adjustment technologies were applied to overcome wheel slippage, azimuthal angle drift, and other navigation errors (as large as 21 percent). We generated timely topographic products including 68 orthophoto maps and 3D digital terrain models, eight horizontal rover traverse maps, and vertical traverse profiles up to Sol 214 for Spirit and Sol 62 for Opportunity.