Teams of mobile robots will play a crucial role in future missions to explore the surfaces of extraterrestrial bodies. Setting up infrastructure and taking scientific samples are expensive tasks when operating in distant, challenging, and unknown environments. In contrast to current single-robot space missions, future heterogeneous robotic teams will increase efficiency via enhanced autonomy and parallelization, improve robustness via functional redundancy, and benefit from the complementary capabilities of the individual robots. In this article, we present our heterogeneous robotic team, consisting of flying and driving robots, which we plan to deploy on scientific sampling demonstration missions at a Moon-analogue site on Mt. Etna, Sicily, Italy, in 2021 as part of the ARCHES project. We describe the robots' individual capabilities and their roles in two mission scenarios. We then present components and experiments on important tasks therein: automated task planning, high-level mission control, spectral rock analysis, radio-based localization, collaborative multi-robot 6D SLAM in Moon-analogue and Mars-like scenarios, and demonstrations of autonomous sample return.
Planetary rovers increasingly rely on vision‐based components for autonomous navigation and mapping. Developing and testing these components requires representative optical conditions, which can be achieved either by field testing at planetary analog sites on Earth or by using prerecorded data sets from such locations. However, the availability of representative data is scarce, and field testing at planetary analog sites requires a substantial financial investment and logistical overhead and entails the risk of damaging complex robotic systems. To address these issues, we use our compact human‐portable DLR Sensor Unit for Planetary Exploration Rovers (SUPER) in the Moroccan desert to demonstrate resource‐efficient field testing, and we make the resulting Morocco‐Acquired data set of Mars‐Analog eXploration (MADMAX) publicly accessible. The data set consists of 36 different navigation experiments, captured at eight Mars analog sites of widely varying environmental conditions. Its longest trajectory covers 1.5 km and the combined trajectory length is 9.2 km. The data set contains time‐stamped recordings from monochrome stereo cameras, a color camera, omnidirectional cameras in stereo configuration, and an inertial measurement unit. Additionally, we provide the ground truth in position and orientation together with the associated uncertainties, obtained by a real‐time kinematic‐based algorithm that fuses the global navigation satellite system data of two body-mounted antennas. Finally, we run two state‐of‐the‐art navigation algorithms, ORB‐SLAM2 and VINS‐Mono, on our data to evaluate their accuracy and to provide a baseline, which can serve as an accuracy and robustness reference for other navigation algorithms. The data set can be accessed at https://rmc.dlr.de/morocco2018.
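A common way to obtain such an accuracy baseline is the absolute trajectory error (ATE): the estimated trajectory is rigidly aligned to the ground truth and the RMSE of the residual positions is reported. The sketch below is a generic, self-contained illustration of this metric (Kabsch/Umeyama-style rigid alignment without scale), not the authors' actual evaluation code; the function name and interface are assumptions.

```python
import numpy as np

def ate_rmse(est, gt):
    """Absolute trajectory error (RMSE) between an estimated trajectory
    `est` and ground truth `gt` (both Nx3 position arrays), after rigid
    alignment (rotation + translation, no scale)."""
    # Center both point sets.
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    # Optimal rotation from the SVD of the cross-covariance (Kabsch),
    # with a sign correction to exclude reflections.
    U, _, Vt = np.linalg.svd(E.T @ G)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = (U @ S @ Vt).T
    t = mu_g - R @ mu_e
    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))
```

For a trajectory that differs from the ground truth only by a rigid transform, the ATE is zero; drift and scale errors of a visual-inertial estimator show up as a nonzero residual.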
We present a multicopter system equipped with two pairs of wide-angle stereo cameras and an inertial measurement unit (IMU) for robust visual-inertial navigation and time-efficient, omni-directional 3D mapping of large areas of interest. The four cameras cover a 240° vertical stereo field of view (FOV), which also makes the system suitable for cramped and confined environments like caves. In our approach, we synthesize eight virtual pinhole cameras from the four wide-angle cameras. Each of the resulting four synthesized pinhole stereo systems provides input to an independent visual odometry (VO). Subsequently, the four individual motion estimates are fused with data from the IMU, based on their consistency with the state estimation. We describe the configuration and image processing of the vision system as well as the sensor fusion and mapping pipeline on board the MAV. We demonstrate the robustness of our multi-VO approach for visual-inertial navigation and present results of a 3D-mapping experiment.
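Synthesizing a virtual pinhole view from a wide-angle image amounts to computing, for each pinhole pixel, the corresponding pixel in the physical wide-angle image, then resampling. The sketch below illustrates this for an equidistant (f-theta) fisheye model; the projection model, parameter names, and interface are assumptions for illustration and not taken from the paper.

```python
import numpy as np

def pinhole_to_fisheye_map(pin_size, pin_f, R, fish_f, fish_center):
    """Per-pixel lookup map from a virtual pinhole view into an
    equidistant fisheye image (hypothetical parameters).
    pin_size: (width, height) of the virtual view; pin_f: its focal length;
    R: rotation of the virtual view relative to the fisheye camera;
    fish_f, fish_center: fisheye focal length and principal point."""
    w, h = pin_size
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Back-project pinhole pixels to unit rays in the virtual camera frame.
    rays = np.stack([(u - w / 2) / pin_f,
                     (v - h / 2) / pin_f,
                     np.ones_like(u, dtype=float)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)
    # Rotate rays into the physical fisheye camera frame.
    rays = rays @ R.T
    # Equidistant model: image radius grows linearly with the angle
    # between the ray and the optical axis.
    theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    r = fish_f * theta
    map_x = fish_center[0] + r * np.cos(phi)
    map_y = fish_center[1] + r * np.sin(phi)
    return map_x, map_y
```

With maps like these for several rotations R, one wide-angle image can be resampled into multiple virtual pinhole views, each feeding a conventional stereo/VO pipeline.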
We introduce a prototype flying platform for planetary exploration: autonomous robot design for extraterrestrial applications (ARDEA). Communication with unmanned missions beyond Earth orbit suffers from time delay, thus a key criterion for robotic exploration is a robot's ability to perform tasks without human intervention. For autonomous operation, all computations should be done on board, and Global Navigation Satellite System (GNSS) signals should not be relied on for navigation. Given these objectives, ARDEA is equipped with two pairs of wide-angle stereo cameras and an inertial measurement unit (IMU) for robust visual-inertial navigation and time-efficient, omni-directional 3D mapping. The four cameras cover a 240° vertical field of view, enabling the system to operate in confined environments such as caves formed by lava tubes. The captured images are split into several pinhole camera views, which feed simultaneously running visual odometries. The stereo output is used for simultaneous localization and mapping, 3D map generation, and collision-free motion planning. To operate the vehicle efficiently for a variety of missions, ARDEA's capabilities have been modularized into skills which can be assembled to fulfill a mission's objectives. These skills are defined generically so that they are independent of the robot configuration, making the approach suitable for heterogeneous robotic teams. The diverse skill set also makes the micro aerial vehicle (MAV) useful for any task requiring autonomous exploration, for example terrestrial search and rescue missions where visual navigation in GNSS-denied indoor environments is crucial, such as partially collapsed man-made structures like buildings or tunnels. We have demonstrated the robustness of our system in indoor and outdoor field tests.
KEYWORDS: aerial robotics, computer vision, exploration, GPS-denied operation, planetary robotics
The Earth's Moon is currently an object of interest for many space agencies planning unmanned robotic missions within this decade. Besides future prospects for building lunar gateways in support of human space flight, the Moon is an attractive location for scientific purposes. Not only will its study give insight into the foundations of the Solar System, but its location, uncontaminated by the Earth's ionosphere, also represents a vantage point for the observation of the Sun and of planetary bodies outside the Solar System. Lunar exploration has traditionally been conducted by means of single-agent robotic assets, which is a limiting factor for the return of scientific missions. The German Aerospace Center (DLR) is developing fundamental technologies towards increased autonomy of robotic explorers to fulfil more complex mission tasks through cooperation. This paper presents an overview of past, present and future activities of DLR towards highly autonomous systems for scientific missions targeting the Moon and other planetary bodies. The heritage from the Mobile Asteroid Scout (MASCOT), developed jointly by DLR and CNES and deployed on asteroid Ryugu on 3 October 2018 from JAXA's Hayabusa2 spacecraft, inspired the development of novel core technologies towards higher efficiency in planetary exploration. Together with the lessons learnt from the ROBEX project (2012–2017), where a mobile robot autonomously deployed seismic sensors at a Moon analogue site, this experience is shaping the future steps towards more complex space missions. These include the development of a mobile rover for JAXA's Martian Moons eXploration (MMX) mission in 2024 as well as demonstrations of novel multi-robot technologies at a Moon analogue site on the volcano Mt Etna in the ARCHES project.
Within ARCHES, a demonstration mission is planned from 14 June to 10 July 2021, during which heterogeneous teams of robots will autonomously conduct geological and mineralogical analysis experiments and deploy an array of low-frequency antennas to measure Jovian and solar bursts. This article is part of a discussion meeting issue 'Astronomy from the Moon: the next decades'.