The underwater docking of autonomous underwater vehicles (AUVs) facilitates energy replenishment and data exchange. High-precision, vision-based estimation of the position and pose of an AUV relative to a docking station (PPARD) is a necessary condition for successful docking. Classical binarization methods have a low success rate in extracting guidance features from blurry underwater images, which makes the PPARD estimation insufficiently stable. Based on the fact that the guidance lamps are strong blue point light sources, this study proposes an adaptive method for calculating the binarization threshold of the guidance image. To reduce guidance feature extraction failures, a guidance image enhancement method is proposed that strengthens the characteristic of the guidance lamps as strong point light sources with a finite area. The PPARD is estimated by minimizing an imaging error function over the vision-extracted guidance features. The experimental results showed that the absolute estimation error for each degree of freedom of the PPARD was at most 10%, lower than that of the orthogonal iteration (OI) method. In addition, the proposed guidance feature extraction method outperformed the classical methods, with an extraction success rate of 87.99%.
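As a rough illustration of the lamp-extraction step, the Python sketch below binarizes the blue channel with an adaptive threshold and keeps only blobs whose area is consistent with a point light source of finite size. The threshold rule (mean plus k standard deviations) and the area bounds are illustrative assumptions, not the paper's actual formula.

```python
import cv2
import numpy as np

def extract_guidance_features(image_bgr, k=2.5, min_area=20, max_area=2000):
    """Sketch of adaptive binarization for blue point-light guidance lamps.

    The threshold rule (mean + k*std of the blue channel) and the area
    bounds are illustrative assumptions, not the paper's exact method.
    """
    blue = image_bgr[:, :, 0].astype(np.float32)
    # Adaptive threshold: the lamps are far brighter than the hazy background.
    thresh = blue.mean() + k * blue.std()
    binary = (blue > thresh).astype(np.uint8) * 255
    # Keep only blobs whose area matches a strong point source with finite extent.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    centers = [tuple(centroids[i]) for i in range(1, n)
               if min_area <= stats[i, cv2.CC_STAT_AREA] <= max_area]
    return centers  # image coordinates of candidate guidance lamps
```

The returned lamp centers would then feed the imaging error function whose minimization yields the PPARD estimate.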
The detection of wheel sinkage is of great significance for optimizing rover mobility control and preventing the wheels from sinking. A new wheel sinkage detection method based on the wheel-soil boundary of planetary rovers is proposed. A wheel sinkage calculation model is built, and a machine vision method is proposed to extract the wheel-soil boundary. The wheel-soil interaction image is converted into a binary image, and the wheel-soil boundary is extracted according to its morphological features. The wheel sinkage depth, entrance angle, and departure angle are then calculated with the wheel sinkage calculation model. The method's applicability was validated by experiments under various terrain conditions: flat, raised, hollow, and rough terrain. Accuracy tests were conducted on flat terrain; the results indicate that the relative errors of the sinkage depth are around 10% and those of the terrain interface angles are around 5% when the actual sinkage depth is above 5 mm.
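Assuming the wheel center and radius are known (e.g., from rim detection) and the entry and exit contact points come from the extracted wheel-soil boundary, a minimal sketch of the sinkage geometry could look as follows. The angle convention (measured from the downward vertical) and the flat-terrain relation z = r(1 − cos θ) are assumptions for illustration, not necessarily the paper's exact model.

```python
import numpy as np

def wheel_sinkage(center_xy, radius, entry_pt, exit_pt):
    """Sketch of the sinkage geometry (image y-axis points down).

    Assumes the wheel center/radius are known and entry_pt/exit_pt are the
    wheel-soil contact points from the extracted boundary.
    """
    cx, cy = center_xy

    def contact_angle(pt):
        dx, dy = pt[0] - cx, pt[1] - cy
        # Angle between the ray to the contact point and the downward vertical.
        return np.arccos(dy / np.hypot(dx, dy))

    theta_entry = contact_angle(entry_pt)   # entrance angle
    theta_exit = contact_angle(exit_pt)     # departure angle
    # On flat terrain the sinkage depth follows z = r * (1 - cos(theta_entry)).
    z = radius * (1.0 - np.cos(theta_entry))
    return z, theta_entry, theta_exit
```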
It is important for Mars exploration rovers to achieve autonomous and safe mobility over rough terrain. Terrain classification can help rovers select safe terrain to traverse and avoid sinking and/or damaging the vehicle. Mars terrains are often classified using visual methods; however, the accuracy of terrain classification has been below 90% in real operations. A high-accuracy vision-based method for Mars terrain classification is presented in this paper. By analyzing Mars terrain characteristics, novel image features specifically targeted at terrain classification are proposed, including multiscale gray gradient-grade features, multiscale edge strength-grade features, multiscale frequency-domain mean amplitude features, multiscale spectrum symmetry features, and multiscale spectrum amplitude-moment features. Three classifiers, K-nearest neighbor (KNN), support vector machine (SVM), and random forest (RF), are adopted to classify the terrain using the proposed features. The Mars image dataset MSLNet, collected by the Mars Science Laboratory (MSL, Curiosity) rover, is used to conduct the terrain classification experiments; the resolution of the Mars images in the dataset is 256 × 256. Experimental results indicate that the RF classifier achieves the highest accuracy, 94.66%.
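A compressed Python sketch of this features-plus-classifier pipeline is shown below. The per-scale gradient statistics and mean spectrum amplitude are simplified stand-ins for the paper's feature definitions, and `images`/`labels` are hypothetical placeholders for MSLNet data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def multiscale_features(img, scales=(1, 2, 4)):
    """Simplified multiscale gray-gradient and frequency-domain features.

    `img` is a 256x256 grayscale array; the actual feature formulas in the
    paper are more elaborate than these illustrative statistics.
    """
    feats = []
    for s in scales:
        small = img[::s, ::s].astype(np.float32)  # crude per-scale downsampling
        gy, gx = np.gradient(small)
        grad = np.hypot(gx, gy)
        feats += [grad.mean(), grad.std()]        # gray gradient-grade stand-in
        amp = np.abs(np.fft.fftshift(np.fft.fft2(small)))
        feats.append(amp.mean())                  # frequency-domain mean amplitude
    return np.array(feats)

# Illustrative training step on hypothetical MSLNet arrays:
# X = np.stack([multiscale_features(im) for im in images])
# clf = RandomForestClassifier(n_estimators=200).fit(X, labels)
```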