Abstract. Terrain classification is a fundamental task in outdoor robot navigation to detect and avoid impassable terrain. Camera-based approaches are well-studied and provide good results. A drawback of these approaches, however, is that the quality of the classification varies with the prevailing lighting conditions. 3D laser scanners, on the other hand, are largely illumination-invariant. In this work we present easy-to-compute features for 3D point clouds using range and intensity values. We compare the classification results obtained using only the laser-based features with the results of camera-based classification and study the influence of different lighting conditions.
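As a minimal illustration of the kind of easy-to-compute range and intensity statistics such an approach can use, the sketch below computes per-cell mean and standard deviation of both channels for the laser points falling into one terrain grid cell. The function name, tuple layout, and numbers are assumptions for illustration, not the paper's actual feature set.

```python
def cell_features(points):
    """points: list of (range, intensity) tuples from one terrain grid cell.
    Returns (mean_range, std_range, mean_intensity, std_intensity)."""
    n = len(points)
    ranges = [r for r, _ in points]
    intens = [i for _, i in points]
    mean_r = sum(ranges) / n
    mean_i = sum(intens) / n
    # population standard deviation of each channel
    std_r = (sum((r - mean_r) ** 2 for r in ranges) / n) ** 0.5
    std_i = (sum((i - mean_i) ** 2 for i in intens) / n) ** 0.5
    return (mean_r, std_r, mean_i, std_i)

# three made-up laser returns in one cell
feats = cell_features([(2.0, 0.4), (2.2, 0.5), (2.1, 0.6)])
```

Feature vectors like this, computed per cell, can then be fed to any standard classifier.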
Abstract. In this paper we present a comparison of multiple approaches to visual terrain classification for outdoor mobile robots based on local features. We compare traditional texture classification approaches, such as Local Binary Patterns, Local Ternary Patterns, and their newer extension, Local Adaptive Ternary Patterns, and also modify and test three non-traditional approaches: SURF, DAISY, and CCH. We drove our robot under different weather and ground conditions and captured images of five terrain types for our experiments. We did not filter out images blurred by robot motion or affected by other artifacts, such as those caused by rain. We used Random Forests for classification and cross-validation to verify our results. The results show that most of the approaches work well for terrain classification on a fast-moving mobile robot, despite image blur and other artifacts induced by highly variable weather conditions.
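To make the Local Binary Pattern descriptor mentioned above concrete, here is a minimal pure-Python sketch: each pixel is encoded by thresholding its eight neighbours against it, and the per-image histogram of these codes serves as the texture feature vector. The toy image and function names are illustrative assumptions, not the authors' implementation.

```python
def lbp_code(patch):
    """8-neighbour LBP code for the centre pixel of a 3x3 patch."""
    c = patch[1][1]
    # neighbours taken clockwise starting at the top-left corner
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= c:          # neighbour at least as bright -> set bit
            code |= 1 << bit
    return code

def lbp_histogram(image):
    """256-bin LBP histogram over all interior pixels of a grey image."""
    hist = [0] * 256
    for y in range(1, len(image) - 1):
        for x in range(1, len(image[0]) - 1):
            patch = [row[x - 1:x + 2] for row in image[y - 1:y + 2]]
            hist[lbp_code(patch)] += 1
    return hist

# toy 4x4 grey-level image with a bright central blob
img = [[10, 10, 10, 10],
       [10, 50, 50, 10],
       [10, 50, 50, 10],
       [10, 10, 10, 10]]
hist = lbp_histogram(img)
```

Such histograms, computed per image patch, are exactly the kind of fixed-length feature vector a Random Forest can consume directly.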
Abstract. In the domain of agricultural robotics, one major application is crop scouting, e.g., for the task of weed control. A key enabler for this task is robust detection and classification of plants and their species. Automatically distinguishing between plant species is challenging because some species look very similar. It is also difficult to translate the symbolic, high-level descriptions of plant appearances and differences used by humans into a formal, computer-understandable form, and structures such as leaves and branches cannot be reliably detected in the 3D data provided by our sensor. One approach to this problem is to learn how to classify the species from a set of example plants using machine learning methods. In this paper we introduce a method for distinguishing plant species using a 3D LIDAR sensor and supervised learning. For this we developed a set of size- and rotation-invariant features and evaluated experimentally which are the most descriptive. We also compared different learning methods using the Weka toolbox. It turned out that the best methods for our application are simple logistic regression functions, support vector machines, and neural networks. In our experiments we used six plant species, typically available at common nurseries, with about 20 examples of each. In the laboratory we were able to identify over 98% of these plants correctly.
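One common family of size- and rotation-invariant point-cloud features, sketched below as an illustration (not the paper's actual feature set), derives linearity, planarity, and sphericity from the sorted eigenvalues of the point covariance matrix: the eigenvalues are unchanged by rotation, and normalising by their sum removes uniform scale.

```python
import numpy as np

def shape_features(points):
    """points: (N, 3) array -> (linearity, planarity, sphericity)."""
    centred = points - points.mean(axis=0)
    cov = centred.T @ centred / len(points)
    # eigenvalues of the symmetric covariance, sorted descending
    l1, l2, l3 = sorted(np.linalg.eigvalsh(cov), reverse=True)
    s = l1 + l2 + l3
    return ((l1 - l2) / s, (l2 - l3) / s, l3 / s)

# points sampled along a line (with tiny noise) should score
# high on linearity and near zero on sphericity
rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, size=(200, 1))
line = t * np.array([1.0, 0.0, 0.0]) + 1e-3 * rng.normal(size=(200, 3))
lin, pla, sph = shape_features(line)
```

Per-plant feature vectors built from invariants like these can be exported and evaluated with any Weka classifier.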
This paper addresses the task of calibrating the kinematic parameters and odometry of car-like robots with dual-axis steering. To achieve this, only the robot's built-in laser rangefinders are employed, without any external tracking system. We introduce a method to actively calibrate both the front and rear steering angles with a multi-input multi-output (MIMO) controller. Using the resulting mapping from steering servo input to steering angle, the effective wheelbase and wheel diameters are estimated. We present an automated self-calibration procedure for car-like robots with dual-axis steering and verify the results on our self-developed outdoor robot platform.
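As a hedged sketch of one such calibration step (an assumed dual-steer bicycle model, not necessarily the paper's formulation): with front and rear steering angles d_f and d_r, the turn radius R satisfies R = L / (tan d_f - tan d_r), so a measured radius, e.g. from laser scans, yields the effective wheelbase L directly. All numbers below are made up for illustration.

```python
import math

def effective_wheelbase(radius, delta_front, delta_rear):
    """Estimate the effective wheelbase L from a measured turn radius.

    Assumes a dual-steer bicycle model: R = L / (tan d_f - tan d_r).
    Angles are in radians, radius and result in metres.
    """
    return radius * (math.tan(delta_front) - math.tan(delta_rear))

# example: 5 m turn radius with 10 deg front and -5 deg rear steering
L = effective_wheelbase(5.0, math.radians(10), math.radians(-5))
```

Repeating this over several servo inputs gives the input-to-angle mapping from which wheelbase and wheel diameters can be fitted.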