This paper presents an online dynamic hand gesture recognition system based on an RGB-D camera, which can automatically recognize hand gestures against complicated backgrounds. For background subtraction, we use a model-based method to perform human detection and segmentation in the depth map. Since robust hand tracking is crucial to the performance of hand gesture recognition, our system uses both color information and depth information during hand tracking. To extract spatiotemporal hand gesture sequences from the trajectory, a reliable gesture spotting scheme based on detecting changes of static postures is proposed. Then discrete HMMs with Left-Right Banded (LRB) topology are utilized to model and classify gestures based on multifeature representation and quantization of the hand gesture sequences. Experimental evaluations on two self-built databases of dynamic hand gestures show the effectiveness of the proposed system. Furthermore, we develop a human-robot interactive system, and its performance is demonstrated through interactive experiments in a dynamic environment.
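The Left-Right Banded topology mentioned above constrains each HMM state to either stay in place or advance to the next state, which suits the sequential nature of gesture trajectories. A minimal sketch of building such a transition matrix is shown below; the state count and self-transition probability are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lrb_transition_matrix(n_states: int, self_prob: float = 0.5) -> np.ndarray:
    """Build a Left-Right Banded (LRB) HMM transition matrix:
    each state may only self-loop or advance to the next state."""
    A = np.zeros((n_states, n_states))
    for i in range(n_states - 1):
        A[i, i] = self_prob            # stay in the current state
        A[i, i + 1] = 1.0 - self_prob  # advance to the next state
    A[n_states - 1, n_states - 1] = 1.0  # final state is absorbing
    return A

A = lrb_transition_matrix(5)
# Rows sum to 1, and no backward transitions are allowed.
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(np.tril(A, -1), 0.0)
```

Because backward and skip transitions are forbidden, the band structure reduces the number of free parameters and typically makes training more stable on short gesture sequences.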
Natural human-robot interaction based on dynamic hand gestures has become a popular research topic in the past few years. Traditional dynamic gesture recognition methods are usually restricted by illumination conditions, varying colors, and cluttered backgrounds. Recognition performance can be improved by using hand-worn devices, but this is not a natural, barrier-free form of interaction. To overcome these shortcomings, a depth perception algorithm based on the Kinect depth sensor is introduced to carry out 3D hand tracking. We propose a novel start/end point detection method for segmenting the 3D hand gesture from the hand motion trajectory. Hidden Markov Models (HMMs) are then implemented to model and classify the hand gesture sequences, and the recognized gestures are converted into control commands for interaction with the robot. Seven different hand gestures performed by two hands are sufficient to navigate the robot. Experiments show that the proposed dynamic hand gesture interaction system works effectively in complex environments and in real time, with an average recognition rate of 98.4%. Further experiments on robot navigation also verify the robustness of our system.
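The conversion from recognized gestures to robot control commands can be sketched as a simple lookup table. The gesture labels and command names below are purely hypothetical placeholders; the paper does not specify them, only that seven two-handed gestures map to navigation commands.

```python
# Hypothetical mapping from recognized gesture labels to robot commands.
# The seven gesture names and command strings are illustrative only,
# not taken from the paper.
GESTURE_TO_COMMAND = {
    "wave_left": "turn_left",
    "wave_right": "turn_right",
    "push_forward": "move_forward",
    "pull_back": "move_backward",
    "raise_both": "stop",
    "circle_cw": "speed_up",
    "circle_ccw": "slow_down",
}

def dispatch(gesture: str) -> str:
    """Return the control command for a recognized gesture,
    or 'idle' when the classifier output is unmapped."""
    return GESTURE_TO_COMMAND.get(gesture, "idle")
```

Returning a safe default such as `idle` for unmapped classifier outputs is one way to keep the robot from acting on spurious detections in a dynamic environment.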