This work presents a novel biometric identification technique based on temporal history templates (THTs) of visual hand movements. The technique uses a view-based representation of hand movements together with a cumulative image-difference method, in which the timing between successive images is implicitly captured in the representation of the action. This low-level representation collapses the temporal structure of the motion in a video sequence of hand movements, while removing any static content, to generate a temporal history template (THT) of the movement. The THTs of different individuals form distinctive 2-D motion patterns in which each pixel value is a function of the temporal history of motion at that location. Each THT is further decomposed, using a multiresolution wavelet transform, into four sub-images: one approximation (average) image and three detail images. The approximation wavelet sub-image is taken as the feature for recognition, and classification is performed with a k-nearest-neighbour (KNN) classifier using the Mahalanobis distance. Identification performance is measured by the accuracy of accepting an enrolled subject (AAES %) and the accuracy of rejecting an impostor (ARI %). Experimental results from five different individuals indicate that the THT-based technique achieves a high identification rate when subject-specific movements are assigned to the subjects during enrolment.
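The pipeline described above (cumulative image differencing to build a temporal template, a wavelet approximation sub-image as the feature, and KNN with Mahalanobis distance) can be sketched as follows. This is a minimal illustration of the general idea, not the authors' exact implementation: the difference threshold, the decay step, the single-level Haar transform, and all function names here are assumptions chosen for the sketch.

```python
import numpy as np

def temporal_history_template(frames, tau=255.0, decay=None, diff_thresh=15):
    # Sketch of a cumulative image-difference template: pixels that move in
    # the current frame pair are set to the maximum intensity tau, while
    # previously moving pixels decay linearly, so intensity encodes how
    # recently motion occurred. Static pixels never exceed the difference
    # threshold and remain zero, which removes static content.
    if decay is None:
        decay = tau / max(len(frames) - 1, 1)
    tht = np.zeros(frames[0].shape, dtype=float)
    for prev, curr in zip(frames, frames[1:]):
        moving = np.abs(curr.astype(float) - prev.astype(float)) > diff_thresh
        tht = np.where(moving, tau, np.maximum(tht - decay, 0.0))
    return tht

def haar_approximation(img):
    # One level of a Haar wavelet decomposition, keeping only the
    # approximation (average) sub-image used as the recognition feature.
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def knn_mahalanobis(query, gallery_feats, gallery_labels, cov_inv, k=1):
    # k-nearest-neighbour classification under the Mahalanobis distance;
    # cov_inv is the inverse covariance matrix of the enrolled features.
    dists = [np.sqrt((query - g) @ cov_inv @ (query - g))
             for g in gallery_feats]
    votes = [gallery_labels[i] for i in np.argsort(dists)[:k]]
    return max(set(votes), key=votes.count)
```

A synthetic example: a bright patch sliding across an otherwise static frame yields a template whose brightest pixels mark the most recent motion, and the flattened approximation sub-image can then be matched against enrolled templates.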