This paper develops an approach to measure the information content of a biometric feature representation. We define biometric information as the decrease in uncertainty about the identity of a person due to a set of biometric measurements. We then show that the biometric feature information for a person may be calculated as the relative entropy D(p‖q) between the population feature distribution q and the person's feature distribution p. The biometric information for a system is the mean of D(p‖q) over all persons in the population. In order to practically measure D(p‖q) with limited data samples, we introduce an algorithm that regularizes a Gaussian model of the feature covariances. An example of this method is shown for PCA and Fisher linear discriminant (FLD) based face recognition, with biometric feature information calculated to be 45.0 bits (PCA), 37.0 bits (FLD) and 55.6 bits (fusion of PCA and FLD features). Finally, we discuss general applications of this measure.
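Under the Gaussian model the abstract describes, D(p‖q) has a closed form. The sketch below (a minimal illustration assuming NumPy; the function name and toy numbers are ours, not the paper's) converts the standard Gaussian Kullback-Leibler divergence from nats to bits:

```python
import numpy as np

def gaussian_relative_entropy_bits(mu_p, cov_p, mu_q, cov_q):
    """D(p||q) in bits for multivariate Gaussians p = N(mu_p, cov_p)
    and q = N(mu_q, cov_q), using the closed-form Gaussian KL divergence."""
    k = len(mu_p)
    cov_q_inv = np.linalg.inv(cov_q)
    diff = np.asarray(mu_q) - np.asarray(mu_p)
    d_nats = 0.5 * (
        np.trace(cov_q_inv @ cov_p)          # covariance mismatch term
        + diff @ cov_q_inv @ diff            # mean separation term
        - k                                  # dimensionality offset
        + np.log(np.linalg.det(cov_q) / np.linalg.det(cov_p))
    )
    return d_nats / np.log(2)                # nats -> bits

# Toy example (illustrative values only): a person's feature distribution p
# that is tighter than the population distribution q yields positive
# biometric information.
mu_q, cov_q = np.zeros(2), np.eye(2)                   # population model
mu_p, cov_p = np.array([1.0, 0.0]), 0.1 * np.eye(2)    # one person's model
print(gaussian_relative_entropy_bits(mu_p, cov_p, mu_q, cov_q))
```

When p and q coincide the measure is zero bits, matching the intuition that such a measurement carries no identifying information.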
This paper develops a new approach to understand and measure variations in biometric sample quality. We begin with the intuition that degradations to a biometric sample will reduce the amount of identifiable information available. In order to measure the amount of identifiable information, we
Abstract. We ask: how many bits of information (in the Shannon sense) do we get from a set of EIT measurements? Here, the term information in measurements (IM) is defined as the decrease in uncertainty about the contents of a medium due to a set of measurements. This decrease in uncertainty is quantified by the change from the inter-class model, q, defined by the prior information, to the intra-class model, p, given by the measured data (corrupted by noise). IM is measured by the expected relative entropy (Kullback-Leibler divergence) between distributions q and p, and corresponds to the channel capacity in an analogous communications system. Based on a Gaussian model of the measurement noise, Σ_n, and a prior model of the image element covariances, Σ_x, we calculate IM = ½ Σ_i log₂(1 + SNR_i), where SNR_i is the signal-to-noise ratio for each independent measurement, calculated from the prior and noise models. As an example, we consider saline tank measurements from a 16-electrode EIT system with a 2 cm radius non-conductive target, and calculate IM = 179 bits. Temporal sequences of frames are considered, and formulae for IM as a function of temporal image element correlations are derived. We suggest that this measure may allow novel insights into questions such as distinguishability limits, optimal measurement schemes and data fusion.
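The IM formula above sums the capacity of one Gaussian channel per independent measurement. A minimal sketch of that sum (assuming NumPy; the SNR values below are illustrative, not the paper's tank data):

```python
import numpy as np

def information_in_measurements_bits(snr):
    """Total information in a set of independent measurements:
    IM = 1/2 * sum_i log2(1 + SNR_i), the Gaussian channel capacity
    summed over measurements."""
    snr = np.asarray(snr, dtype=float)
    return 0.5 * np.sum(np.log2(1.0 + snr))

# Illustrative per-measurement SNRs (hypothetical values)
snrs = [100.0, 25.0, 4.0, 0.5]
print(information_in_measurements_bits(snrs))
```

Measurements with SNR near zero contribute almost nothing, so the sum is dominated by the well-resolved independent components, which is what makes the measure useful for comparing measurement schemes.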