An effective object recognition scheme is to represent and match images on the basis of histograms derived from photometric color invariants. A drawback, however, is that certain color invariant values become very unstable in the presence of sensor noise. To suppress the effect of noise on unstable color invariant values, in this paper, histograms are computed using variable kernel density estimators. To apply variable kernel density estimation in a principled way, models are proposed for the propagation of sensor noise through color invariant variables. As a result, the associated uncertainty is obtained for each color invariant value. This uncertainty is used to derive the parameterization of the variable kernel for the purpose of robust histogram construction. It is empirically verified that the proposed density estimator compares favorably to traditional histogram schemes for the purpose of object recognition.
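The core idea can be illustrated with a minimal sketch: each sample contributes a Gaussian kernel whose bandwidth equals that sample's noise-propagated uncertainty, so unstable values are smoothed over many bins while stable values stay sharply localized. The function and parameter names below are hypothetical, not taken from the paper:

```python
import numpy as np

def variable_kernel_histogram(values, sigmas, bin_centers):
    """Kernel density estimate with a per-sample bandwidth: a sample with
    large uncertainty (sigma) spreads over many bins, while a stable
    sample contributes a sharp, narrow peak."""
    density = np.zeros(len(bin_centers))
    for v, s in zip(values, sigmas):
        # Gaussian kernel centered on the sample, width = its uncertainty
        kernel = np.exp(-0.5 * ((bin_centers - v) / s) ** 2)
        density += kernel / (s * np.sqrt(2 * np.pi))
    return density / len(values)

# One stable sample (small sigma) and one unstable sample (large sigma)
bins = np.linspace(0.0, 1.0, 101)
d = variable_kernel_histogram([0.3, 0.7], [0.01, 0.2], bins)
```

With a fixed-bandwidth histogram both samples would produce equally sharp peaks; here the unstable sample at 0.7 is flattened out, which is the noise-suppression effect the abstract describes.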
The choice of a color model is of great importance for many computer vision algorithms (e.g., feature detection, object recognition, and tracking) as the chosen color model induces the equivalence classes for the actual algorithms. As there are many color models available, the inherent difficulty is how to automatically select a single color model or, alternatively, a weighted subset of color models producing the best result for a particular task. The subsequent hurdle is how to obtain a proper fusion scheme for the algorithms so that the results are combined in an optimal setting. To achieve proper color model selection and fusion of feature detection algorithms, in this paper, we propose a method that exploits the nonperfect correlation between color models or feature detection algorithms, derived from the principles of diversification. As a consequence, a proper balance is obtained between repeatability and distinctiveness. The result is a weighting scheme which yields maximal feature discrimination. The method is verified experimentally for three different image feature detectors. The experimental results show that the fusion method provides feature detection results having a higher discriminative power than the standard weighting scheme. Further, it is experimentally shown that the color model selection scheme provides a proper balance between color invariance (repeatability) and discriminative power (distinctiveness).
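The diversification principle can be sketched with the standard minimum-variance weighting from portfolio theory: given the correlation matrix of the detectors' responses, weights proportional to the inverse correlation matrix applied to the all-ones vector favor less-correlated detectors. This is only a generic instance of diversification-based weighting, not necessarily the paper's exact scheme, and all names below are hypothetical:

```python
import numpy as np

def diversification_weights(corr):
    """Minimum-variance weights for unit-variance, nonperfectly correlated
    detectors: w proportional to C^{-1} 1, normalized to sum to one.
    Detectors that correlate weakly with the rest receive more weight."""
    inv = np.linalg.inv(corr)
    w = inv @ np.ones(corr.shape[0])
    return w / w.sum()

# Detector 3 is weakly correlated with the other two, so diversification
# assigns it the largest weight
C = np.array([[1.0, 0.8, 0.1],
              [0.8, 1.0, 0.1],
              [0.1, 0.1, 1.0]])
w = diversification_weights(C)
```

The weighting rewards a color model that contributes information the others do not, which is how a balance between repeatability and distinctiveness emerges.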
We aim at using color information to classify the physical nature of edges in video. To achieve physics-based edge classification, we first propose a novel approach to color edge detection by automatic noise-adaptive thresholding derived from sensor noise analysis. Then, we present a taxonomy of color edge types. As a result, a parameter-free edge classifier is obtained which labels color transitions as one of the following types: 1) shadow-geometry edges, 2) highlight edges, and 3) material edges. The proposed method is empirically verified on images showing complex real-world scenes.
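Noise-adaptive thresholding can be sketched as follows: instead of a global, hand-tuned threshold, each pixel's gradient magnitude is compared against a multiple of its own noise-propagated standard deviation, making the detector parameter-free in the sense that the threshold follows from the sensor noise model. The function and array names are illustrative assumptions:

```python
import numpy as np

def noise_adaptive_edges(gradient, grad_sigma, k=3.0):
    """Mark a pixel as an edge only where its gradient magnitude exceeds
    k standard deviations of the noise-propagated uncertainty at that
    pixel, so the threshold adapts per pixel rather than being a single
    global free parameter."""
    return gradient > k * grad_sigma

grad = np.array([0.5, 2.0, 0.5, 4.0])       # gradient magnitudes
sigma = np.array([0.3, 0.3, 0.05, 1.5])     # per-pixel propagated noise
edges = noise_adaptive_edges(grad, sigma)
```

Note that the same gradient magnitude (0.5) is rejected where the noise is high and accepted where the noise is low, which a single global threshold cannot do.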
Intensity-based edge detectors cannot distinguish whether an edge is caused by material changes, shadows, surface orientation changes, or highlights. Therefore, our aim is to classify the physical cause of an edge using hyperspectra obtained by a spectrograph. Methods are presented to detect edges in hyperspectral images. The effect of varying imaging conditions is analyzed theoretically for "raw" hyperspectra, for normalized hyperspectra, and for hue computed from hyperspectra. From this analysis, an edge classifier is derived which distinguishes hyperspectral edges into the following types: (1) a shadow or geometry edge, (2) a highlight edge, or (3) a material edge.
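The classification logic implied by the invariance analysis can be sketched as a decision rule over which representations respond to an edge: hue is invariant to both shading and highlights, normalized spectra only to shading, and raw spectra to neither. So an edge present only in the raw spectra is shadow/geometry, one also present in the normalized spectra but not in hue is a highlight, and one present in hue is a material edge. This decision order is an assumption reconstructed from the stated invariances, and the function name is hypothetical:

```python
def classify_edge(raw_edge: bool, normalized_edge: bool, hue_edge: bool) -> str:
    """Label an edge by which representations respond to it.
    Hue sees only material changes; normalized spectra additionally see
    highlights; raw spectra additionally see shading and geometry."""
    if hue_edge:
        return "material"
    if normalized_edge:
        return "highlight"
    if raw_edge:
        return "shadow-geometry"
    return "no edge"
```

For example, an edge detected in the raw and normalized hyperspectra but absent in hue would be labeled a highlight edge.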