Our purpose is to extend the Local Binary Pattern (LBP) method to three dimensions and to compare it with the two-dimensional model for three-dimensional texture analysis. To compare these two methods, we conducted classification experiments using three databases of three-dimensional texture images with different properties. The first database is a set of three-dimensional images without any distortion or transformation, the second contains additive Gaussian noise, and the last contains textures similar to those of the first but with random rotations about the x, y and z axes. For each of these databases, the three-dimensional Local Binary Pattern method outperforms the two-dimensional approach, which has more difficulty providing correct classifications.
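As background for the comparison above, a minimal sketch of the standard two-dimensional LBP operator is shown below; the three-dimensional extension replaces the 8 planar neighbours of each pixel with the 26 voxel neighbours. The function name, bit ordering and thresholding convention here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def lbp_2d(image):
    """Basic 8-neighbour Local Binary Pattern on a 2D grayscale image.

    Each interior pixel is compared with its 8 neighbours; a neighbour
    greater than or equal to the centre contributes a 1 bit, giving an
    8-bit code per pixel. The clockwise bit ordering is one common
    convention among several.
    """
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    centre = img[1:-1, 1:-1]
    # neighbour offsets, clockwise from top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy: h - 1 + dy, 1 + dx: w - 1 + dx]
        codes |= (neigh >= centre).astype(np.uint8) << bit
    return codes
```

A texture descriptor is then typically the histogram of these codes over the image; in 3D the same thresholding logic yields a 26-bit code per voxel.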
This paper presents a multiresolution system for volumetric texture analysis. Its originality lies partly in its use of combinations of perceptual texture features that correspond to adjectives commonly used by humans to describe textures. To approximate these features, we use a combination of different families of texture analysis methods rather than a single texture analysis model. This choice is necessary to obtain a good perceptual feature approximation and makes our system robust and generic. Moreover, with our human-understandable features (HUF), a user can conveniently manipulate and select the features that are, in the user's view, relevant for a given application. Two experiments are presented: the first demonstrates the strong correspondence between our features and a human's description of textures, and the second demonstrates the performance of the proposed method. Finally, the proposed HUF are integrated into an interactive segmentation system and compared with previously proposed descriptors through the analysis of several segmentation results on 3D ultrasound images.
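To illustrate the general idea of approximating one perceptual adjective from several low-level descriptors, here is a deliberately simple sketch: the outputs of different texture analysis methods are linearly combined into a single score. The function name, weighting scheme and descriptor inputs are hypothetical; the paper's actual combination of method families may differ substantially.

```python
def perceptual_feature(descriptor_values, weights):
    """Weighted combination of low-level texture descriptor outputs
    into one human-understandable feature score (e.g. 'coarseness').

    descriptor_values: outputs of different texture analysis methods,
    assumed already normalised to a comparable scale.
    weights: per-descriptor importance, e.g. learned from human ratings.
    """
    if len(descriptor_values) != len(weights):
        raise ValueError("one weight per descriptor is required")
    total = sum(weights)
    return sum(v * w for v, w in zip(descriptor_values, weights)) / total
```

Because each HUF is a named, scalar score, a user can inspect and keep only the features deemed relevant for a given application.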
This paper presents a new framework for the interactive segmentation of 3D images. The framework is based on a bimodal data structure defined by a Region Adjacency Graph (RAG) associated with a Hierarchical Classification Tree (HCT). The RAG provides information about the spatial and topological organisation of the extracted regions of the image, while the HCT provides information about the similarities between those regions based on a predefined set of features. The first contribution of our work is the combination of a RAG and an HCT. An incremental system is obtained by defining operators that work with and on the RAG and the HCT. If a static predefined processing chain has been defined, these operators can be used in batch mode; if a scheduler is available, they can be used adaptively; and if a user chooses the operator to apply after each step, they can be used interactively. The second contribution of this paper is the formal description of these operators. To give the user the ability to incrementally improve the segmentation, powerful visualisations of the segmentation state and interfaces have been proposed, an important advantage of the proposed framework. To validate the proposed framework, a user study was conducted on a concrete case of texture segmentation. Our system obtains very satisfactory results even for complex volumetric textures and helps real users obtain high-quality segmentations. The system has been used by specialists in sonography to segment 3D ultrasound images of the skin. Some examples of segmentation are presented to illustrate the benefit of the interactivity provided by our approach.
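To make the RAG side of the bimodal structure concrete, here is a minimal sketch of an adjacency graph with one incremental operator, region merging, which fuses two adjacent regions and rewires their neighbours. The class and method names are hypothetical, and the paper's formal operators (and the HCT they interact with) are richer than this illustration.

```python
from collections import defaultdict

class RegionAdjacencyGraph:
    """Minimal RAG sketch: nodes are region ids, edges link
    spatially adjacent regions. An interactive or batch merge
    operator fuses two regions into one node."""

    def __init__(self):
        self.adj = defaultdict(set)  # region id -> set of adjacent ids

    def add_edge(self, a, b):
        self.adj[a].add(b)
        self.adj[b].add(a)

    def merge(self, a, b):
        """Fuse region b into region a; b's neighbours become a's."""
        for n in self.adj.pop(b, set()):
            self.adj[n].discard(b)
            if n != a:
                self.adj[n].add(a)
                self.adj[a].add(n)
        self.adj[a].discard(b)
        self.adj[a].discard(a)  # no self-loop after the merge
        return a
```

In the full framework, each merge would also update the HCT, so that region similarities stay consistent with the evolving partition after every batch, scheduled or interactive step.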