In recent years, 3D imaging has advanced considerably thanks to developments in 3D sensor technology, spurring strong interest in 3D image feature extraction and classification techniques. As the literature points out, textural content is one of the most important and discriminative features in images. Within this context, we propose a texture feature extraction technique for volumetric images with improved discrimination power, suitable for textured volumetric data classification tasks. To achieve this, we fuse two complementary feature vectors: one derived from Local Binary Patterns (LBP), which captures the local image pattern, and one derived from the Gray-Level Co-occurrence Matrix (GLCM), which captures the contrast, homogeneity and local anisotropy of the volumetric data. The performance of the proposed technique was evaluated on a public dataset of volumetric textured images affected by several transformations, using Support Vector Machine, k-Nearest Neighbours and Random Forest classifiers. Our method outperforms other handcrafted 3D and 2D texture feature extraction methods as well as typical deep-learning networks. The proposed technique improves the discrimination power and achieves promising results even when the number of images per class is relatively small.
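The fusion idea, concatenating an LBP histogram with GLCM-derived statistics, can be sketched in 2D for brevity (the paper's method operates on volumetric data). Everything below is an illustrative assumption, not the authors' implementation: a basic 8-neighbour LBP, a single horizontal-offset GLCM, and simple concatenation as the fusion step.

```python
import numpy as np

def lbp_histogram(img):
    """Basic 8-neighbour LBP histogram (256 bins), pure NumPy.

    Each pixel is compared with its 8 neighbours; the resulting bits
    form a code describing the local binary micro-pattern.
    """
    H, W = img.shape
    c = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx]
        codes |= (nb >= c).astype(np.uint8) << bit
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def glcm_features(img, levels=8):
    """Horizontal-offset GLCM plus contrast/homogeneity statistics."""
    q = img.astype(int) * levels // 256          # quantise grey levels
    i, j = q[:, :-1].ravel(), q[:, 1:].ravel()   # pixel pairs at offset (0, 1)
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (i, j), 1)
    glcm /= glcm.sum()                           # joint probability matrix
    r, c = np.indices(glcm.shape)
    contrast = ((r - c) ** 2 * glcm).sum()
    homogeneity = (glcm / (1 + np.abs(r - c))).sum()
    return np.array([contrast, homogeneity])

def fused_features(img):
    # Fusion by concatenating the two complementary descriptors
    return np.hstack([lbp_histogram(img), glcm_features(img)])
```

The LBP histogram describes the local pattern structure while the GLCM statistics summarise contrast and homogeneity, so the concatenated vector carries both kinds of information, which is the core of the fusion argument.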
This paper studies the use of deep-learning models (AlexNet, VggNet, ResNet) pre-trained on object categories (ImageNet) in applied texture classification problems such as plant disease detection. Research related to precision agriculture is highly relevant due to its potential economic impact on agricultural productivity and quality. Within this context, we propose a deep learning-based feature extraction method for the identification of plant species and the classification of plant leaf diseases. By reconsidering the common processing pipeline, we focus on results relevant to real-time processing scenarios that can be easily transferred to manned or unmanned agricultural smart machinery (e.g. tractors, drones, robots, IoT smart sensor networks). In our approach, texture features are extracted from different layers of pre-trained Convolutional Neural Network models and then fed to a machine-learning classifier. For the experimental evaluation, we used publicly available datasets of RGB textured images and datasets containing images of healthy and diseased plant leaves of different species. We compared our method to feature vectors derived from traditional handcrafted feature extraction descriptors computed on the same images, as well as to end-to-end deep-learning approaches. The proposed method proves significantly more efficient in terms of processing time and discriminative power, surpassing both traditional and end-to-end CNN-based methods, and also mitigates the problem of the limited datasets available for precision agriculture.
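The pipeline described above (freeze a pre-trained CNN, read features from an intermediate layer instead of the classification head, hand them to a conventional classifier) can be sketched in miniature. The fixed random filters below are a toy stand-in for pre-trained convolutional weights, not AlexNet, VggNet or ResNet, and the nearest-centroid rule stands in for the downstream machine-learning classifier; all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
# Frozen random 3x3 filters stand in for pre-trained convolutional weights
FILTERS = rng.standard_normal((8, 3, 3))

def cnn_style_features(img):
    """Valid 2-D convolution with frozen filters, ReLU, then global
    average pooling: features are taken from an intermediate layer
    rather than the network's final classification head."""
    H, W = img.shape
    feats = []
    for f in FILTERS:
        # Direct valid-mode convolution expressed as a weighted sum of shifts
        out = sum(f[dy, dx] * img[dy:H - 2 + dy, dx:W - 2 + dx]
                  for dy in range(3) for dx in range(3))
        feats.append(np.maximum(out, 0.0).mean())  # ReLU + global avg pool
    return np.array(feats)

def nearest_centroid_predict(train_feats, train_labels, x):
    """Minimal stand-in for the machine-learning classifier stage."""
    labels = sorted(set(train_labels))
    centroids = {l: np.mean([f for f, y in zip(train_feats, train_labels)
                             if y == l], axis=0)
                 for l in labels}
    return min(labels, key=lambda l: np.linalg.norm(x - centroids[l]))
```

Because the extractor runs once per image and the classifier operates on short fixed-length vectors, this split keeps inference cheap, which is the efficiency argument the abstract makes for real-time agricultural machinery.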