Monitoring structural parameters and damage to trees plays an important role in forest management. Remote-sensing data collected by an unmanned aerial vehicle (UAV) provide a valuable resource for improving the efficiency of decision making. In this work, we propose an approach that enhances algorithms for species classification and assessment of the vital status of forest stands by using automated individual tree crown delineation (ITCD). The approach can potentially be used for inventory and for identifying the health status of trees in regional-scale forest areas. The proposed ITCD algorithm proceeds in three stages: preprocessing (contrast enhancement), crown segmentation based on wavelet transformation and morphological operations, and boundary detection. The performance of the ITCD algorithm was demonstrated on test plots containing both homogeneous and complex-structured forest stands. For typical scenes, the crown contouring accuracy is about 95%. The pixel-by-pixel classification is based on the ensemble supervised classification method of error-correcting output codes (ECOC), with a Gaussian-kernel support vector machine chosen as the binary learner. We demonstrated that pixel-by-pixel species classification of multispectral images can be performed with a total error of about 1%, which is significantly lower than when processing RGB images. The advantage of the proposed approach lies in the combined processing of multispectral and RGB images.
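The ECOC-with-SVM classifier described above can be sketched with scikit-learn's `OutputCodeClassifier` wrapping an RBF (Gaussian) kernel `SVC`. The spectral band means and noise level below are invented toy values, not the paper's data; this only illustrates the ensemble construction, assuming a standard scikit-learn installation.

```python
import numpy as np
from sklearn.multiclass import OutputCodeClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical 4-band per-pixel samples for three tree species,
# drawn around distinct spectral means (toy data, not real reflectance).
means = np.array([[0.2, 0.4, 0.3, 0.6],
                  [0.5, 0.3, 0.6, 0.2],
                  [0.7, 0.7, 0.2, 0.4]])
X = np.vstack([m + 0.02 * rng.standard_normal((100, 4)) for m in means])
y = np.repeat([0, 1, 2], 100)

# ECOC ensemble: each class gets a binary code word; one Gaussian-kernel
# SVM is trained per code bit, and prediction picks the nearest code word.
clf = OutputCodeClassifier(estimator=SVC(kernel="rbf"),
                           code_size=2, random_state=0)
clf.fit(X, y)
acc = clf.score(X, y)
```

On such well-separated toy clusters the ensemble classifies essentially all pixels correctly; the abstract's ~1% total error refers to the authors' real multispectral data, not this sketch.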
In recent years, massive outbreaks of the European spruce bark beetle (Ips typographus (L.)) have caused colossal harm to coniferous forests. The main solution to this problem is the timely prevention of bark beetle spread, for which it is necessary to identify damaged trees in the early stages of infestation. Fortunately, high-resolution unmanned aerial vehicle (UAV) imagery together with modern detection models offers high potential for addressing such issues. In this work, we evaluate and compare three You Only Look Once (YOLO) deep neural network architectures, namely YOLOv2, YOLOv3, and YOLOv4, in the task of detecting infested trees in UAV images. We built a new dataset for training and testing these models and used a pre-processing balance contrast enhancement technique (BCET) that improves the generalization capacity of the models. Our experiments show that YOLOv4 achieves particularly good results when the BCET pre-processing is applied. The best test result among the compared YOLO models was obtained for YOLOv4, with a mean average precision of up to 95%. As a result of applying artificial data augmentation, the improvement for the YOLOv2, YOLOv3, and YOLOv4 models was 65.0%, 7.22%, and 3.19%, respectively.
Index Terms: Norway spruce, bark beetle, unmanned aerial vehicle (UAV), You Only Look Once (YOLO), object detection.
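The BCET pre-processing step mentioned above can be sketched as a parabolic stretch y = a(x − b)² + c that maps the input minimum, maximum, and mean exactly onto chosen output values (the classical Guo & Liu formulation). This is a minimal NumPy sketch under that assumption; the target values L, H, E below are illustrative defaults, not the authors' settings.

```python
import numpy as np

def bcet(x, L=0.0, H=255.0, E=110.0):
    """Balance contrast enhancement: parabolic stretch y = a*(x-b)^2 + c
    whose coefficients are solved so the input min/max/mean map exactly
    to the target output min (L), max (H), and mean (E)."""
    x = x.astype(np.float64)
    l, h, e = x.min(), x.max(), x.mean()
    s = np.mean(x ** 2)  # mean of squared values, needed for the mean constraint
    b = (h * h * (E - L) - s * (H - L) + l * l * (H - E)) / \
        (2.0 * (h * (E - L) - e * (H - L) + l * (H - E)))
    a = (H - L) / ((h - l) * (h + l - 2.0 * b))
    c = L - a * (l - b) ** 2
    return a * (x - b) ** 2 + c
```

The transform is monotonic only when the vertex b falls outside the input range, which holds for typical image histograms; per-band application to a multispectral or RGB image is done by calling `bcet` on each channel.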
The identification, segmentation, and detection of infected areas in brain tumors is a tedious and time-consuming task. Different structures of the human body can be visualized by image-processing techniques such as MRI, whereas it is very difficult to visualize abnormal structures of the human brain using simple imaging techniques. MRI offers many imaging modalities that scan and capture the internal structure of the human brain. This article concentrates on a noise-removal technique, followed by enhancement of the medical images for correct diagnosis using the balance contrast enhancement technique (BCET). Then, image segmentation is applied. Finally, the Canny edge detection method is used to detect fine edges. The experimental results achieved nearly 98% accuracy in detecting tumor areas and normal brain regions in MRI images, demonstrating the effectiveness of the proposed technique.
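The edge-detection step in this pipeline can be illustrated with a minimal gradient-based edge map. The sketch below is a simplified stand-in for Canny, assuming only NumPy: it keeps the Sobel gradient computation and a global threshold but omits Canny's non-maximum suppression and hysteresis stages.

```python
import numpy as np

def conv2(img, k):
    """3x3 'same'-size filtering with zero padding (applied as correlation)."""
    p = np.pad(img, 1)
    out = np.zeros_like(img, dtype=np.float64)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def sobel_edges(img, thresh=0.3):
    """Gradient-magnitude edge map: Sobel gradients in x and y, then a
    threshold relative to the strongest edge response."""
    img = img.astype(np.float64)
    kx = np.array([[-1.0, 0.0, 1.0],
                   [-2.0, 0.0, 2.0],
                   [-1.0, 0.0, 1.0]])
    gx, gy = conv2(img, kx), conv2(img, kx.T)
    mag = np.hypot(gx, gy)
    return mag > thresh * mag.max()
```

Applied to a synthetic image with a vertical intensity step, the map fires on the two pixel columns straddling the step and stays quiet in flat regions; the full Canny detector sharpens this to a one-pixel-wide contour.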