Among high school students, few are willing to perform automated external defibrillation. Willingness to perform mouth-to-mouth resuscitation (MMR) and chest compression (CC) appears to depend on the circumstances.
Because of their greater efficiency relative to conventional methods, interest has developed in using vegetation indices in soybean [Glycine max (L.) Merr.] to identify areas in a field experiencing injury by defoliating insects. Vegetation indices can indicate leaf area index (LAI) and light interception, canopy parameters affected by defoliating insects. Our objectives were to determine the relative accuracy of three vegetation indices for predicting LAI and light interception, and to outline a method for using vegetation indices to identify areas in a field experiencing insect injury. Several commercial soybean cultivars were planted on a Commerce silt loam soil (fine‐silty, mixed, nonacid, thermic Aeric Fluvaquent) near Baton Rouge, LA (USA) (30° N lat) in May 2004 and June 2005. In 2004, differences in LAI and light interception were created by manual defoliation, whereas in 2005, differences in LAI and light interception arose from cultivars and planting dates. Results indicated that across canopies ranging from very low LAI to canopy closure (95% light interception), the normalized difference vegetation index (NDVI) most accurately predicted LAI and light interception (r2 = 0.93–0.97). Light interception and LAI were linked to NDVI by strong linear regression models and did not show the quadratic response reported by others. A proposed method for using NDVI to identify insect‐infested areas is presented.
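The relationship described above can be sketched in code: NDVI is computed from near-infrared and red reflectance, and a linear model then predicts LAI from NDVI. The calibration values below are hypothetical illustrations, not data from the study; the abstract reports only the linear form and the r2 range.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Hypothetical calibration pairs: NDVI readings with field-measured LAI.
ndvi_obs = np.array([0.20, 0.35, 0.50, 0.65, 0.80])
lai_obs = np.array([0.5, 1.5, 2.8, 4.0, 5.3])

# Fit the linear model LAI = a * NDVI + b by least squares.
a, b = np.polyfit(ndvi_obs, lai_obs, 1)

# Coefficient of determination (r^2) for the fit.
pred = a * ndvi_obs + b
ss_res = np.sum((lai_obs - pred) ** 2)
ss_tot = np.sum((lai_obs - lai_obs.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

With such a fitted model, pixels whose predicted LAI falls well below that of neighboring areas at the same growth stage would flag candidate insect-injured zones.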
Bird predation is a major concern for fish culture in open ponds. A novel method for dispersing birds is the use of autonomous vehicles, and image recognition software can improve their efficiency. Several image processing techniques for recognizing birds were tested, including a series of morphological operations. We divided images into three types, Type 1, Type 2, and Type 3, based on the difficulty of recognizing birds: Type 1 images were clear, Type 2 images were moderately clear, and Type 3 images were unclear. Local thresholding was implemented using the HSV (Hue, Saturation, and Value), grayscale, and RGB (Red, Green, and Blue) color models on all three types of images, and the results were tabulated. Template matching using normalized correlation and artificial neural networks (ANN) were the other methods developed in this study, in addition to image morphology. Template matching produced satisfactory results irrespective of the difficulty level of the images, whereas artificial neural networks produced accuracies of 100, 60, and 50% on Type 1, Type 2, and Type 3 images, respectively. The correct classification rate could be increased by further training. Future research will focus on testing the recognition algorithms in natural or aquacultural settings on autonomous boats. Applications of such techniques to industrial, agricultural, or related areas are additional future possibilities.
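Template matching by normalized correlation, one of the methods named above, can be illustrated as follows: a bird template is slid across the image and the zero-mean normalized correlation is computed at each position, with the peak score marking the best match. This is a generic sketch of the technique, not the study's actual implementation, and the function names are our own.

```python
import numpy as np

def normalized_correlation(patch, template):
    """Zero-mean normalized correlation between two equal-shaped arrays.

    Returns a value in [-1, 1]; 1 means the patch is a linear match
    to the template. A flat (zero-variance) patch scores 0.
    """
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return (p * t).sum() / denom if denom > 0 else 0.0

def match_template(image, template):
    """Slide the template over the image (grayscale, 2-D arrays).

    Returns (best_score, (row, col)) for the top-left corner of the
    best-matching window. Brute-force search; fine for small images.
    """
    ih, iw = image.shape
    th, tw = template.shape
    best_score, best_loc = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = normalized_correlation(image[r:r + th, c:c + tw], template)
            if score > best_score:
                best_score, best_loc = score, (r, c)
    return best_score, best_loc
```

A detection is typically accepted only if the peak score exceeds a tuned threshold, which is one reason this approach can stay robust across image difficulty levels.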