Marine megafauna are difficult to observe and count because many species travel widely and spend large amounts of time submerged. As such, management programmes seeking to conserve these species are often hampered by limited information about population levels. Unoccupied aircraft systems (UAS, or drones) provide a potentially useful technique for assessing marine animal populations, but a central challenge lies in analysing the vast amounts of data generated in the images or video acquired during each flight. Neural networks are emerging as a powerful tool for automating object detection across data domains and can be applied to UAS imagery to generate new population‐level insights. To explore the utility of these emerging technologies in a challenging field setting, we used neural networks to enumerate olive ridley turtles Lepidochelys olivacea in drone images acquired during a mass‐nesting event on the coast of Ostional, Costa Rica. Results revealed substantial promise for this approach; specifically, our model detected 8% more turtles than manual counts while reducing the manual validation burden from 2,971,554 to 44,822 image windows. Our detection pipeline was trained on a relatively small set of turtle examples (N = 944), implying that this method can be easily bootstrapped for other applications and is practical with real‐world UAS datasets. Our findings highlight the feasibility of combining UAS and neural networks to estimate population levels of diverse marine animals and suggest that the automation inherent in these techniques will soon permit monitoring over spatial and temporal scales that would previously have been impractical.
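The window-filtering idea described above (a detector scores every image window and only high-scoring candidates go to human validators) can be sketched in a few lines. This is an illustrative outline, not the authors' pipeline; `score_fn` is a hypothetical stand-in for a trained CNN's turtle-probability output.

```python
import numpy as np

def candidate_windows(image, window=64, stride=64, score_fn=None, threshold=0.5):
    """Slide a fixed-size window over an image and keep only windows whose
    detector score exceeds a threshold, so humans validate a small candidate
    set rather than every window."""
    h, w = image.shape[:2]
    kept, total = [], 0
    for y in range(0, h - window + 1, stride):
        for x in range(0, w - window + 1, stride):
            total += 1
            patch = image[y:y + window, x:x + window]
            score = score_fn(patch)  # e.g. a trained model's turtle probability
            if score >= threshold:
                kept.append((y, x, score))
    return kept, total
```

With a real detector, the ratio `len(kept) / total` is exactly the kind of reduction the abstract reports (44,822 windows retained out of 2,971,554).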
Very high-resolution satellite imagery (≤5 m resolution) has become available on a spatial and temporal scale appropriate for dynamic wetland management and conservation across large areas. Estuarine wetlands have the potential to be mapped at a detailed habitat scale with a frequency that allows immediate monitoring after storms, in response to human disturbances, and in the face of sea-level rise. Yet mapping requires significant fieldwork to train and validate modern classification algorithms, and estuarine environments can be difficult to access and are environmentally sensitive. Recent advances in unoccupied aircraft systems (UAS, or drones), coupled with their increased availability, present a solution. UAS can cover a study site with ultra-high-resolution (<5 cm) imagery, allowing visual validation. In this study, we used UAS imagery to assist in training a Support Vector Machine to classify WorldView-3 and RapidEye satellite imagery of the Rachel Carson Reserve in North Carolina, USA. UAS and field-based accuracy assessments were employed for comparison across validation methods. We created and examined an array of indices and layers including texture, NDVI, and a LiDAR DEM. Our results demonstrate classification accuracy on par with previous extensive fieldwork campaigns (93% UAS and 93% field for WorldView-3; 92% UAS and 87% field for RapidEye). Examining change between 2004 and 2017, we found drastic shoreline change but general stability of emergent wetlands. Both WorldView-3 and RapidEye were found to be valuable sources of imagery for habitat classification, with the main tradeoff being WorldView's fine spatial resolution versus RapidEye's temporal frequency. We conclude that UAS can be highly effective in training and validating satellite imagery.
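One of the input layers mentioned above, NDVI, has a standard textbook definition computed from the red and near-infrared bands. This is the general formula, not the study's code:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values near +1 suggest dense vegetation; values near 0 or below suggest
    bare ground or water. eps guards against division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

Layers such as this are typically stacked with the raw spectral bands and texture measures to form the feature vector fed to a classifier like a Support Vector Machine.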
The flourishing application of drones within marine science provides more opportunity to conduct photogrammetric studies on large and varied populations of many different species. While these new platforms are increasing the size and availability of imagery datasets, established photogrammetry methods require considerable manual input, which allows individual bias in technique to influence measurements, increases error, and magnifies analysis time. Here, we introduce the next generation of photogrammetry methods, utilizing a convolutional neural network to demonstrate the potential of a deep learning‐based photogrammetry system for automatic species identification and measurement. We then present the same data analysed using conventional techniques to validate our automatic methods. Our results compare favorably across both techniques, correctly predicting whale species with 98% accuracy (57/58) for humpback whales, minke whales, and blue whales. Ninety percent of automated length measurements were within 5% of manual measurements, providing sufficient resolution to inform morphometric studies and establish size classes of whales automatically. The results of this study indicate that deep learning techniques applied to survey programs that collect large archives of imagery may help researchers and managers move quickly past analytical bottlenecks and provide more time for abundance estimation, distributional research, and ecological assessments.
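The length measurements above rest on standard photogrammetric scaling: the ground sample distance (meters per pixel) follows from flight altitude, lens focal length, and sensor pixel pitch, and a pixel-space length scales accordingly. A sketch of that arithmetic, with illustrative parameter values not taken from the study:

```python
def ground_sample_distance(altitude_m, focal_length_mm, pixel_pitch_um):
    """Meters on the ground covered by one pixel in a nadir photograph."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

def body_length_m(length_px, altitude_m, focal_length_mm, pixel_pitch_um):
    """Convert a length measured in image pixels to meters."""
    return length_px * ground_sample_distance(altitude_m, focal_length_mm, pixel_pitch_um)
```

In an automated pipeline, the CNN supplies `length_px` (e.g. from a rostrum-to-fluke measurement on a detected whale), while altitude comes from the drone's flight log.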
It is increasingly important to understand the extent and health of coastal natural resources in the face of anthropogenic and climate-driven changes. Coastal ecosystems are difficult to efficiently monitor due to the inability of existing remotely sensed data to capture complex spatial habitat patterns. To help managers and researchers avoid inefficient traditional mapping efforts, we developed a deep learning tool (OysterNet) that uses unoccupied aircraft systems (UAS) imagery to automatically detect and delineate oyster reefs, an ecosystem that has proven problematic to monitor remotely. OysterNet is a convolutional neural network (CNN) that assesses intertidal oyster reef extent, yielding a difference in total area between manual and automated delineations of just 8%, attributable in part to OysterNet's ability to detect oysters overlooked during manual demarcation. Further training of OysterNet could enable assessments of oyster reef heights and densities, and incorporation of more coastal habitat types. Future iterations will be applied to high-resolution satellite data for effective management at larger scales.
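The area comparison described above reduces to counting segmented pixels and scaling by the ground sample distance. A minimal sketch of that bookkeeping (the function names are ours, not OysterNet's):

```python
import numpy as np

def mask_area_m2(mask, gsd_m):
    """Area covered by a binary segmentation mask, given the ground
    sample distance in meters per pixel side."""
    return int(np.count_nonzero(mask)) * gsd_m ** 2

def percent_difference(auto_area, manual_area):
    """Relative difference between automated and manual areas, in percent."""
    return 100.0 * abs(auto_area - manual_area) / manual_area
```

Applied to a CNN's output mask and a manually digitized polygon rasterized to the same grid, `percent_difference` yields the kind of figure the abstract reports (8%).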
Population monitoring of colonial seabirds is often complicated by the large size of colonies, remote locations, and close inter- and intra-species aggregation. While drones have been successfully used to monitor large inaccessible colonies, the vast amount of imagery collected introduces a data analysis bottleneck. Convolutional neural networks (CNN) are evolving as a prominent means for object detection and can be applied to drone imagery for population monitoring. In this study, we explored the use of these technologies to increase capabilities for seabird monitoring by using CNNs to detect and enumerate Black-browed Albatrosses (Thalassarche melanophris) and Southern Rockhopper Penguins (Eudyptes c. chrysocome) at one of their largest breeding colonies, in the Falkland (Malvinas) Islands. Our results showed that these techniques have great potential for seabird monitoring at significant and spatially complex colonies, correctly detecting and counting birds with accuracies of 97.66% (Black-browed Albatrosses) and 87.16% (Southern Rockhopper Penguins); 90% of automated counts were within 5% of manual counts from the imagery. The results of this study indicate CNN methods are a viable population assessment tool, providing opportunities to reduce manual labor, cost, and human error.
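The "within 5% of manual counts" statistic above is a simple agreement measure over paired per-image counts. A sketch, assuming one automated and one manual count per image (illustrative, not the authors' code):

```python
def fraction_within_tolerance(auto_counts, manual_counts, tol=0.05):
    """Fraction of automated counts within a relative tolerance of the
    corresponding manual counts (manual counts assumed nonzero)."""
    hits = sum(abs(a - m) <= tol * m for a, m in zip(auto_counts, manual_counts))
    return hits / len(manual_counts)
```

A value of 0.90 from this function corresponds to the abstract's claim that 90% of automated counts fell within 5% of the manual counts.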
Ocean physics and biology can interact in myriad and complex ways. Eddies, features found at many scales in the ocean, can drive substantial changes in physical and biogeochemical fields with major implications for marine ecosystems. Mesoscale eddies are challenging to model and difficult to observe at sea due to their fine-scale variability yet broad extent. In this work, we observed a frontal eddy just north of Cape Hatteras via an intensive hydrographic, biogeochemical, and optical sampling campaign. Frontal eddies occur in western boundary currents around the globe, and there are major gaps in our understanding of their ecosystem impacts. In the Gulf Stream, frontal eddies have been studied in the South Atlantic Bight, where they are generally assumed to shear apart when passing Cape Hatteras. However, we found that the observed frontal eddy had different physical properties and phytoplankton community composition from adjacent water masses, in addition to continued cyclonic rotation. We first synthesize the overall ecological impacts of frontal eddies in a simple conceptual model. This conceptual model led to the hypothesis that frontal eddies could be well timed to supply zooplankton to secondary consumers off Cape Hatteras, where there is a notably high concentration and diversity of top predators. Towards testing this hypothesis and our conceptual model, we report on the biogeochemical state of this particular eddy, connecting physical and biological dynamics; analyze how it differs from Gulf Stream and shelf waters even as it decays; and refine our initial model with these new data.