Abstract: We have developed novel surface-enhanced Raman scattering (SERS) substrates with three-dimensional (3D) porous structures that effectively improve the sensitivity and reproducibility of SERS and enable rapid detection of small molecules (rhodamine 6G as an example). Periodic arrays of honeycomb-like substrates were fabricated by the self-assembly of polyurethane-co-azetidine-2,4-dione (PU-PAZ) polymers. PU-PAZ, which comprises amphiphilic dendrons, stabilizes the phase separation between water droplets and the polymer solution and then organizes into regular porous structures during the breath figure process. SERS substrates were subsequently fabricated by immobilizing gold nanoparticles (AuNPs) onto honeycomb-like films whose 3D porous structures were controlled by varying the PU-PAZ concentration and relative humidity. Results show that the surface enhancement factors of the honeycomb-like substrates were 20 times higher than those of flat-film substrates (control group), owing to abundant hot-spot resonance effects in the 3D porous structure, as verified by Raman mapping at various positions along the z-axis. Furthermore, particle size effects were evaluated by immobilizing 12 and 67 nm AuNPs on the honeycomb-like substrates, indicating that larger AuNPs induce more pronounced hot-spot effects. The generation of hot-spot resonance that enhances the Raman intensity thus depends strongly on both the AuNP diameter and the pore size of the 3D porous honeycomb-like substrates, which enable label-free and rapid SERS detection.
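The 20-fold figure above is a surface enhancement factor (EF). The abstract does not state the exact expression the authors used, but the conventional definition compares the per-molecule SERS intensity with the per-molecule intensity of a non-enhanced reference measurement:

```latex
\mathrm{EF} \;=\; \frac{I_{\mathrm{SERS}}/N_{\mathrm{SERS}}}{I_{\mathrm{ref}}/N_{\mathrm{ref}}}
```

where \(I_{\mathrm{SERS}}\) and \(I_{\mathrm{ref}}\) are the Raman intensities of the analyte on the SERS substrate and on the reference (e.g. flat-film) substrate, and \(N_{\mathrm{SERS}}\) and \(N_{\mathrm{ref}}\) are the corresponding numbers of probed molecules.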
Global climate change has had a drastic impact on our environment. Previous studies have shown that pest outbreaks arising from global climate change can kill a tremendous number of trees, and these dead trees inevitably become a contributing factor in forest fires. The condition of a forest is therefore an important portent of forest fire, and aerial image-based forest analysis enables early detection of dead and living trees. In this paper, we apply a synthetic method to enlarge the imagery dataset and present a new framework for automated dead tree detection from aerial images using a retrained Mask R-CNN (Mask Region-based Convolutional Neural Network) with a transfer learning scheme. We apply our framework to our aerial imagery datasets and compare eight fine-tuned models. The mean average precision (mAP) score for the best of these models reaches 54%. Following the automated detection, we automatically produce dead-tree masks and count them to label the dead trees in an image, as an indicator of forest health that could be linked to the causal analysis of environmental changes and the predictive likelihood of forest fire.
INDEX TERMS Deep learning, aerial imaging, remote sensing, forest health diagnosis, climate change, forest fire
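For reference, the mAP reported above is the per-class average precision (AP), i.e. the area under the precision-recall curve of the detector, averaged over classes. The sketch below is a generic, non-interpolated AP computation for a single class, not necessarily the evaluation code used in the paper; the IoU-matching step that produces the true-positive flags is assumed to have happened upstream.

```python
import numpy as np

def average_precision(scores, is_true_positive, n_ground_truth):
    """Area under the precision-recall curve for one class.

    scores: confidence of each detection.
    is_true_positive: whether each detection matched a ground-truth
    instance (e.g. mask IoU >= 0.5); each ground truth matched at most once.
    n_ground_truth: total number of ground-truth instances for the class.
    """
    order = np.argsort(scores)[::-1]                  # rank detections by confidence
    flags = np.asarray(is_true_positive)[order]
    tp = np.cumsum(flags)                             # cumulative true positives
    fp = np.cumsum(~flags)                            # cumulative false positives
    recall = tp / n_ground_truth
    precision = tp / (tp + fp)
    # Non-interpolated AP: sum precision weighted by recall increments.
    ap, prev_recall = 0.0, 0.0
    for p, r in zip(precision, recall):
        ap += p * (r - prev_recall)
        prev_recall = r
    return ap
```

mAP is then the mean of this quantity over all object classes (here there is effectively one class, dead trees, possibly averaged over IoU thresholds depending on the benchmark convention).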
Intelligent transport systems (ITS) are pivotal in the development of sustainable and green urban living. ITS is data-driven and enabled by a profusion of sensors, ranging from pneumatic tubes to smart cameras, which are used to detect and categorise passing vehicles. Simple sensors, such as pneumatic tubes, are successfully deployed for counting passing vehicles but are not useful for vehicle tracking or re-identification. Smart cameras, on the other hand, collect comprehensive information but suffer from occlusion, patchy coverage, and compromised vision in adverse weather and poor visibility. This work explores a novel ITS data source in which an optical fibre acts as an uninterrupted array of virtual sensors through a distributed acoustic sensor (DAS) system. Based on real DAS data collected in the field, we first present a study of latent DAS features that uniquely identify a given vehicle, otherwise referred to as the vehicle signature. We formulate a classification problem that examines incoming DAS data to extract vehicle signatures and identify different vehicle types. To this end, we implement several classification methods and present a comparative performance analysis that reveals novel insights into the potential role of DAS in ITS applications. This work is a pilot study of DAS for vehicle classification, driven by real DAS data and validated by promising results in which a vehicle's type is correctly identified with 94% accuracy and its size with 95% accuracy.
INDEX TERMS Intelligent transport system (ITS), distributed acoustic sensors (DAS), classification, vehicle type
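The abstract does not specify which latent features or classifiers were used. As a purely illustrative sketch of the signature-extraction idea, simple per-vehicle features (energy, peak amplitude, dominant frequency) could be computed from a 1-D DAS strain trace and classified with a nearest-centroid rule; every name, feature, and label below is hypothetical, not the authors' pipeline.

```python
import numpy as np

def signature_features(trace, fs=1000.0):
    """Crude features of one vehicle pass from a 1-D DAS trace
    sampled at fs Hz: total energy, peak amplitude, dominant frequency."""
    trace = np.asarray(trace, dtype=float)
    energy = float(np.sum(trace ** 2))
    peak = float(np.max(np.abs(trace)))
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
    dom_freq = float(freqs[np.argmax(spectrum)])
    return np.array([energy, peak, dom_freq])

def nearest_centroid(features, centroids, labels):
    """Assign the label of the closest class centroid in feature space."""
    distances = [np.linalg.norm(features - c) for c in centroids]
    return labels[int(np.argmin(distances))]
```

In practice a real DAS classifier would operate on the full 2-D space-time strain field and use learned rather than hand-picked features; this sketch only illustrates the signature-to-label mapping the abstract describes.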