Obstructive sleep apnea-hypopnea syndrome (OSAHS) is a sleep-related respiratory disorder, and snoring is its most common and direct symptom. However, current snoring detection methods require substantial medical manpower and equipment, so many OSAHS patients cannot be treated in time. Therefore, this paper proposes a deep-learning-based snore detection method together with a snore dataset. The method first computes the time-domain waveform, spectrogram, and Mel-spectrogram for each audio segment in the dataset; snores are then recognized by a convolutional neural network. To make the method suitable for mobile and other intelligent devices, MobileNetV2 is selected as the detection network to classify snoring and non-snoring images. Experimental results show that the proposed method recognizes snores with 95.00% accuracy, and that the spectrogram best reflects the difference between snoring and non-snoring images.
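The two frequency-domain representations the abstract mentions can be sketched in NumPy alone: a Hann-windowed short-time FFT gives the power spectrogram, and a triangular mel filterbank maps it to a Mel-spectrogram. All parameter values here (sampling rate, FFT size, hop, number of mel bands) are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def stft_power(x, n_fft=512, hop=256):
    """Power spectrogram via a Hann-windowed short-time FFT."""
    win = np.hanning(n_fft)
    frames = [x[i:i + n_fft] * win
              for i in range(0, len(x) - n_fft + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)).T ** 2  # (freq, time)

def mel_filterbank(sr, n_fft, n_mels):
    """Triangular filters equally spaced on the mel scale."""
    mel = lambda f: 2595 * np.log10(1 + f / 700)
    inv = lambda m: 700 * (10 ** (m / 2595) - 1)
    pts = inv(np.linspace(0, mel(sr / 2), n_mels + 2))
    bins = np.floor((n_fft + 1) * pts / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(n_mels):
        l, c, r = bins[i], bins[i + 1], bins[i + 2]
        fb[i, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fb[i, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    return fb

sr = 16000
wave = np.sin(2 * np.pi * 120 * np.arange(sr) / sr)  # synthetic 1-s "segment"
spec = stft_power(wave)                 # linear-frequency power spectrogram
mel = mel_filterbank(sr, 512, 64) @ spec  # 64-band Mel-spectrogram
print(spec.shape, mel.shape)
```

In practice the spectrogram and Mel-spectrogram would be log-scaled and rendered as images before being fed to MobileNetV2.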
Illegal construction should be detected as early as possible because it damages the environment and the economy. However, existing methods for detecting illegal construction leave room for improvement in detection cycle, accuracy, and speed, and there are relatively few valuable real-world image datasets for this task. To address these issues, a high-precision real-time detection model named YEMNet and a new large-scale dataset of illegal construction objects (ICOS) are proposed herein. YEMNet is based on the You Only Look Once v3 (YOLOv3) object detection model and adopts the lightweight convolutional neural network EfficientNet as its backbone for feature extraction. YEMNet then employs the Mish activation function outside the backbone to achieve efficient optimization and strong generalization, thereby improving recognition accuracy for ICOS in complicated scenes. The proposed dataset comprises 15 categories and 13,701 photographs of ICOS captured under varying weather, lighting, and natural-scene conditions. Extensive experiments on the proposed dataset show that YEMNet achieves a mean average precision of 91.41% with fewer parameters, thereby outperforming state-of-the-art object detectors. Our dataset and code are available at https://github.com/king-king-king/ICOS-Dataset.
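The Mish activation that YEMNet uses is defined as mish(x) = x · tanh(softplus(x)); a minimal NumPy sketch (using a numerically stable softplus, an implementation detail assumed here rather than taken from the paper) looks like this:

```python
import numpy as np

def mish(x):
    """Mish activation: x * tanh(softplus(x)).
    softplus(x) = log(1 + exp(x)) is computed as
    max(x, 0) + log1p(exp(-|x|)) to avoid overflow for large |x|."""
    softplus = np.maximum(x, 0) + np.log1p(np.exp(-np.abs(x)))
    return x * np.tanh(softplus)

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(np.round(mish(x), 4))
```

Unlike ReLU, Mish is smooth and lets small negative values pass through, which is often credited with its better optimization behavior.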
Metadata prefetching and data placement are essential for improving access performance in wide-area network file systems, yet constructing efficient approaches to metadata prefetching under concurrent workloads and to data placement across a wide-area network remains challenging. This paper proposes efficient approaches for both, using fine-grained control of prefetching policies and variable-size data fragment writing to maximize the I/O bandwidth of distributed files. The proposed metadata prefetching approach dynamically detects the dominant workload and adaptively adjusts the prefetching policy to improve metadata access performance in concurrent workload scenarios. The proposed data placement approach places written data fragments in the local data center to improve write performance and sends only the location information of the fragments to the remote data center that holds the original file. Experimental results using real system traces indicate that file system metadata and application data access times can be reduced by up to 33.5% and 17.19%, respectively, compared to state-of-the-art methods.
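The data placement idea described above can be sketched as a tiny bookkeeping routine: the written bytes stay in the local data center, and only a fragment location record travels to the file's home center. All names and structures below (Fragment, FileIndex, the message tuple) are hypothetical illustrations, not the paper's actual design.

```python
from dataclasses import dataclass, field

@dataclass
class Fragment:
    """One variable-size fragment of a written file."""
    offset: int
    length: int
    center: str  # data center that physically stores the bytes

@dataclass
class FileIndex:
    home_center: str                       # center holding the original file
    fragments: list = field(default_factory=list)

def write_fragment(index, local_center, offset, data, remote_link):
    """Place the written bytes locally; ship only the fragment's
    location record (not the data) to the file's home center."""
    frag = Fragment(offset, len(data), local_center)
    index.fragments.append(frag)
    if local_center != index.home_center:
        # metadata-only message: (offset, length, center)
        remote_link.append((frag.offset, frag.length, frag.center))
    return frag

# usage: a client in "dc-west" writes 4 KiB to a file homed in "dc-east"
remote_msgs = []
idx = FileIndex(home_center="dc-east")
write_fragment(idx, "dc-west", 0, b"x" * 4096, remote_msgs)
print(remote_msgs)  # only location metadata crosses the WAN
```

The point of the design is that the wide-area link carries a few dozen bytes of metadata per write instead of the fragment payload itself.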