Study Objective: Validate a novel method for sleep-wake staging in mice using noninvasive electric field (EF) sensors.

Methods: Mice were implanted with electroencephalogram (EEG) and electromyogram (EMG) electrodes and housed individually. Noninvasive EF sensors were attached to the exterior of each chamber to record respiration and other movement simultaneously with EEG, EMG, and video. A sleep-wake scoring method based on EF sensor data was developed with reference to EEG/EMG and then validated by three expert scorers. Additionally, novice scorers without sleep-wake scoring experience were self-trained to score sleep using only the EF sensor data, and results were compared to those from expert scorers. Lastly, the ability to capture three-state sleep-wake staging with EF sensors attached to traditional mouse home cages was tested.

Results: EF sensors quantified wake, rapid eye movement (REM) sleep, and non-REM sleep with high agreement (>93%) and inter- and intra-scorer error comparable to EEG/EMG. Novice scorers successfully learned sleep-wake scoring using only EF sensor data and scoring criteria, and their scoring agreed closely with that of expert scorers (>91%).
Background: Rodent sleep scoring is principally reliant on electroencephalogram (EEG) and electromyogram (EMG), but this approach is invasive, can be expensive, and requires expertise and specialized equipment. Affordable, simple-to-use, and noninvasive ways to accurately quantify rodent sleep are needed.

New method: We developed and validated a new method for sleep-wake staging in mice using cost-effective, noninvasive electric field (EF) sensors that detect respiration and other movements. We validated recordings from EF sensors attached to the exterior of specialty chambers used to continuously capture sleep with EEG/EMG, then compared this to EF sensors attached to vivarium home cages.

Results: EF sensors quantified 3-state sleep architecture (wake, rapid eye movement (REM) sleep, and non-REM sleep) with high agreement (>93%) and inter- and intra-scorer error comparable to expert EEG/EMG scoring. Novices given an instruction document with examples were able to score sleep comparably to expert scorers (>91% agreement). Additionally, EF sensors were able to quantify 3-state sleep scoring in traditional mouse home cages.

Comparison with existing method: Most noninvasive sleep assessment technology requires animal contact, altered cage environments, and/or can only discern 2 states of arousal (wake or asleep). The EF sensors are able to discriminate REM from non-REM sleep accurately and from outside the animal's home cage.

Conclusions: EF sensors provide a simple and reliable method to accurately score 3-state sleep architecture (i) from outside the typical home cage, (ii) where noninvasive approaches are preferred, or (iii) where EEG/EMG is not possible.

Graphical Abstract: (figure not reproduced)

Accurately characterizing sleep-wake is crucial to understanding its impact on health, cognition, and injury recovery.
[1-3] Sleep staging in humans relies on substantial noninvasive instrumentation (polysomnography), including electroencephalogram (EEG), electromyogram (EMG) of the chin and limbs, electrooculogram (EOG) to detect rapid and rolling eye movements, along with chest wall or air flow movement, oximetry, and video. [4-6] However, rodent studies typically rely on fewer signals: invasive implanted EEG and EMG electrodes, less commonly invasive EOG electrodes, and video analysis. Invasive surgical implants can result in weight loss and extensive recovery, and commonly require use of a tether cable. [7-9] Moreover, surgical expertise, limited locations for electrode placement, and the specialized equipment required to collect EEG and EMG data may restrict inclusion of sleep analysis in an experimental design. For these reasons, there is a need to develop noninvasive methods to assess sleep in preclinical rodent models.

Most noninvasive rodent sleep assessment methods center around measures of gross body movement. Several video analysis techniques have been used to distinguish between 2-state conditions (wake or asleep), but are u...
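The inter-scorer agreement figures reported above (>93% expert agreement, >91% novice-to-expert agreement) are epoch-by-epoch comparisons of categorical stage labels. As an illustrative sketch only (not the authors' analysis pipeline), percent agreement and a chance-corrected statistic such as Cohen's kappa can be computed from two scorers' label sequences; the function names, epoch length, and example labels below are hypothetical:

```python
from collections import Counter

def percent_agreement(labels_a, labels_b):
    """Fraction of epochs on which two scorers assign the same stage."""
    assert len(labels_a) == len(labels_b) and labels_a
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return matches / len(labels_a)

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement for categorical epoch labels."""
    n = len(labels_a)
    p_observed = percent_agreement(labels_a, labels_b)
    counts_a = Counter(labels_a)
    counts_b = Counter(labels_b)
    # Expected agreement if both scorers labeled epochs independently
    # according to their own stage frequencies.
    p_expected = sum(counts_a[s] * counts_b.get(s, 0) for s in counts_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical epochs scored as wake (W), non-REM (N), or REM (R) sleep.
scorer1 = ["W", "W", "N", "N", "N", "R", "R", "W", "N", "N"]
scorer2 = ["W", "W", "N", "N", "R", "R", "R", "W", "N", "N"]
print(percent_agreement(scorer1, scorer2))  # 0.9
print(round(cohens_kappa(scorer1, scorer2), 3))  # 0.846
```

Percent agreement is the statistic quoted in the abstract; kappa additionally discounts agreement expected by chance, which matters when one stage (typically non-REM) dominates the recording.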