We present a novel radio-interference-based sensor localization method for wireless sensor networks. The technique relies on a pair of nodes emitting radio waves simultaneously at slightly different frequencies. The resulting composite signal has a carrier frequency between the two transmit frequencies and a very low-frequency envelope. Neighboring nodes can measure the energy of the envelope signal as the signal strength. The relative phase offset of this signal measured at two receivers is a function of the distances between the four nodes involved and the carrier frequency. By making multiple measurements in a network of at least eight nodes, it is possible to reconstruct the relative locations of the nodes in 3D. Our prototype implementation on the MICA2 platform yields an average localization error as small as 3 cm and a range of up to 160 meters. In addition to this high precision and long range, the other main advantage of the Radio Interferometric Positioning System (RIPS) is that it does not require any sensors other than the radio already used for wireless communication.
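The phase relationship described above can be written down directly: the relative phase offset measured at two receivers depends only on a linear combination of the four transmitter-receiver distances and the carrier wavelength. The sketch below illustrates that geometry; the node positions and the 430 MHz carrier are illustrative values, not taken from the paper.

```python
import math

C_LIGHT = 299_792_458.0  # speed of light, m/s

def dist(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def rips_phase_offset(tx_a, tx_b, rx_c, rx_d, carrier_hz):
    """Relative phase offset (mod 2*pi) of the low-frequency beat
    envelope measured at receivers C and D for transmitters A and B.
    It depends only on a linear combination of the four pairwise
    distances and the carrier wavelength, which is what makes
    relative node localization from many such measurements possible."""
    wavelength = C_LIGHT / carrier_hz
    d_range = (dist(tx_a, rx_d) - dist(tx_b, rx_d)
               + dist(tx_b, rx_c) - dist(tx_a, rx_c))
    return (2 * math.pi * d_range / wavelength) % (2 * math.pi)

# Example: four nodes in 3D, carrier near the MICA2 433 MHz band.
phase = rips_phase_offset((0, 0, 0), (10, 0, 0), (5, 8, 0), (5, -8, 1), 430e6)
```

Note that a configuration in which C and D sit symmetrically between A and B yields a zero offset, since the distance terms cancel; moving any node perturbs the phase, which is why enough pairwise measurements over-constrain the relative geometry.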
An ad-hoc wireless sensor network-based system is presented that detects and accurately locates shooters even in urban environments. The localization accuracy of the system in open terrain is competitive with that of existing centralized countersniper systems. However, the presented sensor network-based solution surpasses the traditional approach because it can mitigate acoustic multipath effects prevalent in urban areas and it can also resolve multiple simultaneous shots. These unique characteristics of the system are made possible by employing novel sensor fusion techniques that utilize the spatial and temporal diversity of multiple detections. In this article, in addition to the overall system architecture, the middleware services and the unique sensor fusion algorithms are described. An analysis of the experimental data gathered during field trials at US military facilities is also presented.
The paper presents a wireless sensor network-based mobile countersniper system. A sensor node consists of a helmet-mounted microphone array, a COTS MICAz mote for internode communication and a custom sensorboard that implements the acoustic detection and Time of Arrival (ToA) estimation algorithms on an FPGA. A 3-axis compass provides self-orientation and Bluetooth is used for communication with the soldier's PDA running the data fusion and the user interface. The heterogeneous sensor fusion algorithm can work with data from a single sensor or it can fuse ToA or Angle of Arrival (AoA) observations of muzzle blasts and ballistic shockwaves from multiple sensors. The system estimates the trajectory, the range, the caliber and the weapon type. The paper presents the system design and the results from an independent evaluation at the US Army Aberdeen Test Center. The system performance is characterized by 1-degree trajectory precision and over 95% caliber estimation accuracy for all shots, and close to 100% weapon estimation accuracy for 4 out of 6 guns tested. Keywords: Sensor networks, data fusion, acoustic source localization, weapon classification, caliber estimation. Acknowledgments: The DARPA IPTO ASSIST program has supported this research. We'd like to express our gratitude to Brian A. Weiss, Craig Schlenoff and the US Army Aberdeen Test Center, who carried out the independent evaluation of the system. We are also grateful to Miklós Maróti, Gyula Simon, Branislav Kusý, Béla Fehér, Sebestyén Dóra, Ken Pence, Ted Bapty, Jason Scott and the Nashville Police Academy for their contributions. We'd like to thank our shepherd, Dan Siewiorek, for his constructive comments.
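To give a feel for the single-sensor case mentioned above: a supersonic bullet's ballistic shockwave reaches a nearby sensor well before the muzzle blast, and under the simplifying assumption that the trajectory passes close to the sensor, the time gap between the two detections alone gives a rough range. This is a back-of-envelope simplification for illustration, not the system's actual fusion algorithm; the bullet speed is an assumed value.

```python
SOUND_SPEED = 343.0  # m/s in air at ~20 degrees C

def shooter_range(dt_seconds, bullet_speed=800.0, c=SOUND_SPEED):
    """Rough shooter range from the gap between the shockwave and
    muzzle-blast detections at a single sensor.

    Assumes the trajectory passes close to the sensor, so the shockwave
    arrives roughly d / bullet_speed after the shot while the muzzle
    blast arrives at d / c.  Then dt = d/c - d/v, which solves to:
        d = dt * c * v / (v - c)
    """
    return dt_seconds * c * bullet_speed / (bullet_speed - c)

# A 0.5 s shockwave-to-blast gap with an ~800 m/s bullet implies
# a shooter roughly 300 m away.
rng_m = shooter_range(0.5)
```

In the real system, fusing ToA and AoA observations from many sensors is what removes these simplifying assumptions and also resolves trajectory, caliber and weapon type.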
(1) Background: Low back disorders are a leading cause of missed work and physical disability in manual material handling due to repetitive lumbar loading and overexertion. Ergonomic assessments are often performed to understand and mitigate the risk of musculoskeletal overexertion injuries. Wearable sensor solutions for monitoring low back loading have the potential to improve the quality, quantity, and efficiency of ergonomic assessments and to expand opportunities for the personalized, continuous monitoring of overexertion injury risk. However, existing wearable solutions using a single inertial measurement unit (IMU) are limited in how accurately they can estimate back loading when objects of varying mass are handled, and alternative solutions in the scientific literature require so many distributed sensors that they are impractical for widespread workplace implementation. We therefore explored new ways to accurately monitor low back loading using a small number of wearable sensors. (2) Methods: We synchronously collected data from laboratory instrumentation and wearable sensors to analyze 10 individuals each performing about 400 different material handling tasks. We explored dozens of candidate solutions that used IMUs on various body locations and/or pressure insoles. (3) Results: We found that the two key sensors for accurately monitoring low back loading are a trunk IMU and pressure insoles. Using signals from these two sensors together with a Gradient Boosted Decision Tree algorithm has the potential to provide a practical (relatively few sensors), accurate (up to R² = 0.89), and automated way (using wearables) to monitor time-series lumbar moments across a broad range of material handling tasks. The trunk IMU could be replaced by thigh IMUs, or a pelvis IMU, without sacrificing much accuracy, but there was no practical substitute for the pressure insoles.
The key to realizing accurate lumbar load estimates with this approach in the real world will be optimizing force estimates from pressure insoles. (4) Conclusions: Here, we present a promising wearable solution for the practical, automated, and accurate monitoring of low back loading during manual material handling.
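To illustrate the kind of pipeline the Results describe, here is a minimal sketch using scikit-learn's GradientBoostingRegressor on synthetic data, where a hypothetical lumbar moment is generated from trunk flexion angle (as from a trunk IMU) and left/right insole forces. The features, generative model, and noise levels are invented for illustration and are not the study's data or model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2000

# Hypothetical wearable features: trunk flexion angle (deg) from a
# trunk IMU, and left/right ground reaction forces (N) from insoles.
trunk_angle = rng.uniform(0, 90, n)
grf_left = rng.uniform(200, 900, n)
grf_right = rng.uniform(200, 900, n)
X = np.column_stack([trunk_angle, grf_left, grf_right])

# Synthetic "ground truth" lumbar moment (N*m): loosely, a lever arm
# that grows with flexion times the total ground reaction force,
# plus measurement noise.  Purely illustrative.
lever_arm = 0.05 + 0.002 * trunk_angle
y = lever_arm * (grf_left + grf_right) + rng.normal(0, 5, n)

# Fit on the first 1500 samples, evaluate on the held-out 500.
train, test = slice(0, 1500), slice(1500, None)
model = GradientBoostingRegressor(random_state=0).fit(X[train], y[train])
r2 = r2_score(y[test], model.predict(X[test]))
```

On real material handling data the mapping is far messier, which is why the study's best result (R² = 0.89) required the trunk IMU and insoles together; the sketch only shows the shape of the feature-to-moment regression.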
There are tremendous opportunities to advance science, clinical care, sports performance, and societal health if we are able to develop tools for monitoring musculoskeletal loading (e.g., forces on bones or muscles) outside the lab. While wearable sensors enable non-invasive monitoring of human movement in applied situations, current commercial wearables do not estimate tissue-level loading on structures inside the body. Here we explore the feasibility of using wearable sensors to estimate tibial bone force during running. First, we used lab-based data and musculoskeletal modeling to estimate tibial force for ten participants running across a range of speeds and slopes. Next, we converted lab-based data to signals feasibly measured with wearables (inertial measurement units on the foot and shank, and pressure-sensing insoles) and used these data to develop two multi-sensor algorithms for estimating peak tibial force: one physics-based and one machine learning. Additionally, to reflect current running wearables that utilize running impact metrics to infer musculoskeletal loading or injury risk, we estimated tibial force using a commonly measured impact metric, the ground reaction force vertical average loading rate (VALR). Using VALR to estimate peak tibial force resulted in a mean absolute percent error of 9.9%, which was no more accurate than a theoretical step counter that assumed the same peak force for every running stride. Our physics-based algorithm reduced error to 5.2%, and our machine learning algorithm reduced error to 2.6%. Further, to gain insights into how force estimation accuracy relates to overuse injury risk, we computed bone damage expected due to a given loading cycle. We found that modest errors in tibial force translated into large errors in bone damage estimates. For example, a 9.9% error in tibial force using VALR translated into 104% error in estimated bone damage.
Encouragingly, the physics-based and machine learning algorithms reduced damage errors to 41% and 18%, respectively. This study highlights the exciting potential to combine wearables, musculoskeletal biomechanics and machine learning to develop more accurate tools for monitoring musculoskeletal loading in applied situations.
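The amplification from force error to damage error follows from the power-law form commonly used in bone fatigue models, where damage per loading cycle scales roughly as force raised to a large exponent. The sketch below uses an assumed exponent of 7.5 chosen for illustration (exponents reported in the bone fatigue literature are typically in the 5-9 range); with that value it roughly reproduces the 9.9% to 104% amplification quoted above.

```python
def damage_error(force_error, exponent=7.5):
    """Relative error in estimated per-cycle bone damage given a
    relative error in peak tibial force, assuming damage scales as
    force**exponent (a power-law fatigue model; the exponent here
    is an assumed illustrative value, not from the study)."""
    return (1 + force_error) ** exponent - 1

# Small force errors amplify dramatically under a high exponent:
#   9.9% force error -> roughly doubled damage estimate
#   2.6% force error -> ~20% damage error
err_valr = damage_error(0.099)
err_ml = damage_error(0.026)
```

This is why the abstract's seemingly modest accuracy gains matter: reducing force error from 9.9% to 2.6% shrinks the damage error several-fold, because the error compounds through the exponent.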