2015
DOI: 10.1049/htl.2015.0017

Implementation study of wearable sensors for activity recognition systems

Abstract: This Letter investigates and reports on a number of activity recognition methods for a wearable sensor system. The authors apply three methods for data transmission, namely 'stream-based', 'feature-based' and 'threshold-based' scenarios, to study recognition accuracy against the energy efficiency of transmission and processing, which affects the mote's battery lifetime. They also report on the impact of varying the sampling frequency and data transmission rate on the energy consumption of the motes for each method. This study…
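The three transmission scenarios trade payload size against on-mote processing. A minimal Python sketch of how such policies might be structured is given below; the window size, threshold value, feature set and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

WINDOW = 50          # samples per window (assumed; depends on sampling frequency)
THRESHOLD = 1.2      # activity threshold in g (illustrative value)

def stream_based(window):
    """Transmit every raw sample: highest accuracy, highest radio cost."""
    return list(window)

def feature_based(window):
    """Compute features on the mote and transmit only the summary."""
    w = np.asarray(window)
    return [w.mean(), w.std(), w.min(), w.max()]   # 4 values instead of WINDOW

def threshold_based(window):
    """Transmit only when the signal suggests activity, otherwise stay silent."""
    w = np.asarray(window)
    return list(w) if np.abs(w).max() > THRESHOLD else []

# Example: one window of accelerometer magnitudes
window = np.random.normal(1.0, 0.3, WINDOW)
for policy in (stream_based, feature_based, threshold_based):
    payload = policy(window)
    print(f"{policy.__name__}: {len(payload)} values transmitted")
```

The payload counts illustrate the accuracy versus energy trade-off the Letter studies: streaming sends everything, the feature-based policy sends a fixed small summary, and the threshold-based policy sends data only when movement exceeds a preset level.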

Cited by 12 publications (5 citation statements); references 20 publications.
“…In contrast to both simple and recognition tests, in the choice tests, HRT is estimated from multiple stimuli and the reacting individual is expected to make a choice from among multiple responses. Several systems have been developed and reported to estimate the reaction time of drivers using virtual reality [8], computer-based measurement of cognitive functioning [9] and wearable systems [10][11][12]. The latter usually incorporate motion sensors mounted on the reacting individual.…”
mentioning
confidence: 99%
“…This application enables us to record egocentric video and sensor data simultaneously in a synchronized manner. The following types of sensor data are supported on the glass: accelerometer, gravity, gyroscope, linear acceleration, magnetic field and rotation vector (dataset: http://people.sutd.edu.sg/1000892/dataset). These sensors are integrated in the device and they can help capture accurate head motion of individuals when they are performing different activities.…”
Section: Multimodal Egocentric Activity Dataset (mentioning)
confidence: 99%
“…Research on automatic egocentric activity recognition has been focusing on using two broad categories of data: low-dimensional sensor data and high-dimensional visual data. Low-dimensional sensor data such as GPS, light, temperature, direction or accelerometer data has been found to be useful for activity recognition [1,2,3,4,5]. [2] proposes features for egocentric activity recognition computed from cell-phone accelerometer data.…”
Section: Introduction (mentioning)
confidence: 99%
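As an illustration of that line of work, the sketch below computes simple per-window statistics from a 3-axis accelerometer trace and trains an off-the-shelf classifier on them; the feature set, window length and the synthetic 'walking'/'sitting' data are assumptions made for this example, not the features proposed in [2].

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc, window=128):
    """Per-window mean, std and energy of a 3-axis accelerometer trace (illustrative features)."""
    feats = []
    for start in range(0, len(acc) - window + 1, window):
        w = acc[start:start + window]
        feats.append(np.concatenate([w.mean(0), w.std(0), (w ** 2).mean(0)]))
    return np.array(feats)

# Synthetic stand-in for labelled accelerometer recordings (real data would come from the phone)
rng = np.random.default_rng(0)
walk = rng.normal(0, 1.0, (1280, 3))      # "walking": high variance
sit = rng.normal(0, 0.1, (1280, 3))       # "sitting": low variance

X = np.vstack([window_features(walk), window_features(sit)])
y = np.array([1] * 10 + [0] * 10)         # 1 = walking, 0 = sitting

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.score(X, y))                    # training accuracy on the toy data
```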
“…Complementary to vision data, inertial sensor data (e.g., from gyroscopes and accelerometers) provide position and direction information of the wearable device, which may facilitate human activity recognition for egocentric videos. Recently, with the advancement and application of wearable inertial sensors, multi-modal methods, i.e., methods that combine vision data and sensor data to recognize human activities, have attracted widespread interest and may improve on vision-based methods [16], [17], [18]. Some pioneering work [17] uses an LSTM to learn features from sensor data and CNNs to learn features from vision data, which are fused together to predict the wearer's activity.…”
Section: Introduction (mentioning)
confidence: 99%
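The two-stream fusion described in [17] can be pictured with a short PyTorch-style sketch like the one below; the layer sizes, number of classes and the simple CNN backbone are illustrative assumptions rather than the architecture used in the cited work.

```python
import torch
import torch.nn as nn

class SensorVisionFusion(nn.Module):
    """Illustrative two-stream model: LSTM over inertial data, CNN over a frame,
    concatenated features fed to a classifier (a sketch, not the cited architecture)."""

    def __init__(self, sensor_dim=6, hidden=64, num_classes=20):
        super().__init__()
        self.lstm = nn.LSTM(sensor_dim, hidden, batch_first=True)
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(hidden + 32, num_classes)

    def forward(self, sensor_seq, frame):
        # sensor_seq: (batch, time, sensor_dim); frame: (batch, 3, H, W)
        _, (h, _) = self.lstm(sensor_seq)        # last hidden state as sensor feature
        sensor_feat = h[-1]                      # (batch, hidden)
        vision_feat = self.cnn(frame)            # (batch, 32)
        fused = torch.cat([sensor_feat, vision_feat], dim=1)
        return self.classifier(fused)

model = SensorVisionFusion()
logits = model(torch.randn(2, 100, 6), torch.randn(2, 3, 64, 64))
print(logits.shape)   # torch.Size([2, 20])
```

Concatenating the last LSTM hidden state with a pooled CNN feature is only one possible fusion point; the cited works may fuse the two modalities at different stages.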