2014
DOI: 10.1007/978-3-319-13817-6_11
Unsupervised Feature Learning for Human Activity Recognition Using Smartphone Sensors

Cited by 66 publications (37 citation statements)
References 5 publications
“…Reference | Sensor Modality | Deep Model | Application | Dataset
(Almaslukh et al., 2017) | Body-worn | SAE | ADL | D03
(Alsheikh et al., 2016) | Body-worn | RBM | ADL, factory, Parkinson | D02, D06, D14
(n/a) | Body-worn, ambient | RBM | Gesture, ADL, transportation | Self, D01
(Chen and Xue, 2015) | Body-worn | CNN | ADL | Self
(Chen et al., 2016b) | Body-worn | CNN | ADL | D06
(Cheng and Scotland, 2017) | Body-worn | DNN | Parkinson | Self
(Edel and Köppe, 2016) | Body-worn | RNN | ADL | D01, D04, Self
(Fang and Hu, 2014) | Object, ambient | DBN | ADL | Self
(Gjoreski et al., 2016) | Body-worn | CNN | ADL | Self, D01
(Guan and Ploetz, 2017) | Body-worn, object, ambient | RNN | ADL, smart home | D01, D02, D04
(Ha et al., 2015) | Body-worn | CNN | Factory, health | D02, D13
(Ha and Choi, 2016) | Body-worn | CNN | ADL, health | D13
(Hammerla et al., 2015) | Body-worn | RBM | Parkinson | Self
(Hammerla et al., 2016) | Body-worn, object, ambient | DNN, CNN, RNN | ADL, smart home, gait | D01, D04, D14
(Hannink et al., 2017) | Body-worn | CNN | Gait | Self
(Hayashi et al., 2015) | Body-worn, ambient | RBM | ADL, smart home | D16
(Inoue et al., 2016) | Body-worn | RNN | ADL | D16
(Jiang and Yin, 2015) | Body-worn | CNN | ADL | D03, D05, D11
(Khan et al., 2017) | Ambient | CNN | Respiration | Self
(Kim and Toomajian, 2016) | Ambient | CNN | Hand gesture | Self
(Kim and Li, 2017) | Body-worn | CNN | ADL | Self
(n/a) | Body-worn, ambient | RBM | ADL, emotion | Self
(n/a) | Ambient | RBM | ADL | Self
(Lee et al., 2017) | Body-worn | CNN | ADL | Self
(Li et al., 2016a) | Object | RBM | Patient resuscitation | Self
(Li et al., 2016b) | Object | CNN | Patient resuscitation | Self
(Li et al., 2014) | Body-worn | SAE | ADL | D03
(n/a) | Body-worn | CNN, RBM | ADL | Self
(Mohammed and Tashev, 2017) | Body-worn | CNN | ADL, gesture | Self
(Morales and Roggen, 2016) | Body-worn | CNN | ADL, smart home | D01, D02
(Murad and Pyun, 2017) | Body-worn | RNN | ADL, smart home | D01, D02, D05, D14
(Ordóñez and Roggen, 2016) | Body-worn | CNN, RNN | ADL, gesture, posture, factory | D01, D02
(Panwar et al., 2017) | Body-worn | CNN | ADL | Self
(Plötz et al., 2011) | Body-worn, object | RBM | ADL, food preparation, factory | D01, D02, D08, D14…”
Section: Literature
Citation type: mentioning
confidence: 99%
“…In terms of transfer learning, our approach also differs significantly from some earlier attempts [44, 69] that were concerned with the transferability of features from a fully-supervised model learned from inertial measurement unit data, as our approach utilizes widely available smartphones and does not require labeled data. Finally, the proposed technique also differs from previously studied unsupervised pre-training methods such as autoencoders [37], restricted Boltzmann machines [55] and sparse coding [9], as we employ an end-to-end (self-)supervised learning paradigm on multiple surrogate tasks to extract features.…”
Section: Determining Representational Similarity
Citation type: mentioning
confidence: 99%
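The distinction drawn in the statement above, unsupervised pre-training versus self-supervision on surrogate tasks, can be made concrete with a small sketch. The Python snippet below is a minimal illustration, not the cited paper's actual pipeline: all function names, transformation choices, and shapes are assumptions. It builds a surrogate task from unlabeled accelerometer windows by applying a known transformation to each window, so that a classifier trained to identify which transformation was applied learns motion features without any activity labels.

```python
# Minimal sketch of a self-supervised surrogate task on accelerometer
# data: each unlabeled window is perturbed by a known transformation,
# and the pretext label is WHICH transformation was applied.
# All names, transformations, and shapes here are illustrative.
import numpy as np

def jitter(x, sigma=0.05):
    """Add Gaussian noise to a (T, 3) accelerometer window."""
    return x + np.random.normal(0.0, sigma, x.shape)

def scale(x, sigma=0.1):
    """Multiply each axis by a random factor close to 1."""
    return x * np.random.normal(1.0, sigma, (1, x.shape[1]))

def negate(x):
    """Flip the sign of the signal."""
    return -x

def time_flip(x):
    """Reverse the window along the time axis."""
    return x[::-1].copy()

TRANSFORMS = [jitter, scale, negate, time_flip]

def make_surrogate_dataset(windows):
    """Turn unlabeled windows into (input, transformation-id) pairs.

    windows: array of shape (N, T, 3), N unlabeled accelerometer windows.
    Returns inputs of shape (N * len(TRANSFORMS), T, 3) and integer
    labels in [0, len(TRANSFORMS)).  No human annotation is needed.
    """
    xs, ys = [], []
    for w in windows:
        for label, transform in enumerate(TRANSFORMS):
            xs.append(transform(w))
            ys.append(label)
    return np.stack(xs), np.array(ys)

# Example: 100 unlabeled 2-second windows sampled at 64 Hz.
unlabeled = np.random.randn(100, 128, 3)
x_train, y_train = make_surrogate_dataset(unlabeled)
print(x_train.shape, y_train.shape)  # (400, 128, 3) (400,)
```

A network trained to predict these pretext labels can then be reused as a feature extractor for the downstream activity-recognition task, which is the transfer step the quoted passage contrasts with autoencoder-, RBM-, and sparse-coding-style pre-training.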
“…Different positions of the mobile phone will cause the directions of the axes to change. In [32] and [33], the authors both eliminate this possible rotational interference by synthesizing the three-axis acceleration into a single orientation-independent signal rather than relying on the direction of each axis, which partly takes the relations between the three axes into account. By converting the time-series signal of the sensor into an activity image containing the hidden relations between axes, the recognition accuracy of the model is markedly improved [14].…”
Section: Related Work
Citation type: mentioning
confidence: 99%
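The orientation-independence idea attributed to [32] and [33] above can be illustrated with a short sketch. The Python snippet below is a minimal illustration under the assumption that "synthesizing the acceleration" means taking the per-sample acceleration magnitude; the function names and example data are made up. It shows that the magnitude signal is unchanged when the device is rotated, even though the individual axis readings are not.

```python
# Sketch of the orientation-independence idea: instead of raw x/y/z
# readings, which change when the phone rotates in the pocket, use the
# acceleration magnitude, which is invariant to device orientation.
# Names and example data are illustrative assumptions.
import numpy as np

def acceleration_magnitude(acc):
    """acc: array of shape (T, 3) with x, y, z accelerometer readings.

    Returns the (T,) signal sqrt(x^2 + y^2 + z^2), which is unchanged
    by any rotation of the device and discards per-axis direction.
    """
    return np.linalg.norm(acc, axis=1)

# The same physical motion recorded in two device orientations yields
# the same magnitude signal:
rng = np.random.default_rng(0)
acc = rng.normal(size=(128, 3))

# A 90-degree rotation about the z-axis (phone turned in the pocket).
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
rotated = acc @ R.T  # per-axis readings now differ from `acc`

print(np.allclose(acceleration_magnitude(acc),
                  acceleration_magnitude(rotated)))  # True
```

The trade-off the quoted passage points at is that the magnitude keeps only part of the inter-axis information, which is why [14] instead encodes the full multi-axis signal as an activity image before classification.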