2016
DOI: 10.3390/info7040072

PACP: A Position-Independent Activity Recognition Method Using Smartphone Sensors

Abstract: Human activity recognition has been a hot topic in recent years. With advances in sensor technology, there has been growing interest in using smartphones equipped with a set of built-in sensors to solve activity recognition tasks. However, in most previous studies, smartphones were used in a fixed position, such as a trouser pocket, during recognition, which limits user behavior. In position-independent cases, the recognition accuracy is not very satisfactory. In this paper, we studied hu…

Cited by 45 publications (17 citation statements). References 29 publications (36 reference statements).
“…Examination of the proposed SLR approach on 107 people and 31 h of recorded data. The number of different people who participated in this dataset is 6 times more than in any other dataset used in SLR (17 people in [19]), and the recording time is 12 times more than in any other dataset used in SLR (about 160 min in [22]). This dataset was partly generated for this research and partly uses other publicly available datasets created for other applications but suitable for the SLR task.…”
mentioning
confidence: 99%
“…Position-specific activity classifiers refer to training a classifier for each position and selecting the activity classifier according to the position information obtained in the first stage [1,5,6]. By contrast, one generalized activity classifier for final HAR takes as input the features produced by an adjustment technique [7]. The adjustment technique narrows the feature difference among smartphone positions.…”
Section: Position Recognition and Position-Aware Human Activity Recognition
mentioning
confidence: 99%
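The statement above contrasts two designs for position-aware HAR: a two-stage pipeline that first recognizes the smartphone's on-body position and then dispatches to a position-specific activity classifier, versus a single generalized classifier fed position-adjusted features. The sketch below is only an illustration of that contrast on made-up data; the classifier choice, the feature-adjustment step (a simple per-position mean shift), and all names are hypothetical rather than taken from the cited works.

```python
# Illustrative sketch (not the cited papers' code) of two position-aware HAR designs:
# (a) a position recognizer followed by position-specific activity classifiers, and
# (b) one generalized activity classifier trained on position-adjusted features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
POSITIONS = ["trouser_pocket", "coat_pocket", "bag"]   # hypothetical positions

# Hypothetical training data: feature vectors, activity labels, position labels.
X = rng.normal(size=(600, 12))
y_act = rng.integers(0, 4, size=600)                   # e.g. walk / run / sit / stand
y_pos = rng.integers(0, len(POSITIONS), size=600)

# (a) Two-stage design: stage 1 recognizes the position, stage 2 uses a per-position model.
pos_clf = RandomForestClassifier(n_estimators=50).fit(X, y_pos)
act_clfs = {
    p: RandomForestClassifier(n_estimators=50).fit(X[y_pos == p], y_act[y_pos == p])
    for p in range(len(POSITIONS))
}

def predict_two_stage(x):
    p = pos_clf.predict(x.reshape(1, -1))[0]           # stage 1: which position?
    return act_clfs[p].predict(x.reshape(1, -1))[0]    # stage 2: position-specific model

# (b) Generalized design: adjust features toward a common reference (a per-position
# mean shift here, standing in for whatever adjustment technique a paper uses),
# then train a single activity classifier for all positions.
pos_means = np.stack([X[y_pos == p].mean(axis=0) for p in range(len(POSITIONS))])
X_adj = X - pos_means[y_pos]
gen_clf = RandomForestClassifier(n_estimators=50).fit(X_adj, y_act)

print(predict_two_stage(X[0]), gen_clf.predict(X_adj[:1])[0])
```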
“…Some related works studied position recognition methods while smartphone users are walking [2][3][4], but without discussing other periodic human activities. The position-aware HAR model is designed to leverage smartphone on-body position information as prior knowledge to improve HAR [1,5-7].
Section: Introduction
mentioning
confidence: 99%
“…Most of these works use wearable accelerometers or the accelerometer sensors of smartphones. It is evident that work has been done in different directions: not only detecting detailed daily-life activities and falls [7,19], but also online activity recognition [20], publishing benchmark datasets [21], and analyzing different usage behaviour as in [18,22]. (Smartphone placed in pocket, accelerometer: achieved 90% accuracy using J48, logistic regression, and a multilayer perceptron.)…”
Section: Related Work
mentioning
confidence: 99%
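To make the parenthetical table entry above concrete, here is a minimal sketch of the kind of pipeline it summarizes: simple time-domain features computed over fixed windows of tri-axial accelerometer data, fed to a logistic regression, a multilayer perceptron, and a decision tree (standing in for Weka's J48). The data, window length, and feature set are hypothetical and not taken from the cited work.

```python
# Minimal, hypothetical sketch of windowed accelerometer features plus classic classifiers.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

def window_features(acc, win=128):
    """Per-axis mean/std plus magnitude mean/std over fixed-size windows."""
    feats = []
    for start in range(0, len(acc) - win + 1, win):
        w = acc[start:start + win]                   # (win, 3) x/y/z samples
        mag = np.linalg.norm(w, axis=1)              # acceleration magnitude
        feats.append(np.concatenate([w.mean(0), w.std(0), [mag.mean(), mag.std()]]))
    return np.array(feats)

rng = np.random.default_rng(1)
acc = rng.normal(size=(128 * 50, 3))                 # fake tri-axial accelerometer stream
X = window_features(acc)
y = rng.integers(0, 3, size=len(X))                  # fake activity label per window

for clf in (LogisticRegression(max_iter=1000),
            MLPClassifier(max_iter=500),
            DecisionTreeClassifier()):
    print(type(clf).__name__, clf.fit(X, y).score(X, y))
```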
“…In [17], several sensors are used for activity recognition in order to make the recognition system device-independent. A recent work in [18] addresses different usage behavior, such as smartphones kept in a coat pocket or bag. However, no work could be found that enables detailed activity recognition even when training data is collected using one device at one position (say, a right trouser pocket) and activity is recognized for test data collected from a different device kept at the same or a different position (say, a shirt pocket).…”
Section: Introduction
mentioning
confidence: 99%