2015
DOI: 10.3390/mi6081100

Activity Recognition Using Fusion of Low-Cost Sensors on a Smartphone for Mobile Navigation Application

Abstract: Low-cost inertial and motion sensors embedded in smartphones have provided a new platform for dynamic activity pattern inference. In this research, a comparison was conducted across different sensor data, feature spaces and feature selection methods to increase the efficiency and reduce the computational cost of activity recognition on smartphones. We evaluated a variety of feature spaces and a number of classification algorithms from the area of Machine Learning, including Naive Bayes, Decision Trees, Arti…
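As a rough illustration of the kind of classifier comparison the abstract describes, the sketch below evaluates the two explicitly named algorithms (Naive Bayes and Decision Trees) on simple time-domain features computed from accelerometer windows. The synthetic data, feature set, window length and scikit-learn pipeline are assumptions for illustration only and are not taken from the paper.

```python
# Minimal sketch (not the paper's actual pipeline): comparing the two
# classifiers named in the abstract on simple time-domain features
# extracted from accelerometer windows. Data, features and parameters
# here are illustrative placeholders.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def window_features(acc_window):
    """Time-domain features for one (N x 3) accelerometer window."""
    mag = np.linalg.norm(acc_window, axis=1)   # acceleration magnitude
    return [mag.mean(), mag.std(), mag.min(), mag.max()]

def synth_windows(n, noise_std):
    """Fake windows: larger noise_std mimics a more dynamic activity."""
    return [rng.normal(9.81, noise_std, size=(128, 3)) for _ in range(n)]

# Label 1 ~ "dynamic" (e.g. walking), label 0 ~ "static" (e.g. standing).
X = np.array([window_features(w)
              for w in synth_windows(100, 2.0) + synth_windows(100, 0.1)])
y = np.array([1] * 100 + [0] * 100)

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Decision Tree", DecisionTreeClassifier(max_depth=4))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: accuracy {scores.mean():.2f} +/- {scores.std():.2f}")
```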

Cited by 40 publications (40 citation statements)
References 63 publications
“…In this work, we only investigate the four most common device carrying positions [17][18][19][20], namely, held in hand (hand-held), against the ear during a phone call (phone-call), placed in a trouser pocket (in-pocket), and swinging in hand (swinging-hand). For pedestrian activities, we only consider two situations, namely, normal walking and standing, which may be easily recognized by their different acceleration patterns [14,15]. For simplicity, we assume that the two activities are already accurately recognized.…”
Section: System Overview (mentioning)
confidence: 99%
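This citation relies on walking and standing being separable by their acceleration patterns. A minimal, hedged sketch of that idea is a variance threshold on the accelerometer magnitude over a short window; the threshold and window length below are assumptions, not values from the cited papers.

```python
# Illustrative only: separate "walking" from "standing" by the variance
# of the accelerometer magnitude within one window. The threshold is a
# placeholder and would need tuning per device and carrying position.
import numpy as np

def classify_window(acc_xyz, var_threshold=0.5):
    """acc_xyz: (N, 3) array of accelerometer samples in m/s^2."""
    mag = np.linalg.norm(acc_xyz, axis=1)
    return "walking" if mag.var() > var_threshold else "standing"

# Example: a noisy (dynamic) window vs. a near-constant (static) one.
rng = np.random.default_rng(0)
print(classify_window(rng.normal(9.81, 2.0, size=(128, 3))))   # walking
print(classify_window(rng.normal(9.81, 0.05, size=(128, 3))))  # standing
```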
“…Finally, the estimation results are further smoothed by filtering outliers caused by undetected hand movements or by hand movements misclassified as user turns. For simplicity, in this work we only consider the two main pedestrian activities, namely, walking and standing, and assume that they have already been accurately recognized according to their different acceleration patterns [14,15].…”
Section: Introduction (mentioning)
confidence: 99%
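The smoothing step mentioned in this citation is described only qualitatively; a common way to filter isolated misclassifications is a sliding-window majority vote over the per-window labels. The window size and label names below are assumptions for illustration.

```python
# Hedged sketch of label smoothing: a sliding-window majority vote that
# suppresses isolated outlier labels (e.g. a single spurious "turn").
from collections import Counter

def smooth_labels(labels, window=5):
    half = window // 2
    smoothed = []
    for i in range(len(labels)):
        neighborhood = labels[max(0, i - half): i + half + 1]
        smoothed.append(Counter(neighborhood).most_common(1)[0][0])
    return smoothed

raw = ["walk", "walk", "turn", "walk", "walk", "walk",
       "stand", "stand", "stand"]
print(smooth_labels(raw))
# The isolated "turn" is replaced by the locally dominant "walk" label.
```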
“…Johnson and Trivedi (2011) have used accelerometers to determine driving style. In general, such low-cost sensors are suitable for a wide variety of activity recognition applications, as shown by Saeedi and El-Sheimy (2015).…”
Section: Sensing Acceleration (mentioning)
confidence: 99%
“…However, there is still a gap between raw data collected from mobile sensors and suitable context information. The context-aware algorithm uses semantic modelling and machine learning techniques to automatically recognize contexts from heterogeneous and noisy sensor data [3]. The current challenge in recommendation systems is to design a pervasive, real-time service that adapts to the user's modes and context.…”
Section: Introduction (mentioning)
confidence: 99%
“…The appropriate set of sensors and features is carefully selected to perform real-time and accurate activity recognition. The details of the mobile application and the performance analysis of different classification techniques are already stated in previous publications [2][3][4]. The restaurant of interest can be chosen by the user from the recommended list, followed by an illustration of the optimal route and travel mode via launching the Google Maps app.…”
Section: Introduction (mentioning)
confidence: 99%