2020
DOI: 10.1109/access.2020.3003162
Harmonic Loss Function for Sensor-Based Human Activity Recognition Based on LSTM Recurrent Neural Networks

Abstract: Human activity recognition (HAR) has been a very popular field in both real practice and theoretical research. Over the years, a number of many-to-one Long Short-Term Memory (LSTM) models have been proposed for the sensor-based HAR problem. However, how to utilize their sequence outputs to improve HAR performance has not been studied seriously. To solve this problem, we present a novel loss function named harmonic loss, which is utilized to improve the overall classification performance of HAR based on b…

Cited by 22 publications (12 citation statements) | References 46 publications
“…In an effort to address these challenges, long short-term memory (LSTM)-based RNNs [195] and Gated Recurrent Units (GRUs) [196] are introduced to model temporal sequences and their broad dependencies. The GRU introduces a reset and an update gate to control the flow of inputs to a cell [197][198][199][200][201]. The LSTM has been shown capable of memorizing and modelling the long-term dependency in data.…”
Section: Recurrent Neural Network (RNN)
confidence: 99%
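The reset/update gating described in the statement above can be sketched as a minimal GRU cell. This is an illustrative NumPy sketch of the standard GRU equations, not code from any of the cited works; all weight names are assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: the update gate z blends the previous state with a
    candidate state, and the reset gate r controls how much of the past
    state feeds into that candidate."""
    z = sigmoid(Wz @ x + Uz @ h_prev)                # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)                # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))    # candidate state
    return (1 - z) * h_prev + z * h_tilde            # new hidden state
```

Because the gates are sigmoids, the new state is always a convex combination of the old state and the bounded candidate, which is what lets the cell retain or overwrite information selectively.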
“…Residual connections do not impede gradients and can help to refine the output of layers. For example, [200] proposes a harmonic loss function and [207] combines LSTM with batch normalization to achieve 92% accuracy with raw accelerometer and gyroscope data. Ref.…”
Section: Recurrent Neural Network (RNN)
confidence: 99%
“…RNNs have been widely used in language modeling, object detection, and speech recognition and have received increased attention in biomechanics. RNNs are often used to construct sequence-to-sequence models and have been applied to HAR tasks because of RNNs' competence in temporal domain processing [24,[33][34][35][36]. RNNs can transition between hidden states of previous time steps to subsequent time steps, processing and passing down time-dependent sequences in chronological order.…”
Section: Machine Learning Algorithm
confidence: 99%
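The chronological hidden-state transition described above can be sketched with a vanilla RNN unrolled over time. This is a generic illustrative sketch, not taken from the cited papers:

```python
import numpy as np

def rnn_forward(xs, Wx, Wh, b, h0):
    """Unroll a vanilla RNN: each hidden state is computed from the
    current input and the previous hidden state, so time-dependent
    information is passed down step by step in chronological order."""
    hs, h = [], h0
    for x in xs:                          # iterate over time steps
        h = np.tanh(Wx @ x + Wh @ h + b)  # state depends on x_t and h_{t-1}
        hs.append(h)
    return np.stack(hs)                   # all hidden states, shape (T, d)
```

A many-to-one model would keep only the final row of the returned sequence, whereas the approach discussed in this paper makes use of the full sequence of outputs.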
“…To enhance the recognition process, feature-level and late fusions are also investigated using two datasets. In [31], a novel loss function called harmonic loss, which is based on the label replication method to replicate true labels at each sequence step of LSTM models, is presented to improve the overall classification performance of sensor-based HAR. This loss function not only takes all local sequence errors into account but also considers the relative importance of different local errors in the training.…”
Section: Related Work
confidence: 99%
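The label-replication idea summarized above can be sketched as a weighted per-step loss: the true label is replicated at every LSTM output step, a local cross-entropy is computed at each step, and the local errors are combined with step-dependent weights. The linearly increasing weighting below is purely illustrative; the paper defines its own harmonic weighting of local errors, which is not reproduced here:

```python
import numpy as np

def weighted_replication_loss(step_logits, true_label, weights=None):
    """Cross-entropy at every output step against the replicated true
    label, combined with normalized per-step weights (illustrative
    linear weighting, growing toward the final step)."""
    T, C = step_logits.shape
    if weights is None:
        weights = np.arange(1, T + 1, dtype=float)  # later steps matter more
    weights = weights / weights.sum()
    # numerically stabilized softmax per step
    z = step_logits - step_logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    local_ce = -np.log(probs[np.arange(T), true_label] + 1e-12)
    return float(np.sum(weights * local_ce))
```

With uniform logits at every step, the loss reduces to log(C) regardless of the weighting, since each local error is identical; the weighting only matters when some steps are classified better than others.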