2023
DOI: 10.1088/1361-6579/ad0ab8

MAG-Res2Net: a novel deep learning network for human activity recognition

Hanyu Liu,
Boyang Zhao,
Chubo Dai
et al.

Abstract: Objective. Human activity recognition (HAR) has become increasingly important in the healthcare, sports, and fitness domains due to its wide range of applications. However, existing deep learning-based HAR methods often overlook the challenges posed by the diversity of human activities and by data quality, which can make feature extraction difficult. To address these issues, we propose a new neural network model called MAG-Res2Net, which incorporates the Borderline-SMOTE data upsampling algorithm, a loss function com…
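The abstract names Borderline-SMOTE for rebalancing the activity classes before training. As a rough illustration of that step, the sketch below applies imbalanced-learn's BorderlineSMOTE to flattened sensor windows; the window shape, labels, and flatten/reshape handling are illustrative assumptions, not the paper's actual pipeline.

```python
# A minimal sketch of class rebalancing with Borderline-SMOTE, as named in
# the abstract. Window shape, labels, and the flatten/reshape step are
# illustrative assumptions, not the authors' actual pipeline.
import numpy as np
from imblearn.over_sampling import BorderlineSMOTE

# Hypothetical HAR windows: (n_windows, timesteps, channels). SMOTE-family
# samplers operate on 2-D feature matrices, so the windows are flattened.
X = np.random.randn(500, 128, 9)
y = np.random.choice([0, 0, 0, 1], size=500)  # deliberately imbalanced labels

X_flat = X.reshape(len(X), -1)
sampler = BorderlineSMOTE(kind="borderline-1", random_state=0)
X_res, y_res = sampler.fit_resample(X_flat, y)

# Restore the window shape for the downstream network.
X_res = X_res.reshape(-1, 128, 9)
print(X_res.shape, np.bincount(y_res))
```

Unlike plain SMOTE, the borderline variant synthesizes new minority samples only near the class boundary, which is where HAR classes such as similar gaits tend to be confused.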

Cited by 3 publications (6 citation statements)
References 45 publications (43 reference statements)
“…Recently, modified and extended ResNet architectures aimed specifically at human activity recognition have been proposed, including MAG-Res2Net [5] and an architecture [6] based on ResNeXt [36]. Whereas these architectures have shown promising results on the HAR data sets on which they were tested, the architecture proposed by Mekruksavanich et al [6] does not compare ResNeXt with ResNet, and neither publication was available at the time of our study, leading us to use the regular ResNet architecture in this work.…”
Section: Residual Network (ResNet)
Mentioning confidence: 99%
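For context on the fallback the citing authors describe, here is a minimal 1-D residual block of the kind a "regular ResNet" for sensor data would stack, in PyTorch; the channel counts, kernel size, and 1-D setting are illustrative assumptions, not any cited paper's exact configuration.

```python
# A minimal 1-D residual block sketch (PyTorch) illustrating the "regular
# ResNet" structure; all sizes here are illustrative assumptions.
import torch
import torch.nn as nn

class BasicBlock1d(nn.Module):
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv1d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm1d(out_ch)
        self.conv2 = nn.Conv1d(out_ch, out_ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm1d(out_ch)
        # Projection shortcut when the shape changes, identity otherwise.
        self.shortcut = (
            nn.Sequential(nn.Conv1d(in_ch, out_ch, 1, stride=stride, bias=False),
                          nn.BatchNorm1d(out_ch))
            if stride != 1 or in_ch != out_ch else nn.Identity()
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + self.shortcut(x))

# e.g. a (batch, channels, time) sensor tensor:
print(BasicBlock1d(9, 64, stride=2)(torch.randn(4, 9, 128)).shape)
```

Res2Net and ResNeXt both vary the inner branch of such a block (multi-scale splits and grouped convolutions, respectively) while keeping this residual skeleton.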
“…Although most commonly used for visual tasks, convolutional neural networks (CNNs) have been shown to produce competitive results on time series classification tasks [3], including human activity recognition (HAR) [4][5][6]. In the context of fitness activity recognition, they have been successfully applied to various activities, such as swing sports [7][8][9], skiing [10,11], beach volleyball [12], football [13], and exercising [14].…”
Section: Introduction
Mentioning confidence: 99%
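To make the quoted claim concrete, a small 1-D CNN classifier for windowed inertial signals might look like the following sketch; every hyperparameter (channels, kernel sizes, number of classes) is an illustrative assumption.

```python
# A compact 1-D CNN classifier sketch for windowed inertial data, of the kind
# the quoted passage refers to; depths and widths are illustrative assumptions.
import torch
import torch.nn as nn

class SimpleHARCNN(nn.Module):
    def __init__(self, in_ch=9, n_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_ch, 64, 5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(64, 128, 5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.AdaptiveAvgPool1d(1),           # global pooling over time
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x):                      # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

logits = SimpleHARCNN()(torch.randn(8, 9, 128))
print(logits.shape)  # (8, 6)
```

Global average pooling over time makes the classifier head independent of window length, which is convenient when different activities are segmented into windows of different sizes.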
“…In the time domain, current research primarily focuses on sensor signal-based HAR using DL techniques. These techniques include CNNs [20, 23, 26, 29, 40], variants of RNNs such as LSTM and GRU, and hybrid DL methods [19, 27, 28, 41, 42, 43]. Authors such as Yan et al [40], Cheng et al [26], and Wang et al [23] have introduced supporting techniques like the attention layer and convolution layers with various kernel sizes to enhance CNNs for HAR, thereby modifying the original CNN model architectures.…”
Section: Related Work
Mentioning confidence: 99%
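The passage groups CNNs, RNN variants (LSTM/GRU), and hybrids together; a minimal hybrid of the CNN-plus-GRU kind it alludes to could be sketched as follows, again with purely illustrative layer sizes.

```python
# A minimal hybrid CNN + GRU sketch of the kind the quoted passage groups
# under "hybrid DL methods"; layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class CNNGRU(nn.Module):
    def __init__(self, in_ch=9, n_classes=6):
        super().__init__()
        # Convolution extracts local motion features per time step...
        self.conv = nn.Sequential(
            nn.Conv1d(in_ch, 64, 5, padding=2), nn.ReLU(), nn.MaxPool1d(2))
        # ...and a GRU models longer-range temporal structure on top.
        self.gru = nn.GRU(64, 64, batch_first=True)
        self.fc = nn.Linear(64, n_classes)

    def forward(self, x):                      # x: (batch, channels, time)
        h = self.conv(x).transpose(1, 2)       # -> (batch, time, features)
        _, h_n = self.gru(h)
        return self.fc(h_n[-1])

print(CNNGRU()(torch.randn(8, 9, 128)).shape)  # (8, 6)
```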
“…Authors such as Yan et al [40], Cheng et al [26], and Wang et al [23] have introduced supporting techniques like the attention layer and convolution layers with various kernel sizes to enhance CNNs for HAR, thereby modifying the original CNN model architectures. Liu et al [29] proposed MAG-Res2Net, which explored DL architectures building on ResNet [30] and added a gated module to improve performance on multimodal HAR, evaluating it on three public datasets of varying complexity: UCI-HAR [31], WISDM [32], and CSL-SHARE [33]. MAG-Res2Net operates in the time domain, utilizing raw signals directly without signal transformation via wavelets.…”
Section: Related Work
Mentioning confidence: 99%
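The quote attributes MAG-Res2Net's gain to a gated module added to a ResNet-style backbone but does not specify the module itself. The sketch below therefore shows one generic way to gate a 1-D residual branch with a channel-wise sigmoid gate; it is a hedged illustration under that assumption, not the authors' actual design.

```python
# A hedged sketch of a gated residual branch: the quote says a "gated module"
# was added to the backbone, so this shows one generic way to gate a residual
# path. It is an assumption for illustration, not the authors' design.
import torch
import torch.nn as nn

class GatedResidual1d(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv1d(ch, ch, 3, padding=1, bias=False),
            nn.BatchNorm1d(ch), nn.ReLU(inplace=True),
            nn.Conv1d(ch, ch, 3, padding=1, bias=False),
            nn.BatchNorm1d(ch))
        # Sigmoid gate decides, per channel, how much of the branch to pass.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool1d(1), nn.Conv1d(ch, ch, 1), nn.Sigmoid())

    def forward(self, x):
        out = self.body(x)
        return torch.relu(x + self.gate(out) * out)

print(GatedResidual1d(64)(torch.randn(4, 64, 128)).shape)
```

Channel-wise gating of this squeeze-style form lets the block suppress sensor channels that carry little information for the current activity window.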