2022
DOI: 10.7717/peerj-cs.1052
Effects of sliding window variation in the performance of acceleration-based human activity recognition using deep learning models

Abstract: Deep learning (DL) models are very useful for human activity recognition (HAR); among other advantages, they achieve better accuracy for HAR than traditional methods. DL learns from unlabeled data and extracts features from raw data, as in the case of time-series acceleration. The sliding window is a feature-extraction technique: when used to preprocess time-series data, it improves accuracy, latency, and processing cost. The time and cost of preprocessing can be beneficial …
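As a concrete illustration of the sliding-window preprocessing the abstract describes, here is a minimal sketch in Python/NumPy. The function name, the 50 Hz sampling rate, and the window/stride values are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def sliding_windows(signal, window_size, stride):
    """Split a 1-D time series into fixed-length windows.

    stride == window_size gives non-overlapping windows;
    stride < window_size gives overlapping ones.
    (Parameter choices here are hypothetical.)
    """
    n = (len(signal) - window_size) // stride + 1
    return np.stack(
        [signal[i * stride : i * stride + window_size] for i in range(n)]
    )

# Example: 1 s of 50 Hz accelerometer samples split into two 0.5 s windows.
acc = np.arange(50, dtype=float)
windows = sliding_windows(acc, window_size=25, stride=25)
print(windows.shape)  # (2, 25)
```

Each window would then be fed to the DL model as one training sample, so the window size directly controls how much temporal context the model sees.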

Cited by 16 publications (14 citation statements). References 44 publications (50 reference statements).
“…This may be attributed to the fact that the LSTM model, a nonlinear model that mimics the learning and reasoning process of the human brain, can train on and predict the data directly, compensating for the information loss caused by the difference method [17]. Furthermore, the sliding window method was used to form a series of overlapping samples, which effectively exploits the temporal characteristics of the sequence and reduces the time variability caused by abnormal data, consistent with the findings of other scholars [18,19]. The influenza sequence in Shanxi Province was affected by COVID-19 and had complex temporal characteristics, so the prediction accuracy of a single model was poor.…”
Section: Discussion (supporting)
confidence: 81%
“…This approach enabled the inference of smaller subsegments of fascicles of arbitrary lengths. Although the sliding window is a very popular technique for training CNNs [42, 43], the novelty of this research is combining the approach with multiple LSTM units for fascicle-length extraction. While the proposed model was trained on fixed-length sequences, no a priori information about input length was provided to the model, suggesting that it would successfully handle varying-length inputs and maintain good performance across participants and tasks.…”
Section: Discussion (mentioning)
confidence: 99%
“…Utilizing the overlapping sliding-window technique to preprocess low-dimensional time-series data results in reduced latency, improved accuracy, and lower processing costs, particularly when a smaller window size is employed [54]. To improve accuracy, reduce feature-extraction time, and provide more comprehensive user data for the model to learn from, each user signal in this study was split into small segments using the overlapping sliding-window technique.…”
Section: Preprocessing (mentioning)
confidence: 99%
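The overlapping segmentation described in the snippet above can be sketched as follows. The function `overlapping_windows`, its parameters, and the 50% overlap value are illustrative assumptions for this sketch, not settings reported by the cited study.

```python
import numpy as np

def overlapping_windows(signal, window_size, overlap=0.5):
    """Segment a 1-D series into overlapping fixed-length windows.

    overlap is the fraction of samples shared by consecutive windows;
    overlap=0.5 means each window starts halfway through the previous one.
    (Parameter choices here are hypothetical.)
    """
    stride = max(1, int(window_size * (1 - overlap)))
    starts = range(0, len(signal) - window_size + 1, stride)
    return np.stack([signal[s : s + window_size] for s in starts])

# Example: 100 samples, 20-sample windows with 50% overlap (stride = 10).
sig = np.arange(100, dtype=float)
w = overlapping_windows(sig, window_size=20, overlap=0.5)
print(w.shape)  # (9, 20)
```

Compared with non-overlapping windows, overlap multiplies the number of training samples drawn from the same recording, which is one reason the quoted papers report it helping models learn the temporal structure of the signal.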