2022
DOI: 10.48129/kjs.splml.19189
Performance evaluation of machine learning based voting classifier system for human activity recognition

Abstract: In the last few decades, Human Activity Recognition (HAR) has been a centre of attention in many research domains; it refers to the ability to interpret human body gestures through sensors and ascertain the activity a person is performing. The present work proposes a voting classifier system for human activity recognition. For the voting classifier system, five machine learning classifiers are considered: Logistic Regression (LR), K-Nearest Neighbour (KNN), Random Forest (RF), Naive Bayes …
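The voting-classifier idea described in the abstract can be sketched with scikit-learn's `VotingClassifier`. This is a minimal illustration, not the paper's implementation: the abstract names only four of the five base classifiers (the fifth is truncated), so only the four named ones appear here, and the synthetic data merely stands in for HAR sensor features.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for HAR sensor features (not the paper's dataset).
X, y = make_classification(n_samples=300, n_features=10, n_classes=3,
                           n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Combine the four base classifiers named in the abstract.
voter = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("knn", KNeighborsClassifier()),
                ("rf", RandomForestClassifier(random_state=0)),
                ("nb", GaussianNB())],
    voting="hard",  # majority vote over the base classifiers' predicted labels
)
voter.fit(X_tr, y_tr)
print(round(voter.score(X_te, y_te), 2))
```

With `voting="hard"` each base model casts one label and the majority wins; `voting="soft"` would instead average predicted class probabilities.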

Cited by 6 publications (4 citation statements); references 26 publications (30 reference statements).
“…In our assessment, we scrutinize nine distinct machine learning models, alongside the online algorithm introduced in [1]. For brevity, each method is denoted by an acronym: Online Algorithm (ONL), Random Forest (RF) [18], Linear Regression (LR) [19], k-Nearest Neighbors (KNN) [20], Decision Tree (DCT) [21], Support Vector Machine with RBF (SVM-R) [22], and Sigmoid (SVM-S) kernels [23], and committee algorithms such as Soft Voting Standard (SV) [24] and Hard Voting (HV) [25]. We harness the Sklearn library for implementing the machine learning models, adopting standard parameters.…”
Section: Framework Definitions
confidence: 99%
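The excerpt above distinguishes two committee algorithms, Soft Voting (SV) and Hard Voting (HV). The difference is easy to show with a toy example (the probabilities below are invented for illustration): hard voting counts each classifier's predicted label, while soft voting averages the predicted probabilities before taking the argmax, so the two can disagree.

```python
import numpy as np

# Toy class probabilities from three classifiers for one sample, two classes.
probs = np.array([[0.60, 0.40],   # classifier A
                  [0.55, 0.45],   # classifier B
                  [0.10, 0.90]])  # classifier C

# Hard voting: each classifier casts one label, the majority wins.
hard_votes = probs.argmax(axis=1)              # [0, 0, 1]
hard_label = np.bincount(hard_votes).argmax()  # class 0 (two votes to one)

# Soft voting: average the probabilities, then take the argmax.
soft_label = probs.mean(axis=0).argmax()       # mean = [0.417, 0.583] -> class 1
print(hard_label, soft_label)                  # prints: 0 1
```

Here classifier C is very confident in class 1, which sways the soft vote but counts for only a single ballot in the hard vote.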
“…Therefore, two neighboring intervals constitute different classes. This algorithm has two main phases: the training phase and the classification phase [17].…”
Section: Literature Review
confidence: 99%
“…We also showed the method of classification adopted in the treatment.

[25]: SVM (Gaussian kernel), 96.50%
De Leonardis et al. [26]: K-nearest neighbour, feedforward neural network, SVM, decision tree, Naive Bayes, 97.00%
Nurhanim et al. [27]: SVM (polynomial kernel), one-versus-all, 98.57%
Agarwal and Alam [28]: SVM, K-nearest neighbour, linear discriminant analysis, 98.00%
Minarno et al. [29]: SVM + LR, 98.00%
Jindal et al. [30]: SVM, KNN, and LR, 92.78%
Patel and Shah [31]: Long short-term memory, LR, 92.00%
Navita and Mittal [32]: SVM, 98.03%

[Figure 6: Model accuracy scores]…”
Section: Machine Learning Model Evaluation
confidence: 99%