2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)
DOI: 10.1109/icaiic51459.2021.9415213
Predictive Maintenance of Relative Humidity Using Random Forest Method

Cited by 19 publications (5 citation statements)
References 2 publications
“…Many RFC based applications have been proposed. RFC was initially employed to predict relative humidity in the smart factory environment and showed 82.49% accuracy, which is considered excellent [ 21 ]. The random forest classification, decision tree classification, gradient boosting classification, and Naive Bayes classification image processing technologies were used to classify the types of rice leaf disease in Thailand.…”
Section: System Description and Methods (mentioning)
confidence: 99%
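To make the cited approach concrete, the following is a minimal, hypothetical sketch of predicting a relative-humidity class with scikit-learn's RandomForestClassifier. The sensor features (temperature, pressure, hour of day), the humidity bands, and the data are invented for illustration and are not taken from the cited paper.

```python
# Hypothetical sketch: classifying a relative-humidity band from sensor readings
# with a random forest. Features, labels, and data are synthetic; the cited
# paper's actual dataset and preprocessing are not reproduced here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
# Assumed sensor features: temperature (deg C), pressure (hPa), hour of day.
X = np.column_stack([
    rng.normal(25, 3, n),      # temperature
    rng.normal(1013, 5, n),    # pressure
    rng.integers(0, 24, n),    # hour of day
])
# Assumed target: humidity band 0 = "low", 1 = "normal", 2 = "high",
# loosely tied to temperature purely for the sake of the example.
y = np.digitize(-X[:, 0] + rng.normal(0, 2, n), bins=[-28, -22])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```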
“…Additionally, there are three popular classification algorithms for sleeping posture monitoring: the multilayer perceptron (MLP) algorithm [ 7 , 12 , 15 ], the support vector machine (SVM) algorithm [ 16 , 17 , 18 , 19 ], and random forest classification (RFC) [ 12 , 13 , 14 , 15 , 16 , 17 , 18 , 19 , 20 , 21 , 22 ]. An MLP, also known as a feedforward neural network, is an artificial neural network (ANN) with a forward structure.…”
Section: Introduction (mentioning)
confidence: 99%
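As a rough illustration of how these three classifier families are typically applied, the sketch below trains an MLP, an SVM, and a random forest with scikit-learn on synthetic data; this is not the posture dataset used in the cited studies, and the hyperparameters are arbitrary placeholders.

```python
# Minimal comparison sketch for the three classifier families named above
# (MLP, SVM, random forest) on synthetic multi-class data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1500, n_features=20, n_informative=8,
                           n_classes=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0),
    "SVM": SVC(kernel="rbf", random_state=0),
    "RFC": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))
```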
“…The tree depth in this application is set to 5 to avoid the interference caused by overtraining. A Random Forest [30], as shown in Figure 9, is simply an ensemble model composed of a collection of decision trees. The prediction outcomes of each decision tree are averaged to obtain one final outcome.…”
Section: The Proposed Methods (mentioning)
confidence: 99%
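The averaging step described above can be illustrated with a short scikit-learn sketch: a forest whose tree depth is capped at 5 makes its prediction as the mean of its individual trees' predictions. The data here are synthetic, and the depth cap only mirrors the quoted setting for illustration.

```python
# Sketch of the averaging idea: a depth-limited random forest regressor
# predicts by averaging its trees' outputs. Data are synthetic.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=500, n_features=6, noise=5.0, random_state=0)
forest = RandomForestRegressor(n_estimators=50, max_depth=5, random_state=0).fit(X, y)

# Prediction of each individual tree, then the ensemble average.
per_tree = np.stack([tree.predict(X[:5]) for tree in forest.estimators_])
manual_mean = per_tree.mean(axis=0)
print(np.allclose(manual_mean, forest.predict(X[:5])))  # True: the forest averages its trees
```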
“…If there are too many trees, the error tends to a limit. Algorithms can be used to find the optimal number of trees for a given estimated error, and the resulting forest can then be used to predict or classify data [35].…”
Section: Random Forest (mentioning)
confidence: 99%
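One common way to pick the number of trees, assuming a scikit-learn workflow rather than the specific algorithm referenced in [35], is to grow the forest incrementally and watch the out-of-bag (OOB) error flatten out, as sketched below on synthetic data.

```python
# Grow the forest incrementally (warm_start) and track the out-of-bag error,
# which typically levels off once enough trees have been added.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           random_state=0)

clf = RandomForestClassifier(warm_start=True, oob_score=True,
                             bootstrap=True, random_state=0)
for n_trees in [25, 50, 100, 200, 400]:
    clf.set_params(n_estimators=n_trees)
    clf.fit(X, y)  # warm_start adds trees instead of refitting from scratch
    print(n_trees, "trees -> OOB error:", 1 - clf.oob_score_)
```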