2019 15th International Wireless Communications & Mobile Computing Conference (IWCMC)
DOI: 10.1109/iwcmc.2019.8766544
Important Complexity Reduction of Random Forest in Multi-Classification Problem

Cited by 33 publications (23 citation statements)
References 14 publications
“…The RF is characterized by a low number of control parameters, and the complexity of the prediction phase is O(set × tree), where set is the feature-set size and tree is the number of trees…”
Section: Random Forest (RF)
confidence: 99%
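The quoted prediction-phase complexity can be illustrated with a toy sketch (the function and its parameter names are hypothetical, not from the cited paper): the work done at prediction time grows linearly in both the feature-set size and the number of trees.

```python
# Hypothetical sketch of O(set × tree) prediction cost for a random forest:
# each tree is traversed once, and in the worst case a traversal tests
# every feature in the feature set.
def predict_cost(feature_set_size: int, n_trees: int) -> int:
    """Count the operations performed during one prediction."""
    ops = 0
    for _ in range(n_trees):               # one root-to-leaf traversal per tree
        for _ in range(feature_set_size):  # worst-case feature tests per tree
            ops += 1
    return ops
```

Under this model, doubling either the number of trees or the feature-set size doubles the prediction cost, which is what the O(set × tree) bound expresses.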
“…For the task of Intrusion Detection [2,3,18,70,71,80], a variety of approaches have been used in the literature, including Ensemble methods [53], Feature Selection [52], Fuzzy Neural Networks [13], Kernel Methods [67], Random Forests [21], and deep learning [25]. However, we refrain from comparing with these approaches, as they do not process the data in a streaming manner and typically require large amounts of labelled training data, whereas we process the data in an unsupervised and online manner…”
Section: Related Work
confidence: 99%
“…Complexity may refer to the time required to build the model, the computer memory consumed, or the amount of time a program runs until a result is obtained. The complexity of training the random forest classifier is O(M · m · n · log n), where M is the number of decision trees in the random forest, m is the number of variables, and n is the number of samples in the training set [16]. This means that reducing the number of input variables by half will shorten training time by half…”
Section: Machine Learning Models
confidence: 99%
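The quoted training complexity is linear in the number of variables, which is why halving the inputs roughly halves training time. A minimal sketch of the cost formula (function name and example figures are hypothetical, not taken from the cited papers):

```python
import math

# Hypothetical sketch of the quoted training complexity
# O(M · m · n · log n) for a random forest.
def train_cost(M: int, m: int, n: int) -> float:
    """M trees, m variables, n training samples."""
    return M * m * n * math.log2(n)

# Halving the number of variables m halves the estimated cost,
# because the formula is linear in m.
full = train_cost(100, 40, 10_000)
half = train_cost(100, 20, 10_000)
```

Since m enters the product once, any constant-factor reduction in the number of input variables carries through to the training-cost estimate unchanged.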
“…Figure 1 illustrates the block diagram of the recursive algorithm allowing the development of the models. The complexity of training the random forest classifier is O(M · m · n · log n), where M is the number of decision trees, m is the number of variables, and n is the number of samples in the training set [16]. This means that reducing the number of input variables by half will shorten training time by half…”
Section: Machine Learning Models
confidence: 99%