2021
DOI: 10.21203/rs.3.rs-649364/v1
Preprint

Novel Ensemble-Based Machine Learning Models Based on The Bagging, Boosting and Random Subspace Methods for Landslide Susceptibility Mapping

Abstract: Individual machine learning models exhibit limitations, such as low generalization power, when modeling nonlinear phenomena with complex behavior. In recent years, ensemble models have proven to be one of the best approaches to this issue. The purpose of this paper is to investigate the predictive power and modeling of three novel ensemble models constructed from four machine learning models: Decision Tree (DT), Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and Naive Bayes (NB), based on three approac…

Cited by 1 publication (5 citation statements) | References 56 publications
“…Naïve Bayes. NB is a simple and widely used algorithm applied in various fields (computer science, earth sciences, text classification, and medicine) [81]. This approach is practical when sample X can be characterized as conjugating conditionally independent attributes [81,82].…”
(mentioning, confidence: 99%)
“…NB is a simple and widely used algorithm applied in various fields (computer science, earth sciences, text classification, and medicine) [81]. This approach is practical when sample X can be characterized as conjugating conditionally independent attributes [81,82]. Based on Bayesian probability theory, Bayesian learning enables us to compute the posterior probability given the prior chances [83,84].…”
(mentioning, confidence: 99%)