2015
DOI: 10.1016/j.ijleo.2015.06.057
Feature selection based on sparse representation with the measures of classification error rate and complexity of boundary

Cited by 5 publications (1 citation statement)
References 19 publications
“…In recent years, it has been widely applied to all kinds of classification, prediction, feature selection, and anomaly detection problems [30]. For regression problems, random forest can select a feature subset according to the mean model fitting error [31]. Since random forest has good feature extraction ability and high precision, this paper adopts random forest combined with recursive feature elimination to select and extract the key influencing factors of the shield tunneling axis attitude deviation; the specific process is shown in FIGURE 2.…”
Section: The Key Influencing Factor Analysis About Shield Tunnel
confidence: 99%
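The citation statement above describes selecting key influencing factors with a random forest combined with recursive feature elimination (RFE). Below is a minimal sketch of that general technique using scikit-learn; the synthetic regression data, feature count, and the choice of five selected features are hypothetical placeholders, not the shield tunneling dataset or settings of the citing paper.

```python
# Sketch: random-forest-based recursive feature elimination for a regression task.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE

# Synthetic data standing in for the real influencing factors (assumption).
X, y = make_regression(n_samples=200, n_features=20, n_informative=5, random_state=0)

# The random forest supplies per-feature importances; RFE repeatedly refits the
# forest and drops the least important feature until the requested subset remains.
selector = RFE(
    estimator=RandomForestRegressor(n_estimators=100, random_state=0),
    n_features_to_select=5,  # hypothetical subset size
    step=1,                  # eliminate one feature per iteration
)
selector.fit(X, y)

print("Selected feature mask:", selector.support_)
print("Feature ranking (1 = kept):", selector.ranking_)
```

In this setup the eliminated features are ranked by the order in which they were dropped, which gives a rough ordering of the influencing factors in addition to the selected subset.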