Adaptive weighted least square support vector machine regression integrated with outlier detection and its application in QSAR
2009
DOI: 10.1016/j.chemolab.2009.05.008

Cited by 57 publications (21 citation statements)
References 7 publications
“…6. With a correlation coefficient of 0.82 and a cross-validated correlation coefficient of 0.74, the established QSAR model was acceptable (64). This model was used to predict the Ki values for 14 compounds among the top 20 in silico synthesized compounds, as listed in Table II.…”
Section: Results (mentioning)
Confidence: 97%
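The quoted correlation coefficient and cross-validated correlation coefficient are standard QSAR fit diagnostics. A minimal sketch of how such values are typically computed, using a hypothetical Ridge surrogate model rather than the actual model from the cited study:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def qsar_fit_statistics(X, y, model=None):
    """r: Pearson correlation between fitted and observed activities.
    q2: leave-one-out cross-validated coefficient, 1 - PRESS/SS."""
    model = model or Ridge(alpha=1.0)  # placeholder model, an assumption
    y_fit = model.fit(X, y).predict(X)
    r = np.corrcoef(y_fit, y)[0, 1]
    # Leave-one-out predictions: each sample predicted by a model
    # trained on all the others.
    y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())
    press = np.sum((y - y_loo) ** 2)        # predictive residual sum of squares
    ss = np.sum((y - y.mean()) ** 2)        # total sum of squares
    q2 = 1.0 - press / ss
    return r, q2
```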
“…In the second stage, the remaining data were used to train SVRs. For example, Cui and Yan (2009) developed an adaptive weighted LS-SVR, which first uses the robust 3σ principle to remove outliers from the training data set and then applies a weighted LS-SVR to the remaining samples. Wen et al. (2010) presented a recursive training-eliminating procedure for robust regression, in which sample points with large LS-SVR prediction errors were recursively removed until the sum of a certain number of sorted absolute errors no longer increased.…”
Section: Introduction (mentioning)
Confidence: 99%
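The two-stage procedure attributed to Cui and Yan (2009) can be sketched as below. This is a minimal illustration, assuming an RBF kernel, an IQR-based robust estimate of the residual scale for the 3σ rule, and Suykens-style piecewise weights (c1 = 2.5, c2 = 3.0); the exact outlier rule and weighting scheme of the original paper may differ.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_ls_svr(X, y, gamma=10.0, sigma=1.0, weights=None):
    # Solve the LS-SVM regression linear system (Suykens):
    # [[0, 1^T], [1, K + diag(1/(gamma*v))]] [b; alpha] = [0; y]
    n = len(y)
    v = np.ones(n) if weights is None else weights
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.diag(1.0 / (gamma * v))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha, b

def predict(X_train, alpha, b, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def robust_scale(e):
    # Robust sigma estimate from the interquartile range of residuals.
    p75, p25 = np.percentile(e, [75, 25])
    return (p75 - p25) / 1.349

def robust_weighted_ls_svr(X, y, gamma=10.0, sigma=1.0):
    # Stage 1: fit a plain LS-SVR and drop samples whose residual
    # violates a robust 3-sigma rule.
    alpha, b = fit_ls_svr(X, y, gamma, sigma)
    e = y - predict(X, alpha, b, X, sigma)
    keep = np.abs(e - np.median(e)) <= 3.0 * robust_scale(e)
    Xk, yk = X[keep], y[keep]
    # Stage 2: refit, then down-weight remaining large residuals
    # with Suykens-style piecewise weights and solve once more.
    alpha, b = fit_ls_svr(Xk, yk, gamma, sigma)
    r = np.abs((yk - predict(Xk, alpha, b, Xk, sigma)) / robust_scale(
        yk - predict(Xk, alpha, b, Xk, sigma)))
    w = np.where(r <= 2.5, 1.0, np.where(r <= 3.0, (3.0 - r) / 0.5, 1e-4))
    alpha, b = fit_ls_svr(Xk, yk, gamma, sigma, weights=w)
    return Xk, alpha, b
```

Down-weighting (rather than deleting) the borderline points in stage 2 keeps mild residual heteroscedasticity from distorting the fit while the hard 3σ cut handles gross outliers.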
“…Support vector machines (SVMs), introduced by Vapnik, are a newer methodology for nonlinear modeling than neural networks. Whereas neural networks suffer from problems such as many local minima and the choice of the number of hidden units [6], SVMs are characterized by convex optimization problems grounded in sound theoretical principles, up to the determination of a few additional tuning parameters, and they provide better generalization performance than neural networks [7,8]. The convex quadratic programming problem is solved in dual space to determine the SVM model.…”
Section: Introduction (mentioning)
Confidence: 99%
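For reference, the dual-space quadratic program mentioned above is, for Vapnik's ε-insensitive SVR (standard formulation and notation, not taken from this particular paper):

```latex
\begin{aligned}
\max_{\alpha,\,\alpha^{*}}\quad
  & -\tfrac{1}{2}\sum_{i,j=1}^{N}(\alpha_i-\alpha_i^{*})(\alpha_j-\alpha_j^{*})\,K(x_i,x_j)
    -\varepsilon\sum_{i=1}^{N}(\alpha_i+\alpha_i^{*})
    +\sum_{i=1}^{N}y_i(\alpha_i-\alpha_i^{*})\\
\text{s.t.}\quad
  & \sum_{i=1}^{N}(\alpha_i-\alpha_i^{*})=0,
    \qquad 0\le\alpha_i,\,\alpha_i^{*}\le C .
\end{aligned}
```

The resulting model is f(x) = Σᵢ (αᵢ − αᵢ*) K(x, xᵢ) + b, and only samples with nonzero multipliers (the support vectors) contribute, which is what makes the dual formulation convex and sparse.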