2019
DOI: 10.1016/j.cam.2018.04.022

Multi-variable regression methods using modified Chebyshev polynomials of class 2

Cited by 5 publications (14 citation statements)
References 20 publications
“…0.3629) is achieved by MIFS-CR+CP-2. This high performance can be ascribed to three aspects: first, given a proper order, any nonlinear model can be approximated by orthogonal polynomials; second, compared with an NN-based model, a specific mathematical equation can be deduced from the polynomial-based model; third, the basic idea of PCPR is to enhance efficiency at the cost of accuracy [12], so its prediction error is slightly larger than that of the feature-selection-based MCPR method (i.e. MIFS-CR+CP-2).…”
Section: Results
confidence: 99%
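The statement above credits the model's accuracy to orthogonal-polynomial approximation, and the paper's title refers to modified Chebyshev polynomials of class 2 (the second kind, U_k). A minimal univariate sketch of least-squares regression in the second-kind basis, using the standard recurrence; the function names and the toy target are illustrative, not taken from the paper:

```python
import numpy as np

def cheb2_basis(x, degree):
    """Evaluate Chebyshev polynomials of the second kind U_0..U_degree at x.

    Uses the recurrence U_0(x) = 1, U_1(x) = 2x,
    U_{n+1}(x) = 2x U_n(x) - U_{n-1}(x).
    Returns a (len(x), degree + 1) design matrix.
    """
    x = np.asarray(x, dtype=float)
    U = np.empty((x.size, degree + 1))
    U[:, 0] = 1.0
    if degree >= 1:
        U[:, 1] = 2.0 * x
    for n in range(1, degree):
        U[:, n + 1] = 2.0 * x * U[:, n] - U[:, n - 1]
    return U

def fit_cheb2(x, y, degree):
    """Least-squares fit of y on the U_k basis; returns the coefficient vector."""
    A = cheb2_basis(x, degree)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Toy usage: approximate a smooth nonlinear function on [-1, 1].
x = np.linspace(-1, 1, 200)
y = np.exp(x) * np.sin(3 * x)
coef = fit_cheb2(x, y, degree=8)
y_hat = cheb2_basis(x, 8) @ coef
print(np.max(np.abs(y - y_hat)))  # residual shrinks rapidly with the order
```

The fitted coefficients give an explicit closed-form expression for the regression surface, which is the "specific mathematical equation" advantage the quoted statement contrasts with NN-based models.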
“…However, the number of regressed coefficients in MCPR grows exponentially with the number of input variables. To overcome this coefficient-explosion problem, our work [12] designed two strategies, feature selection and cascaded regression, to alleviate it. For the feature-selection strategy, a novel feature selection method [13] named "Mutual Information-based Feature Selection with Class-dependent Redundancy" (MIFS-CR) is combined with MCPR to directly reduce the number of input variables of MCPR.…”
Section: Introduction To Regression Methods
confidence: 99%
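The coefficient explosion mentioned above can be made concrete: a full tensor-product polynomial basis over n variables with per-variable degree d has (d+1)^n coefficients, whereas capping the total degree at d gives only C(n+d, d). A small illustration (the helper names are hypothetical, not from the paper):

```python
from math import comb

def n_coeffs_full(n_vars, degree):
    """Coefficients in a full tensor-product basis: (degree + 1) ** n_vars."""
    return (degree + 1) ** n_vars

def n_coeffs_total_degree(n_vars, degree):
    """Coefficients when the total degree is capped: C(n_vars + degree, degree)."""
    return comb(n_vars + degree, degree)

# Degree 3 per variable: the tensor-product count explodes with n_vars,
# which is why feature selection (fewer inputs) directly shrinks the model.
for n in (2, 5, 10):
    print(n, n_coeffs_full(n, 3), n_coeffs_total_degree(n, 3))
```

At 10 input variables and degree 3, the full basis already needs 4^10 = 1,048,576 coefficients, which motivates the feature-selection and cascaded-regression strategies in the quote.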
“…The limitations of our regression methods are as follows: the manifold optimization might not be stable in the case of multi-variable function exploitation; the proposed regression methods mentioned in this paper are not suitable for classification problems; and the kernel trick employed in recent regression studies could not be used in the proposed regression methods. Based on the research (Li et al., 2019), we had tried to apply the kernel trick to PCPR by using the idea of K-PLSR (Rosipal and Trejo, 2001).…”
Section: Discussion
confidence: 99%
“…, 2014; Zhou et al., 2015) was adopted to simplify a multi-variable regression problem by using a combination of several single- (or bi-, tri-) variable regression problems instead of a unified multi-variable problem (see (Li et al., 2019) for details).…”
Section: Introduction
confidence: 99%
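The decomposition described above, replacing one multi-variable regression with a combination of single-variable regressions, resembles an additive model. A rough sketch using simple backfitting with ordinary polynomial components; this is an illustrative stand-in under that additive-model assumption, not the authors' PCPR algorithm:

```python
import numpy as np

def backfit_additive(X, y, degree=5, n_iter=20):
    """Approximate y ~ sum_j g_j(x_j), each g_j a single-variable polynomial.

    Backfitting: cycle over variables, refitting each component to the
    residual left by all the others. Illustrative sketch only.
    """
    n, p = X.shape
    coefs = [np.zeros(degree + 1) for _ in range(p)]
    for _ in range(n_iter):
        for j in range(p):
            partial = y - sum(np.polyval(coefs[k], X[:, k])
                              for k in range(p) if k != j)
            coefs[j] = np.polyfit(X[:, j], partial, degree)
    return coefs

# Toy usage on a genuinely additive target.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 2))
y = np.sin(2 * X[:, 0]) + X[:, 1] ** 3
coefs = backfit_additive(X, y)
y_hat = sum(np.polyval(c, X[:, j]) for j, c in enumerate(coefs))
print(np.max(np.abs(y - y_hat)))  # small when the target really is additive
```

The trade-off the quoted statements describe follows directly: each sub-problem is cheap, but interactions between variables that are not additive (or bi-/tri-variable) are sacrificed.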
“…Chebyshev polynomials have become very important in numerical analysis. They are widely used because of their advantages, such as the roots of the first-kind Chebyshev polynomials (Gauss–Lobatto nodes) being used as interpolation nodes to minimize the Runge phenomenon, providing the best uniform polynomial approximation of continuous functions (see [35][36][37]). The most commonly used techniques with Chebyshev polynomials have been examined in [38][39][40] and the references therein.…”
Section: Introduction
confidence: 99%
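The Runge-phenomenon claim in the quote is easy to demonstrate numerically. The sketch below interpolates Runge's function 1/(1 + 25x²) at equispaced points versus the roots of T_n (Chebyshev points of the first kind; note the quote labels these "Gauss–Lobatto", while the roots are more commonly called Chebyshev–Gauss nodes). Helper names are illustrative:

```python
import numpy as np

def runge(x):
    """Runge's classic example function on [-1, 1]."""
    return 1.0 / (1.0 + 25.0 * x**2)

def interp_max_error(nodes, n_test=1001):
    """Interpolate runge() at the given nodes with a degree len(nodes)-1
    polynomial and return the max error on a fine grid in [-1, 1]."""
    coef = np.polyfit(nodes, runge(nodes), len(nodes) - 1)
    xs = np.linspace(-1, 1, n_test)
    return float(np.max(np.abs(np.polyval(coef, xs) - runge(xs))))

n = 15
equi = np.linspace(-1, 1, n)
# Roots of T_n: x_k = cos((2k + 1) pi / (2n)), clustered near the endpoints.
cheb = np.cos((2 * np.arange(n) + 1) * np.pi / (2 * n))

equi_err = interp_max_error(equi)  # large oscillations near the endpoints
cheb_err = interp_max_error(cheb)  # near-uniform, much smaller error
print(equi_err, cheb_err)
```

Clustering the nodes near ±1 is what tames the endpoint oscillations; with equispaced nodes the error for this function grows without bound as the degree increases.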