1994
DOI: 10.1002/cem.1180080204
A PLS kernel algorithm for data sets with many variables and fewer objects. Part 1: Theory and algorithm

Abstract: SUMMARY: A fast PLS regression algorithm dealing with large data matrices with many variables (K) and fewer objects (N) is presented. For such data matrices the classical algorithm is computer-intensive and memory-demanding. Recently, Lindgren et al. (J. Chemometrics, 7, 45-49 (1993)) developed a quick and efficient kernel algorithm for the case with many objects and few variables. The present paper is focused on the opposite case, i.e. many variables and fewer objects. A kernel algorithm is presented based on ei…
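
The core idea in the abstract is that, when N is much smaller than K, every quantity PLS needs can be computed from the small N x N association matrices XXᵀ and YYᵀ instead of the K-column data matrix itself. Below is a minimal NumPy sketch of that idea; the function names, the power-iteration details and the closed-form coefficient formula B = XᵀU(TᵀXXᵀU)⁻¹TᵀY (taken from the kernel-PLS literature, not quoted from this paper) are illustrative assumptions, not the paper's exact pseudocode.

```python
import numpy as np

def kernel_pls_scores(X, Y, n_components, n_iter=500, tol=1e-12):
    """PLS score vectors computed from the N x N kernels XX' and YY'.

    Sketch only: assumes X (N x K) and Y (N x M) are column-centred.
    """
    N = X.shape[0]
    XXt = X @ X.T                     # N x N predictor kernel
    YYt = Y @ Y.T                     # N x N response kernel
    T = np.zeros((N, n_components))   # X-score vectors
    U = np.zeros((N, n_components))   # Y-score vectors
    for a in range(n_components):
        # t is the dominant eigenvector of XX'YY' (power iteration)
        t = XXt @ YYt[:, 0]
        t /= np.linalg.norm(t)
        for _ in range(n_iter):
            t_new = XXt @ (YYt @ t)
            t_new /= np.linalg.norm(t_new)
            if np.linalg.norm(t_new - t) < tol:
                t = t_new
                break
            t = t_new
        u = YYt @ t                   # Y-score from the response kernel
        u /= np.linalg.norm(u)
        T[:, a], U[:, a] = t, u
        # Deflate both kernels with the projector G = I - t t'
        G = np.eye(N) - np.outer(t, t)
        XXt = G @ XXt @ G
        YYt = G @ YYt @ G
    return T, U

def kernel_pls_coefficients(X, Y, T, U):
    # Closed form common in the kernel-PLS literature (an assumption
    # here, not quoted from this paper): B = X'U (T'XX'U)^{-1} T'Y.
    K = X @ X.T
    return X.T @ (U @ np.linalg.solve(T.T @ K @ U, T.T @ Y))
```

Note that every matrix inverted or iterated on is N x N, so the cost is governed by the number of objects, not by K; the K-dimensional data appear only in matrix products.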

Cited by 282 publications (152 citation statements); references 15 publications (3 reference statements).
“…70 3D-QSAR CoMSIA Model. The PLS analysis implemented in SYBYL 8.0 71,72 was employed to correlate the CoMSIA similarity fields, used as the independent explanatory variables, with the pIC50 values, used as the target dependent variables. PLS was performed in two stages.…”
Section: Materials and Methods (mentioning)
confidence: 99%
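
This excerpt illustrates the regime the kernel algorithm was designed for: a CoMSIA field matrix holds thousands of grid-point descriptors for only a few dozen compounds, i.e. N ≪ K. A hypothetical toy setup reusing the sketch functions above (all shapes and numbers invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 40, 5000                      # 40 compounds, 5000 grid-field values
X = rng.standard_normal((N, K))      # stand-in for CoMSIA similarity fields
X -= X.mean(axis=0)                  # column-centre the descriptors
y = rng.standard_normal((N, 1))      # stand-in for pIC50 values
y -= y.mean(axis=0)

T, U = kernel_pls_scores(X, y, n_components=3)
B = kernel_pls_coefficients(X, y, T, U)   # K x 1 coefficient vector
y_fit = X @ B                             # fitted pIC50 values
```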
“…The best results were obtained with a 1.0 Å lattice spacing. This latter value was considered optimal, since the higher precision gained by evaluating the 3D field on a finer grid would have increased the so-called "brown noise" caused by the grid-size sensitivity of the statistical technique used to generate the models [36].…”
Section: Alignment Rule (mentioning)
confidence: 99%
“…The main difference between the two methods is that PLSR uses information from both the response and the predictor variables when projecting onto the p-dimensional subspace, which means that PLSR is a supervised dimension-reduction technique. The kernelised version of the PLSR algorithm follows from the algorithm presented by Ränner et al. [34]. This results in the following estimate of the Kalman gain matrix:…”
Section: Kernel Principal Component Regression (mentioning)
confidence: 99%
“…Ränner et al. [34] outline an efficient procedure for solving this problem when n_x and n_d are larger than n_e. This gives the following EnKF updating scheme based on kernel PLSR:…”
Section: Kernel Principal Component Regression (mentioning)
confidence: 99%
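
The two excerpts above apply the paper's kernel trick inside an ensemble Kalman filter: the gain is obtained by regressing state anomalies on predicted-data anomalies, and kernel PLSR keeps the heavy linear algebra in the small n_e x n_e ensemble space when n_x and n_d far exceed n_e. The quoted gain formula and updating scheme are truncated in the excerpts, so the sketch below shows only a generic regression-based stochastic-EnKF update, reusing the functions defined earlier; the perturbation scheme and all names are assumptions, not the cited paper's exact method.

```python
import numpy as np

def enkf_update_plsr(X_ens, D_ens, d_obs, obs_std, n_components):
    """EnKF-style update with a PLSR-estimated gain (sketch only).

    X_ens: n_x x n_e state ensemble, D_ens: n_d x n_e predicted data,
    d_obs: observed data vector (n_d,), obs_std: scalar obs-noise std.
    """
    rng = np.random.default_rng(0)
    # Perturb the observations, one copy per ensemble member
    D_obs = d_obs[:, None] + obs_std * rng.standard_normal(D_ens.shape)
    # Centred anomalies; rows become ensemble members for the regression
    dX = (X_ens - X_ens.mean(axis=1, keepdims=True)).T   # n_e x n_x
    dD = (D_ens - D_ens.mean(axis=1, keepdims=True)).T   # n_e x n_d
    # Regress states on predicted data with the kernel-PLSR sketch above;
    # every iterated matrix is n_e x n_e, never n_d x n_d
    T, U = kernel_pls_scores(dD, dX, n_components)
    B = kernel_pls_coefficients(dD, dX, T, U)            # n_d x n_x
    K_gain = B.T                                         # plays the role of the Kalman gain
    return X_ens + K_gain @ (D_obs - D_ens)
```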