2010
DOI: 10.4018/978-1-60566-766-9
Handbook of Research on Machine Learning Applications and Trends

Cited by 145 publications (14 citation statements) · References 0 publications
“…Since Pearson [35] first invented and defined PCA through approximating multivariate distributions by lines and planes in 1901, researchers have defined PCA from different aspects [36], [37]. Among these definitions, using the covariance matrix of the training samples to define PCA is very popular in the pattern recognition and machine learning community.…”
Section: Review of the PCA (mentioning)
confidence: 99%
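The covariance-matrix definition of PCA referenced in this statement can be written out directly. Below is a minimal NumPy sketch (illustrative only, not taken from the cited handbook): center the training samples, form their covariance matrix, and take its leading eigenvectors as the principal components.

```python
import numpy as np

def pca(X, n_components):
    """X: (n_samples, n_features) training matrix; returns top components."""
    Xc = X - X.mean(axis=0)                  # center the training samples
    cov = Xc.T @ Xc / (len(X) - 1)           # covariance matrix of the samples
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: symmetric, ascending order
    order = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, order], eigvals[order]

X = np.random.default_rng(0).normal(size=(100, 5))
components, variances = pca(X, 2)            # columns are principal axes
print(components.shape)                      # (5, 2)
```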
“…Transfer learning is another area of important research activity. The goal of transfer learning is to improve learning in the target learning task by leveraging knowledge from an existing source task (9). Given challenges in obtaining sufficient data for target Patient Derived Xenografts (PDXs), where tumors are grown in mouse host animals, ongoing transfer learning work holds promise for learning on cell lines as a source for the target PDX model predictions.…”
Section: AI and Large-Scale Computing to Predict Tumor Treatment Response (mentioning)
confidence: 99%
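As a concrete illustration of the transfer-learning idea described in this statement (a hedged sketch, not the cited work's method), the following reuses a representation learned on a data-rich source task as features for a data-poor target task; all data and model choices here are hypothetical placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Placeholder stand-ins for an abundant source task (e.g. cell lines)
# and a scarce target task (e.g. PDX models).
X_source, y_source = rng.normal(size=(1000, 50)), rng.integers(0, 2, 1000)
X_target, y_target = rng.normal(size=(40, 50)), rng.integers(0, 2, 40)

# 1. Learn a representation on the abundant source data.
source_model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                             random_state=0)
source_model.fit(X_source, y_source)

# 2. Transfer: push target samples through the source model's hidden
#    layer (ReLU of its first weight matrix) and fit a small model on top.
hidden = np.maximum(
    X_target @ source_model.coefs_[0] + source_model.intercepts_[0], 0)
target_model = LogisticRegression().fit(hidden, y_target)
```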
“…Using such a maximum likelihood estimate for σ, and assuming further that the dimension p of the Gibbs manifold is fixed from the outset, the remaining optimization of (the orientation of) G reduces to maximizing the weighted average of the relative entropies $\{S(\pi_G(\mu_i)\,\|\,\mu)\}$. In the Gaussian regime, this is tantamount to the optimization task known in statistics as "principal component analysis" [33–40]. Now I turn to the general case in which there is an arbitrary given reference state, and where both the dimension and the orientation of the explanatory level of description are to be inferred.…”
Section: Assessing Thermalization (mentioning)
confidence: 99%
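For context, the generic PCA optimization alluded to here can be stated as a trace maximization (a standard textbook form, not necessarily the cited paper's notation; Σ denotes the sample covariance and the columns of G span the p-dimensional subspace):

```latex
\max_{G \in \mathbb{R}^{d \times p},\; G^\top G = I_p}
  \operatorname{tr}\!\left(G^\top \Sigma\, G\right),
\qquad
\Sigma = \frac{1}{n}\sum_{i=1}^{n} (x_i - \bar{x})(x_i - \bar{x})^\top .
```

The maximum is attained when the columns of G are the top-p eigenvectors of Σ, which fixes the subspace orientation once the dimension p is given.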
“…In such a generic setting, the task is to infer the optimal dimension and orientation of the lower-dimensional latent space. Problems of this type can be tackled with a variety of statistical techniques such as factor analysis or principal component analysis [33–40]. In the present paper, I shall build on these generic techniques to develop a statistical framework tailored to the relevant task of assessing whether or not thermalization has occurred, and if so, inferring the most plausible set of constants of the motion.…”
Section: Introduction (mentioning)
confidence: 99%
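The dimension-and-orientation inference mentioned in this statement is commonly approximated in practice with PCA; the sketch below (a generic heuristic, not the cited paper's framework) picks the smallest latent dimension explaining a fixed fraction of variance.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Synthetic data: 3 latent directions embedded in 20 ambient dimensions.
latent = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 20))
X = latent + 0.05 * rng.normal(size=(500, 20))

pca = PCA().fit(X)
cum = np.cumsum(pca.explained_variance_ratio_)
dim = int(np.searchsorted(cum, 0.99) + 1)   # smallest dim explaining 99%
print(dim, pca.components_[:dim].shape)     # orientation = top eigenvectors
```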