2012
DOI: 10.3389/fnsys.2012.00074

Kernel Principal Component Analysis for dimensionality reduction in fMRI-based diagnosis of ADHD

Abstract: This study explored various feature extraction methods for use in automated diagnosis of Attention-Deficit Hyperactivity Disorder (ADHD) from functional Magnetic Resonance Image (fMRI) data. Each participant's data consisted of a resting state fMRI scan as well as phenotypic data (age, gender, handedness, IQ, and site of scanning) from the ADHD-200 dataset. We used machine learning techniques to produce support vector machine (SVM) classifiers that attempted to differentiate between (1) all ADHD patients vs. h…
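As a rough, hypothetical illustration of the kind of pipeline the abstract describes (kernel PCA for dimensionality reduction followed by an SVM classifier), the sketch below assumes a precomputed per-subject feature matrix X and binary labels y; the RBF kernel, component count, and SVM settings are placeholder choices, not the study's actual configuration.

# Hypothetical kernel-PCA + SVM pipeline in the spirit of the study;
# X (n_subjects x n_features) and y are assumed to be prepared elsewhere.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 500))      # stand-in for fMRI-derived features
y = rng.integers(0, 2, size=40)         # stand-in labels: 1 = ADHD, 0 = control

clf = Pipeline([
    ("scale", StandardScaler()),                            # normalize each feature
    ("kpca", KernelPCA(n_components=20, kernel="rbf", gamma=1e-3)),
    ("svm", SVC(kernel="linear", C=1.0)),
])

scores = cross_val_score(clf, X, y, cv=5)                   # 5-fold cross-validation
print("mean accuracy: %.3f" % scores.mean())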

Cited by 67 publications (53 citation statements)
References 24 publications

“…This setback has led to the formulation of kernel PCA, which in brief is the application of PCA in a feature space created through a kernel function (Scholkopf and Smola, 2002). Several studies have recently explored the application of kernel PCA to dimensionality reduction problems in neuroimaging (Rasmussen et al., 2012; Sidhu et al., 2012; Thirion and Faugeras, 2003; Wang et al., 2011).…”
Section: Unsupervised Feature Reduction Techniques (mentioning)
confidence: 99%
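
The excerpt above summarizes kernel PCA as ordinary PCA carried out in a kernel-induced feature space. A minimal sketch of that idea is given below, assuming an RBF kernel with an arbitrary bandwidth; it works on the centered kernel (Gram) matrix rather than the raw features and is illustrative only.

# Minimal kernel-PCA sketch: PCA carried out on a centered kernel (Gram) matrix.
import numpy as np

def kernel_pca(X, n_components=2, gamma=0.1):
    # RBF (Gaussian) kernel matrix between all pairs of samples
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2 * X @ X.T
    K = np.exp(-gamma * sq_dists)

    # Center the kernel matrix in feature space
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    K_centered = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Eigendecomposition; keep the leading eigenvectors
    eigvals, eigvecs = np.linalg.eigh(K_centered)
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas = eigvecs[:, idx]
    lambdas = eigvals[idx]

    # Projections of the training samples onto the kernel principal components
    return alphas * np.sqrt(np.maximum(lambdas, 0))

X = np.random.default_rng(1).standard_normal((30, 10))
Z = kernel_pca(X, n_components=2)
print(Z.shape)  # (30, 2)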
“…PCA [9] is a mathematical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. For dimensionality reduction, PCA computes linear combinations of the features that have high variance [21] and projects the original features onto a smaller number of principal components [22]. This is done by finding a linear basis of reduced dimensionality for the data in which the amount of variance retained is maximal.…”
Section: Principal Component Analysis (mentioning)
confidence: 99%
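
As a concrete illustration of the linear dimensionality reduction described above, the snippet below applies scikit-learn's PCA to a stand-in data matrix; the component count is an arbitrary choice.

# Linear PCA for dimensionality reduction (illustrative; parameters are arbitrary).
import numpy as np
from sklearn.decomposition import PCA

X = np.random.default_rng(2).standard_normal((100, 50))   # stand-in data matrix

pca = PCA(n_components=5)          # keep the 5 highest-variance components
X_reduced = pca.fit_transform(X)   # project onto the reduced linear basis

print(X_reduced.shape)                        # (100, 5)
print(pca.explained_variance_ratio_.sum())    # fraction of variance retained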
“…Principal component analysis (PCA) [9] is a mathematical procedure for solving this problem. PCA projects the original data onto a smaller number of principal components [18]. This is done by finding a linear basis of reduced dimensionality for the data in which the amount of variance in the data is maximal.…”
Section: PCA and ICA (mentioning)
confidence: 99%
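
The variance-maximizing basis mentioned in this excerpt can also be obtained directly from the singular value decomposition of the mean-centered data; the from-scratch sketch below, on synthetic data, is only meant to make that construction explicit.

# PCA via SVD of the mean-centered data: the right singular vectors give the
# variance-maximizing linear basis onto which the data are projected.
import numpy as np

def pca_svd(X, n_components):
    X_centered = X - X.mean(axis=0)                 # remove the mean of each feature
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components]                  # orthonormal principal axes
    scores = X_centered @ components.T              # coordinates in the reduced basis
    explained = (S[:n_components] ** 2) / (S ** 2).sum()
    return scores, components, explained

X = np.random.default_rng(3).standard_normal((60, 20))
scores, components, explained = pca_svd(X, n_components=3)
print(scores.shape, explained.sum())                # (60, 3) and retained variance ratio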