2018
DOI: 10.1007/s11280-018-0637-3

Group sparse reduced rank regression for neuroimaging genetic study

Abstract: Neuroimaging genetic studies usually need to deal with the high dimensionality of both brain imaging data and genetic data, often resulting in the curse of dimensionality. In this paper, we propose a group sparse reduced rank regression model that takes into account the relations of both the phenotypes and the genotypes in the neuroimaging genetic study. Specifically, we propose designing a graph sparsity constraint as well as a reduced rank constraint to simultaneously conduct subspace learning and feature s…
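The graph/group sparsity constraint named in the abstract is commonly enforced with a group soft-thresholding (proximal) step that zeroes out entire rows of the coefficient matrix, i.e. a whole predictor across all responses. The sketch below is a generic illustration of that operator under that assumption, not the authors' actual algorithm; the function name and penalty form are placeholders.

```python
import numpy as np

def group_soft_threshold(C, lam):
    """Proximal operator of a row-wise group lasso penalty:
    each row (one predictor's coefficients across all responses)
    is shrunk toward zero, and weak rows are zeroed entirely."""
    norms = np.linalg.norm(C, axis=1, keepdims=True)
    scale = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return C * scale

C = np.array([[3.0, 4.0],    # row norm 5   -> shrunk by factor 0.8
              [0.3, 0.4]])   # row norm 0.5 -> zeroed at lam = 1
C_sparse = group_soft_threshold(C, lam=1.0)
# -> [[2.4, 3.2], [0.0, 0.0]]
```

In a proximal-gradient loop, this step would alternate with a gradient step on the least-squares loss (and, in a reduced-rank variant, a rank-truncation step).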


Year Published: 2019–2024


Cited by 4 publications (4 citation statements). References 53 publications.
“…As the bioclimatic variable used was air temperature, we used the model relating air temperature and water temperature proposed by [50] to estimate the latter in each grid cell. The lethal maximum temperature for each species was obtained from [51–53].…”
Section: Methods
confidence: 99%
“…RRR is a computationally efficient method which increases statistical power in settings where the number of dimensions is large compared to the number of examples. In such m ≫ j settings RRR is nowadays a state-of-the-art method in fields with high dimensional data such as genetics and imaging [21,22]. Given j observations of m predictors and n outcome measures, standard multivariate regression requires fitting m · n coefficients: Y = XC + E, with Y being the response matrix of size j × n, X the j × m predictor matrix, C the m × n coefficient matrix, and E the error term matrix of size j × n.…”
Section: Methods
confidence: 99%
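The rank constraint described in this statement has a classical closed-form solution: fit ordinary least squares, then project the fitted values onto their top-k singular directions. A minimal NumPy sketch on simulated data (variable names follow the quote's j, m, n notation; the rank k and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
j, m, n, k = 100, 20, 15, 3   # samples, predictors, responses, target rank

# Simulate a low-rank ground truth C = A @ B with rank k
X = rng.standard_normal((j, m))
Y = X @ rng.standard_normal((m, k)) @ rng.standard_normal((k, n)) \
    + 0.1 * rng.standard_normal((j, n))

# Step 1: ordinary least squares solution (unconstrained rank)
C_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Step 2: project the fitted values onto their top-k right singular
# directions; this yields the rank-k matrix minimizing ||Y - XC||_F
_, _, Vt = np.linalg.svd(X @ C_ols, full_matrices=False)
P = Vt[:k].T @ Vt[:k]          # projector onto top-k response directions
C_rrr = C_ols @ P              # reduced rank coefficients, rank <= k
```

The reduced-rank fit needs only k · (m + n) effective parameters instead of the m · n of the unconstrained regression, which is the source of the statistical-power gain the quote describes.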
“…RRR is a computationally efficient method which increases statistical power in settings where the number of dimensions is large compared to the number of examples. In such m ≫ j settings RRR is nowadays a state-of-the-art method in fields with high dimensional data such as genetics and imaging [21,22]. This decomposition allows for interpretations of A and B: A is a mapping from the predictor matrix X to a latent representation of dimension k; B is a mapping from the latent scores to the responses Y.…”
Section: Reduced Rank Regression Model (RRR)
confidence: 99%
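The C = AB decomposition described above can be read off the same SVD-truncated least-squares fit. A small NumPy sketch on simulated data (a generic rank-k fit, not the specific model of the cited papers):

```python
import numpy as np

rng = np.random.default_rng(1)
j, m, n, k = 80, 10, 6, 2     # samples, predictors, responses, rank

X = rng.standard_normal((j, m))
Y = X @ rng.standard_normal((m, k)) @ rng.standard_normal((k, n))

# Rank-k fit: OLS followed by SVD truncation, then split into A and B
C_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
_, _, Vt = np.linalg.svd(X @ C_ols, full_matrices=False)
A = C_ols @ Vt[:k].T          # m x k: predictors -> latent representation
B = Vt[:k]                    # k x n: latent scores -> responses

latent_scores = X @ A         # j x k low-dimensional scores
Y_hat = latent_scores @ B     # predictions via the rank-k factorization
```

Here A @ B reproduces the rank-k coefficient matrix, and the columns of `latent_scores` are the k latent components that carry all predictive information from X to Y.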
“…RRR is a computationally efficient method which increases statistical power in settings where the number of dimensions is large compared to the number of examples. In such m ≫ j settings RRR is nowadays a state-of-the-art method in fields with high dimensional data such as genetics and imaging (19,20). The latent scores XA display the low-dimensional predictor variability that is predictive of the response variability.…”
Section: Reduced Rank Regression Model
confidence: 99%