2016
DOI: 10.3389/fgene.2016.00102
Random Projection for Fast and Efficient Multivariate Correlation Analysis of High-Dimensional Data: A New Approach

Abstract: In recent years, the advent of great technological advances has produced a wealth of very high-dimensional data, and combining high-dimensional information from multiple sources is becoming increasingly important in an expanding range of scientific disciplines. Partial Least Squares Correlation (PLSC) is a frequently used method for multivariate multimodal data integration. It is, however, computationally expensive in applications involving large numbers of variables, as required, for example, in genetic neuro…
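The computational bottleneck the abstract alludes to is the SVD of the p × q cross-covariance matrix at the heart of PLSC. A minimal numpy sketch of the random-projection idea (the data shapes, variable names, and the Gaussian sketch here are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for two high-dimensional data blocks measured on the
# same n subjects (e.g. imaging variables and genetic variables).
n, p, q, k = 100, 5000, 4000, 200
X = rng.standard_normal((n, p))
Y = rng.standard_normal((n, q))

# Gaussian random projections, scaled so squared norms are preserved
# in expectation (Johnson-Lindenstrauss-style sketches).
Rx = rng.standard_normal((p, k)) / np.sqrt(k)
Ry = rng.standard_normal((q, k)) / np.sqrt(k)

# Project each block down to k dimensions, then run the usual PLSC
# step (SVD of the cross-covariance) on a k x k matrix instead of p x q.
Xk = (X - X.mean(axis=0)) @ Rx      # n x k
Yk = (Y - Y.mean(axis=0)) @ Ry      # n x k
C = Xk.T @ Yk / (n - 1)             # k x k cross-covariance
U, s, Vt = np.linalg.svd(C, full_matrices=False)

print(C.shape)
```

The point of the sketch is the cost shift: the SVD operates on a k × k matrix rather than a p × q one, at the price of working with random linear combinations of the original variables.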

Cited by 7 publications (5 citation statements)
References 66 publications
“…The bounds in Table 1 only give qualitative guidance about the embedding probability. Users will benefit from more prescriptive results in order to choose the sketch size k, and the type of sketch for applications (Grellmann et al 2016;Geppert et al 2017;Ahfock et al 2020;Falcone et al 2021).…”
Section: Sketching Algorithms
confidence: 99%
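The concern about choosing the sketch size k can be made concrete by measuring distortion empirically for candidate sizes. A small numpy experiment (the data shape and the Gaussian sketch are illustrative assumptions): the worst-case relative error of pairwise distances shrinks as k grows.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 50, 2000
X = rng.standard_normal((n, d))

# Exact pairwise distances in the original d-dimensional space.
iu = np.triu_indices(n, 1)
true_d = np.sqrt((((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))[iu])

# Worst-case relative distortion for a few candidate sketch sizes k.
max_err = {}
for k in (10, 50, 200):
    R = rng.standard_normal((d, k)) / np.sqrt(k)   # Gaussian sketch
    Xk = X @ R
    proj_d = np.sqrt((((Xk[:, None, :] - Xk[None, :, :]) ** 2).sum(-1))[iu])
    max_err[k] = np.abs(proj_d / true_d - 1).max()
    print(k, round(max_err[k], 3))
```

Sweeping k on a pilot subset like this is one pragmatic substitute for the prescriptive bounds the excerpt calls for.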
“…As we can observe, the proposed DT-RP algorithm outperforms RP for all datasets. The average Recall… [4] Note that the recall metric used here is different from the classification recall (sensitivity, true positive rate) commonly used in categorization problems together with the precision score. For more details about the recall score used in approximate nearest neighbor search, see [40].…”
Section: Recall On Different Datasets
confidence: 99%
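The approximate-nearest-neighbor notion of recall the excerpt distinguishes is simply the fraction of the exact top-k neighbours that the approximate search recovers. A toy numpy illustration (the dataset sizes and the random-projection index are assumptions for demonstration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, k_proj, topk = 500, 256, 32, 10
X = rng.standard_normal((n, d))
q = rng.standard_normal(d)

# Exact top-k neighbours of the query in the original space.
exact = set(np.argsort(((X - q) ** 2).sum(axis=1))[:topk])

# Approximate top-k after a random projection to k_proj dimensions.
R = rng.standard_normal((d, k_proj)) / np.sqrt(k_proj)
approx = set(np.argsort(((X @ R - q @ R) ** 2).sum(axis=1))[:topk])

# ANN recall: fraction of the true top-k the approximate search found.
recall = len(exact & approx) / topk
print(recall)
```

Unlike classification recall, there is no precision counterpart here: both lists have exactly topk entries, so recall alone summarizes the retrieval quality.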
“…Thanks to this property, Random Projection has become a widespread tool for dimensionality reduction, especially in large-scale applications where the volume of data or the dimensionality of samples is too big for alternative methods. For instance, Random Projection has been successfully used to accelerate tasks such as multivariate correlation analysis [4], high-dimensional data clustering [5,6], image search [7] or texture classification [8], among many others.…”
Section: Introduction
confidence: 99%
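The distance-preservation property this paragraph relies on reduces to the sketch preserving squared norms (and hence inner products and distances) in expectation. A minimal numpy check (the dimensions here are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
d, k = 5000, 500
x = rng.standard_normal(d)

# Gaussian sketch scaled so that E[||R.T x||^2] = ||x||^2.
R = rng.standard_normal((d, k)) / np.sqrt(k)
xk = R.T @ x

# Ratio of sketched to true squared norm; concentrates around 1
# as k grows (fluctuations shrink like sqrt(2 / k)).
ratio = (xk @ xk) / (x @ x)
print(round(ratio, 3))
```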
“…Broadly speaking, random projections offer a universal and flexible approach to complex statistical problems. They are a particularly useful tool in large‐scale settings, such as high‐dimensional classification (Cannings & Samworth, 2017; Durrant & Kabán, 2013, 2015), clustering (Dasgupta, 1999; Fern & Brodley, 2003; Heckel, Tschannen, & Bölcskei, 2017), precision matrix estimation (Marzetta, Tucci, & Simon, 2011), regression (Ahfock, Astle, & Richardson, 2017; Dobriban & Liu, 2019; Heinze, McWilliams, & Meinshausen, 2016; Klanke, Vijayakumar, & Schaal, 2008; McWilliams, Heinze, Meinshausen, Krummenacher, & Vanchinathan, 2014; Mukhopadhyay & Dunson, 2019; Slawski, 2018; Thanei, Heinze, & Meinshausen, 2017; Thanei, Meinshausen, & Shah, 2018), sparse principal component analysis (Gataric, Wang, & Samworth, 2019), hypothesis testing (Lopes, Jacob, & Wainwright, 2011; Shi, Lu, & Song, 2019), correlation estimation (Grellmann et al, 2016), dimension reduction (Bingham & Mannila, 2001; Omidiran & Wainwright, 2010; Reeve, Mu, & Brown, 2018), and matrix decomposition (Halko, Martinsson, & Tropp, 2011).…”
Section: Introduction
confidence: 99%