Connectome smoothing via low-rank approximations
2019 · DOI: 10.1109/tmi.2018.2885968

Abstract: In brain imaging and connectomics, the study of brain networks, estimating the mean of a population of graphs based on a sample is a core problem. Often, this problem is especially difficult because the sample or cohort size is relatively small, sometimes even a single subject, while the number of nodes can be very large with noisy estimates of connectivity. While the element-wise sample mean of the adjacency matrices is a common approach, this method does not exploit underlying structural properties of the gr…
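To make the contrast concrete, here is a minimal sketch of the two estimators the abstract compares, assuming a stack of symmetric adjacency matrices. The rank-d estimate below is a generic truncated eigendecomposition of the sample mean, in the spirit of the low-rank approach described; all names are illustrative, not the paper's exact implementation.

```python
import numpy as np

def elementwise_mean(adjs):
    """Entry-wise sample mean of a stack of adjacency matrices (m x n x n)."""
    return np.mean(adjs, axis=0)

def low_rank_mean(adjs, d):
    """Rank-d smoothing of the sample mean via truncated eigendecomposition."""
    abar = np.mean(adjs, axis=0)
    evals, evecs = np.linalg.eigh(abar)           # symmetric input
    top = np.argsort(np.abs(evals))[::-1][:d]     # d largest-magnitude eigenvalues
    return evecs[:, top] @ np.diag(evals[top]) @ evecs[:, top].T
```

With a single noisy scan (m = 1), the same truncation acts as a denoiser, which is the small-cohort setting the abstract emphasizes.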

Cited by 32 publications (35 citation statements) · References 43 publications (84 reference statements)

Citation statements, ordered by relevance:
“…It has been suggested that the information encoded in patterns of brain connectivity can uniquely identify different subjects [49], [50], and there is some evidence of low-rank structure in those differences [51], [52]. We study how low-dimensional representations can capture inter-individual variability by using the HNU1 data to classify subject scans.…”
Section: Real Data Experiments 1: Subject Classification on HNU1 Data
confidence: 99%
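A minimal sketch of the kind of experiment this excerpt describes, assuming a stack of scan graphs with one subject label per scan: smooth each graph with a rank-d truncation and classify scans by nearest neighbor on the smoothed matrices. Comparing rank-d estimates rather than raw spectral embeddings sidesteps the rotation ambiguity of eigenvectors; the 1-NN evaluation and all names are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

def low_rank_smooth(adj, d):
    """Rank-d smoothed connectivity estimate of a symmetric adjacency matrix."""
    evals, evecs = np.linalg.eigh(adj)
    top = np.argsort(np.abs(evals))[::-1][:d]
    return evecs[:, top] @ np.diag(evals[top]) @ evecs[:, top].T

def one_nn_subject_labels(adjs, labels, d=10):
    """Leave-one-out 1-NN prediction of subject identity from scan graphs.

    adjs: (m, n, n) stack of scan graphs; labels: one subject id per scan.
    """
    labels = np.asarray(labels)
    feats = np.stack([low_rank_smooth(a, d).ravel() for a in adjs])
    dists = np.linalg.norm(feats[:, None] - feats[None, :], axis=-1)
    np.fill_diagonal(dists, np.inf)   # a scan cannot vote for itself
    return labels[np.argmin(dists, axis=1)]
```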
“…First, we consider clustering of each graph separately by doing ASE on the English graph $A_{en}$ (ASE+EN), or equivalently, JE on $A_{en}$, and ASE on the French graph $A_{fr}$ (ASE+FR), and compare with the individual latent positions obtained by JE, $\hat{H} D_{en}^{1/2}$ (JE+EN) and $\hat{H} D_{fr}^{1/2}$ (JE+FR). We also consider methods to estimate joint latent positions, by doing ASE on the mean of both graphs $\bar{A} = (A_{en} + A_{fr})/2$ (ASE+(EN+FR)) [51], and the matrix $\hat{H}$ obtained by JE on both graphs (JE+(EN,FR)). The dimension d is set to 3 for all approaches, and the latent positions are scaled to have norm 1 for degree correction.…”
Section: Real Data Experiments 3: Joint Embedding to Cluster Vertices
confidence: 99%
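A minimal sketch of the ASE-on-the-mean baseline from this excerpt: average the two adjacency matrices, embed into d = 3 dimensions via adjacency spectral embedding, and scale each latent position to norm 1 for degree correction. Matrix names mirror the quote; the downstream clustering step (e.g. k-means on the rows) is an assumed follow-on.

```python
import numpy as np

def ase(adj, d):
    """Adjacency spectral embedding: top-d eigenvectors scaled by sqrt(|eigenvalue|)."""
    evals, evecs = np.linalg.eigh(adj)
    top = np.argsort(np.abs(evals))[::-1][:d]
    return evecs[:, top] * np.sqrt(np.abs(evals[top]))

def joint_positions(a_en, a_fr, d=3):
    """Latent positions from ASE on the mean graph, rows scaled to norm 1."""
    abar = (a_en + a_fr) / 2
    x = ase(abar, d)                               # one row per vertex
    norms = np.linalg.norm(x, axis=1, keepdims=True)
    return x / np.clip(norms, 1e-12, None)         # degree correction
```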
“…Random Dot Product Graphs (RDPGs) are a class of Latent Position Models [14] developed to analyse social networks [31,47], and then extended to many other applications and types of networks [3,9,25,37,43]. To describe interactions in a network, such models assume that the probability of observing an interaction between two nodes is a function of the nodes' features [31,47].…”
Section: The Random Dot Product Graph Model
confidence: 99%
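A minimal sketch of the RDPG generative step described above: each node i has a latent vector x_i, and an edge between i and j appears independently with probability equal to the dot product of their vectors. Drawing latent positions uniformly from a region where dot products stay in [0, 1] is an illustrative choice, not part of the model's definition.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 2

# Latent positions with entries in [0, 1/sqrt(d)], so every x_i . x_j lies in [0, 1].
x = rng.uniform(0, 1 / np.sqrt(d), size=(n, d))

p = x @ x.T                                     # edge probabilities P_ij = <x_i, x_j>
upper = np.triu(rng.random((n, n)) < p, k=1)    # independent Bernoulli draws, upper triangle
a = (upper | upper.T).astype(int)               # symmetric adjacency, no self-loops
```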
“…Block-model-based approaches, such as the probabilistic generative family of Stochastic Block Models (and variants), aggregate nodes into groups based on the similarity of their interactions [15,45]. Graph embedding methods, on the other hand, rely on projecting nodes onto an abstract latent feature space, so that the interaction probabilities depend on these latent features [2,5,43].…”
Section: Introduction
confidence: 99%
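For contrast with the latent-feature view, here is a minimal Stochastic Block Model sketch: nodes are assigned to groups, and edge probabilities depend only on the pair of group labels. The two-block probability matrix is an arbitrary illustrative value.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
z = rng.integers(0, 2, size=n)               # group label per node (two blocks)
b = np.array([[0.5, 0.1],
              [0.1, 0.4]])                   # block connection probabilities (illustrative)
p = b[z][:, z]                               # P_ij = B[z_i, z_j]
upper = np.triu(rng.random((n, n)) < p, k=1)
a = (upper | upper.T).astype(int)            # undirected SBM draw, no self-loops
```

Note that an SBM whose block matrix B is positive semidefinite is itself an RDPG: factor B = V V^T and take row V[z_i] as node i's latent position.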
“…Diagonal augmentation (diag-aug) is a method for imputing the diagonals of adjacency matrices from graphs with no self-loops [66,87,101]. The diagonals are imputed with the average of the non-diagonal entries of each row, which corresponds to the degree of each vertex divided by n − 1.…”
Section: Diagonal Augmentation
confidence: 99%
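A minimal sketch of the diagonal augmentation rule quoted above: replace each diagonal entry with the mean of that row's off-diagonal entries, which for a binary graph equals the vertex degree divided by n − 1. The function name is illustrative.

```python
import numpy as np

def diag_augment(adj):
    """Impute self-loop entries with each row's off-diagonal mean (degree / (n - 1))."""
    a = adj.astype(float).copy()
    n = a.shape[0]
    off_diag_sums = a.sum(axis=1) - np.diag(a)
    np.fill_diagonal(a, off_diag_sums / (n - 1))
    return a
```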