2019
DOI: 10.1109/tsp.2019.2899818

Symmetric Bilinear Regression for Signal Subgraph Estimation

Abstract: There is increasing interest in learning a set of small outcome-relevant subgraphs in network-predictor regression. The extracted signal subgraphs can greatly improve the interpretation of the association between the network predictor and the response. In brain connectomics, the brain network for an individual corresponds to a set of interconnections among brain regions and there is a strong interest in linking the brain connectome to human cognitive traits. Modern neuroimaging technology allows a very fine se…

Cited by 26 publications (19 citation statements)
References 32 publications
“…From the result, we can see that a simpler model in general favors predictive accuracy. This discovery is also confirmed by Wang et al. (2019), where only a few brain connections are selected to predict traits. In terms of prediction power, TN-PCA is better than the simple PCA and much better than the CP tensor regression.…”
Section: Results (mentioning)
confidence: 66%
“…A few recent variants using Bayesian principles can be found in Guhaniyogi et al. [2017] and Guha and Rodriguez [2018]. Another method to compare is the supervised bilinear regression (BLR) [Wang et al., 2019] model, which emphasizes signal sub-network selection. In our implementation of BLR, we set the number of components to 10 and selected the L1 penalty parameter based on cross-validation.…”
Section: Results (mentioning)
confidence: 99%
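Selecting the L1 penalty by cross-validation, as described in this citing work, typically amounts to a grid search over candidate penalties scored by held-out prediction error. Below is a minimal sketch of that selection step, assuming squared-error loss; `fit_fn` and `predict_fn` are hypothetical placeholders for whatever bilinear regression fitting and prediction routines are being tuned, not the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import KFold

def select_l1_penalty(A, y, lambdas, fit_fn, predict_fn, n_splits=5, seed=0):
    """Grid-search an L1 penalty by K-fold cross-validated squared error.

    A          : (n, V, V) array of network predictors (adjacency matrices)
    y          : (n,) response vector
    lambdas    : candidate L1 penalty values
    fit_fn     : hypothetical fitter, fit_fn(A_train, y_train, lam) -> model
    predict_fn : hypothetical predictor, predict_fn(model, A_test) -> y_hat
    """
    cv = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    cv_err = np.zeros(len(lambdas))
    for k, lam in enumerate(lambdas):
        for train_idx, test_idx in cv.split(y):
            model = fit_fn(A[train_idx], y[train_idx], lam)
            y_hat = predict_fn(model, A[test_idx])
            # Accumulate the average held-out squared error across folds.
            cv_err[k] += np.mean((y[test_idx] - y_hat) ** 2) / n_splits
    return lambdas[int(np.argmin(cv_err))]
```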
“…Notice that […] may not be negative semi-definite. However, since the diagonal entries of each adjacency matrix are zero, the loss function (4) is actually a convex function of each entry β_hu of β_h when fixing the others, making the coordinate descent algorithm very appealing (Wang et al., 2019). The technical details of deriving the analytic form update for each parameter are discussed in Appendix A.…”
Section: Methods (mentioning)
confidence: 99%
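The coordinate-wise convexity noted in this excerpt comes from the zero diagonal of each adjacency matrix: with all other entries fixed, the symmetric bilinear predictor is linear in β_hu, so each coordinate update is a one-dimensional lasso problem with a closed-form solution. The sketch below illustrates this, assuming a squared-error loss with an L1 penalty on a predictor of the form α + Σ_h β_h^T A_i β_h; the function names (`coordinate_descent_sblr`, `soft_threshold`), update order, and penalty form are illustrative, not the authors' exact algorithm.

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator for an L1-penalized coordinate update."""
    return np.sign(z) * max(abs(z) - lam, 0.0)

def coordinate_descent_sblr(A, y, H=2, lam=0.1, n_sweeps=50, seed=0):
    """Illustrative coordinate descent for y_i ~ alpha + sum_h beta_h^T A_i beta_h.

    A : (n, V, V) symmetric adjacency matrices with zero diagonals
    y : (n,) response vector
    Because A_i[u, u] = 0, the predictor is linear in beta[h, u] when the
    other entries are held fixed, so each update is a 1-D lasso problem.
    """
    rng = np.random.default_rng(seed)
    n, V, _ = A.shape
    beta = 0.01 * rng.standard_normal((H, V))
    alpha = y.mean()

    def predict(beta, alpha):
        # alpha + sum_h beta_h^T A_i beta_h for every subject i
        return alpha + np.einsum('hu,nuv,hv->n', beta, A, beta)

    for _ in range(n_sweeps):
        for h in range(H):
            for u in range(V):
                # Write the predictor as rest_i + c_i * beta[h, u];
                # the quadratic term vanishes because A_i[u, u] = 0.
                c = 2.0 * A[:, u, :] @ beta[h]
                rest = predict(beta, alpha) - c * beta[h, u]
                resid = y - rest
                denom = np.dot(c, c)
                beta[h, u] = (soft_threshold(np.dot(c, resid), lam) / denom
                              if denom > 1e-12 else 0.0)
        # Closed-form intercept update given the current beta.
        alpha += (y - predict(beta, alpha)).mean()
    return alpha, beta
```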
“…Another group of methods finds subsets of vertices, or a subgraph, that contain the most information about certain covariates [9,46,109,110,113]. Estimating signal subgraphs is useful since networks can be extremely large (i.e.…”
Section: Bag Of Network (mentioning)
confidence: 99%