2013
DOI: 10.1016/j.neunet.2012.11.004
Learning a common substructure of multiple graphical Gaussian models

Abstract: Properties of data are frequently seen to vary depending on the sampled situations, which usually change along a time evolution or owing to environmental effects. One way to analyze such data is to find invariances, or representative features kept constant over changes. The aim of this paper is to identify one such feature, namely interactions or dependencies among variables that are common across multiple datasets collected under different conditions. To that end, we propose a common substructure learning (C…

Cited by 24 publications (17 citation statements). References 41 publications.
“…To reflect these two points in a sophisticated manner, we introduce a technique proposed in graphical Gaussian modeling (GGM), which is a research field of data mining. The technique involves efficiently decomposing a set of precision matrices, which are inverse covariance matrices, into their common invariant elements and individually deviated elements [6,9]. Because these techniques are dedicated to precision matrices, which are typically sparse and positive semi-definite (PSD), they are not applicable to our absolute density matrices, which are often dense and not limited to being PSD.…”
Section: B. Proposed Method
confidence: 99%
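The decomposition the excerpt describes can be illustrated as follows. This is a minimal sketch, not the estimation procedure of the cited papers: here the common part is simply the elementwise average of the precision matrices and the individual parts are the residuals, whereas the actual methods estimate both components jointly under sparsity penalties.

```python
import numpy as np

def decompose_precisions(thetas):
    """Split a set of precision matrices into a shared common part
    and per-matrix deviations.

    Naive illustration only: common part = elementwise average,
    deviation = residual. The cited methods instead estimate both
    parts jointly with sparsity-inducing penalties.
    """
    thetas = np.asarray(thetas)       # shape (K, p, p)
    common = thetas.mean(axis=0)      # estimate of the shared structure
    deviations = thetas - common      # individual deviations
    return common, deviations

# Two toy 3x3 precision matrices sharing one off-diagonal interaction
theta1 = np.array([[2.0, 0.5, 0.0],
                   [0.5, 2.0, 0.0],
                   [0.0, 0.0, 1.0]])
theta2 = np.array([[2.0, 0.5, 0.3],
                   [0.5, 2.0, 0.0],
                   [0.3, 0.0, 1.0]])
common, devs = decompose_precisions([theta1, theta2])
```

By construction each original matrix is recovered as the common part plus its own deviation, which is the additive decomposition the excerpt refers to.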
“…Here, we give an overview of the algorithm, which works up to several qubits, or for matrices of size around hundreds. Similar to our previous study [6], we do not work on the problem (2) directly but work on the dual problem instead:…”
Section: B. Proposed Method
confidence: 99%
“…There are some prior works [6,13,14,20,24] on learning multiple precision matrices simultaneously from multiple different but related sets of observations. All these methods assume that the jointly learned precision matrices (graphs) should share a similar structure.…”
Section: Related Work
confidence: 99%
“…For example, [13] proposed a method to learn common substructures among multiple graphs. [6] used ADMM to estimate multiple precision matrices with a pairwise fused lasso penalty and a group lasso penalty.…”
Section: Related Work
confidence: 99%
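Inside ADMM, a group lasso penalty like the one mentioned for [6] is handled through its proximal operator, which shrinks each edge's coefficients across all K graphs toward zero as a single group. The sketch below shows such a block soft-threshold; the function name and setup are illustrative, not the cited implementation.

```python
import numpy as np

def prox_group_lasso(V, lam):
    """Proximal operator of the group lasso penalty for K precision
    matrices stacked as V with shape (K, p, p).

    Each (i, j) entry is grouped across the K matrices, and the whole
    group is shrunk toward zero by a block soft-threshold: groups with
    norm below lam are zeroed jointly, so an edge is removed from all
    graphs at once. This is the per-iteration update ADMM applies to
    the penalized variable.
    """
    norms = np.linalg.norm(V, axis=0)                        # (p, p) group norms
    scale = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return V * scale                                         # broadcast over K

# Two 2x2 matrices: the off-diagonal group has norm sqrt(0.3^2 + 0.4^2) = 0.5
V = np.stack([np.array([[1.0, 0.3], [0.3, 1.0]]),
              np.array([[1.0, 0.4], [0.4, 1.0]])])
shrunk = prox_group_lasso(V, lam=0.6)
```

With lam = 0.6 the off-diagonal group (norm 0.5) is shrunk to zero in both matrices simultaneously, while the larger diagonal groups are only scaled down, which is exactly the joint edge-selection behavior the group penalty is used for.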