2015
DOI: 10.1109/jstsp.2015.2400415

Structured Data Fusion

Abstract: We present structured data fusion (SDF) as a framework for the rapid prototyping of knowledge discovery in one or more possibly incomplete data sets. In SDF, each data set, stored as a dense, sparse, or incomplete tensor, is factorized with a matrix or tensor decomposition. Factorizations can be coupled, or fused, with each other by indicating which factors should be shared between data sets. At the same time, factors may be imposed to have any type of structure that can be constructed as an explicit function of…
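To make the factor-sharing idea concrete, here is a minimal NumPy sketch of fusing a third-order tensor and a matrix through a common factor with alternating least squares. This is not the authors' implementation (SDF supports far more general couplings and structured factors); the rank, array sizes, and variable names below are illustrative assumptions.

```python
import numpy as np

def khatri_rao(X, Y):
    """Column-wise Kronecker product of X (I x R) and Y (J x R) -> (I*J x R)."""
    return np.einsum('ir,jr->ijr', X, Y).reshape(X.shape[0] * Y.shape[0], -1)

def unfold(T, mode):
    """Mode-n unfolding of a 3-way array (row-major ordering of the other modes)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def coupled_als(T, M, R, n_iter=200, seed=0):
    """Jointly factorize tensor T (I x J x K) as a rank-R CPD [[A, B, C]] and
    matrix M (I x L) as A @ D.T, sharing the factor A (the 'fusion')."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    L = M.shape[1]
    A, B, C, D = (rng.standard_normal((n, R)) for n in (I, J, K, L))
    for _ in range(n_iter):
        # Shared factor A sees both data sets: normal equations of the joint LS problem.
        G = (B.T @ B) * (C.T @ C) + D.T @ D               # Gram matrix (Hadamard identity)
        A = np.linalg.solve(G, (unfold(T, 0) @ khatri_rao(B, C) + M @ D).T).T
        B = np.linalg.solve((A.T @ A) * (C.T @ C), (unfold(T, 1) @ khatri_rao(A, C)).T).T
        C = np.linalg.solve((A.T @ A) * (B.T @ B), (unfold(T, 2) @ khatri_rao(A, B)).T).T
        D = np.linalg.solve(A.T @ A, (M.T @ A).T).T
    return A, B, C, D

# Toy usage: build coupled data from a common factor and check the fit.
rng = np.random.default_rng(1)
R = 3
A0, B0, C0, D0 = (rng.standard_normal((n, R)) for n in (10, 8, 6, 5))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
M = A0 @ D0.T
A, B, C, D = coupled_als(T, M, R)
print(np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - T) / np.linalg.norm(T))
print(np.linalg.norm(A @ D.T - M) / np.linalg.norm(M))
```

The update of A is where the fusion happens: both data sets enter its normal equations, which is the simplest instance of the coupling that SDF expresses by declaring shared factors.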

Cited by 172 publications (193 citation statements)
References 74 publications (114 reference statements)
“…It is generic in the sense that it describes neural-hemodynamic coupling solely by means of a convolution operation with an initially unknown HRF, and thus makes minimal assumptions on the neural sources of interest, adhering to the idea of blind source separation. Any prior model for the nature of the HRF can be plugged in, as long as it can be expressed as an explicit and differentiable function of some parameters [18]. Complete flexibility can be attained by not imposing any model for h(t), which is conceptually the same as the 'FIR basis set' method [8], while the GLM can be mimicked by fixing h(t) to the canonical HRF.…”
Section: Discussion (mentioning)
confidence: 99%
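The requirement quoted above, that the HRF be an explicit and differentiable function of some parameters, can be illustrated with the common two-gamma HRF model. The parameter names and default values below are the familiar canonical choices and are assumptions for illustration, not necessarily those of the cited paper.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

def hrf(t, a1=6.0, a2=16.0, b1=1.0, b2=1.0, c=1.0 / 6.0):
    """Two-gamma haemodynamic response function sampled at times t (seconds).

    h(t; theta) = g(t; a1, b1) - c * g(t; a2, b2), with g a gamma density.
    Every operation is smooth in (a1, a2, b1, b2, c), so the sampled factor
    is an explicit, differentiable function of its parameters.
    """
    t = np.asarray(t, dtype=float)
    g1 = t ** (a1 - 1) * np.exp(-t / b1) / (b1 ** a1 * gamma_fn(a1))
    g2 = t ** (a2 - 1) * np.exp(-t / b2) / (b2 ** a2 * gamma_fn(a2))
    return g1 - c * g2

# The structured factor is h sampled on the scan grid; gradients with respect
# to the parameters can be derived analytically or with an autodiff tool.
t = np.arange(0.0, 32.0, 0.5)
canonical = hrf(t)
perturbed = hrf(t, a1=7.5, b1=1.2)  # a different, still differentiable shape
```

Fixing the parameters to their canonical values recovers the GLM-like behaviour mentioned in the quote, while leaving them free keeps the blind, convolution-based character of the model.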
“…To obtain an interpretable result, we impose that the signatures a_r and c_r are also non-negative, since they describe the time-varying power and spectra of the sources, respectively. Hence, we reformulate the data fusion problem as a structured matrix-tensor factorization [18], in which the factor matrix A = [a_1 a_2 …”
Section: B. Structured Matrix-Tensor Factorization (mentioning)
confidence: 99%
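The non-negativity imposed on the signatures a_r and c_r above fits the same pattern of factors built as explicit functions of parameters. One common device, shown here as an illustrative choice rather than the cited paper's exact construction, is to optimize an unconstrained variable and square it element-wise:

```python
import numpy as np

def nonneg_factor(Z):
    """Map an unconstrained matrix Z to a non-negative factor A = Z * Z.

    The element-wise square is smooth, so a gradient-based solver can update Z
    directly; by the chain rule, gradients with respect to A are scaled by 2*Z.
    """
    return Z * Z

# Illustrative use: Z is what the optimizer sees, A is what enters the model.
rng = np.random.default_rng(0)
Z = rng.standard_normal((20, 3))
A = nonneg_factor(Z)
assert (A >= 0).all()
```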
“…Its convergence is also easier to guarantee than that of the BP network, so the optimal solution can be obtained. Based on an analysis of the structure and learning algorithm of RBF neural networks, a heterogeneous RBF neural network information fusion algorithm for wireless sensor networks is presented [13]. When it is used for heterogeneous information fusion at the cluster head or sink node in wireless sensor networks, the wireless transmission only needs to transfer the fusion results.…”
Section: Literature Review (mentioning)
confidence: 99%
“…In all these cases, the coupling occurs through equality constraints on latent factors. Following the work in [17], the framework of coupled tensor decompositions was revisited in [5,14], with variations on this framework such as tensor-matrix factorizations [3,1] and more general latent models [18]. Uniqueness properties and algorithms for the exact coupled tensor decomposition problem are studied in [19] and [20], while algorithms for the coupled tensor approximation problem under general cost functions are developed in [22,10].…”
Section: Introduction (mentioning)
confidence: 99%