Appl. Math. 2020
DOI: 10.21136/am.2020.0023-20
Block matrix approximation via entropy loss function

Abstract: The aim of the paper is to present a procedure for the approximation of a symmetric positive definite matrix by symmetric block partitioned matrices with structured off-diagonal blocks. The entropy loss function is chosen as approximation criterion. This procedure is applied in a simulation study of the statistical problem of covariance structure identification.
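The approximation criterion can be illustrated with a short sketch. The entropy loss between two symmetric positive definite p x p matrices is L(Sigma, S) = tr(Sigma^{-1} S) - ln det(Sigma^{-1} S) - p, which is nonnegative and vanishes exactly when S = Sigma. The compound-symmetry candidate family and the grid search below are our own illustrative assumptions, not the paper's procedure (the paper approximates by block partitioned matrices with structured off-diagonal blocks):

```python
import numpy as np

def entropy_loss(sigma, s):
    """Entropy loss L(Sigma, S) = tr(Sigma^{-1} S) - ln det(Sigma^{-1} S) - p.

    Nonnegative for symmetric positive definite inputs, and zero iff S == Sigma.
    """
    p = sigma.shape[0]
    m = np.linalg.solve(sigma, s)            # Sigma^{-1} S without an explicit inverse
    _, logdet = np.linalg.slogdet(m)
    return np.trace(m) - logdet - p

def nearest_cs(s, grid=np.linspace(-0.2, 0.9, 111)):
    """Illustrative use: pick the compound-symmetry (CS) candidate
    Sigma(rho) = (1 - rho) I + rho J minimizing the entropy loss to S,
    by a crude grid search over rho (the grid is our assumption)."""
    p = s.shape[0]
    return min(
        grid,
        key=lambda r: entropy_loss((1 - r) * np.eye(p) + r * np.ones((p, p)), s),
    )
```

For example, a 3 x 3 matrix with unit diagonal and all off-diagonal entries 0.5 is itself compound symmetric, so the grid search recovers rho close to 0.5 with loss near zero.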

Cited by 4 publications (5 citation statements)
References 20 publications
“…The identification procedure can be also used for a non-linear structure -such as the autoregressive structure of order one; cf. Filipiak et al (2021), Janiszewska et al (2020).…”
Section: Discussion
confidence: 99%
“…The entropy loss function was used in the paper Janiszewska et al (2020) to identify covariance structure, where consideration was given to a block covariance structure with off-diagonal blocks corresponding to the off-diagonal blocks of CS and AR(1) structures. The entropy loss function is also known as Kullback-Leibler divergence between two probability distributions; cf.…”
Section: Structure Identification Methods
confidence: 99%
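The identity noted in the statement above, that the entropy loss coincides with a Kullback-Leibler divergence, can be checked numerically. For zero-mean Gaussians, KL(N(0, S) || N(0, Sigma)) = (1/2)[tr(Sigma^{-1} S) - p - ln det(Sigma^{-1} S)], so the entropy loss is exactly twice this KL divergence. The sketch below is our own numerical check under that zero-mean Gaussian assumption:

```python
import numpy as np

def entropy_loss(sigma, s):
    # tr(Sigma^{-1} S) - ln det(Sigma^{-1} S) - p
    p = sigma.shape[0]
    m = np.linalg.solve(sigma, s)
    return np.trace(m) - np.linalg.slogdet(m)[1] - p

def kl_gauss0(s, sigma):
    # KL( N(0, S) || N(0, Sigma) ) between zero-mean Gaussians
    p = s.shape[0]
    m = np.linalg.solve(sigma, s)
    return 0.5 * (np.trace(m) - p - np.linalg.slogdet(m)[1])

# For any pair of SPD matrices: entropy_loss(Sigma, S) == 2 * kl_gauss0(S, Sigma).
```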
“…In the literature, some subblocks of the structures given in the previous sections have been considered. In the paper Janiszewska et al (2020), the authors study the following structure:…”
Section: Block Multivariate Modelling Approach
confidence: 99%