2021
DOI: 10.3389/frai.2021.674166
A Matrix-Variate t Model for Networks

Abstract: Networks represent a useful tool to describe relationships among financial firms and network analysis has been extensively used in recent years to study financial connectedness. An aspect, which is often neglected, is that network observations come with errors from different sources, such as estimation and measurement errors, thus a proper statistical treatment of the data is needed before network analysis can be performed. We show that node centrality measures can be heavily affected by random errors and prop…
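The abstract's central claim — that node centrality measures can be heavily affected by random errors in the observed edges — can be illustrated with a minimal sketch. The code below is a hypothetical simulation, not the paper's matrix-variate t model: it builds a small synthetic weighted network, computes eigenvector centrality from the true adjacency matrix, then recomputes it after adding symmetric measurement noise to the edge weights, so the two centrality rankings can be compared.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 6
# True weighted adjacency matrix: symmetric, nonnegative, zero diagonal.
A = np.abs(rng.normal(size=(n, n)))
A = (A + A.T) / 2
np.fill_diagonal(A, 0.0)

def eigenvector_centrality(W):
    """Leading eigenvector of a symmetric adjacency matrix, normalized to sum to 1."""
    vals, vecs = np.linalg.eigh(W)
    v = np.abs(vecs[:, np.argmax(vals)])
    return v / v.sum()

true_c = eigenvector_centrality(A)

# Observe the network with additive (symmetrized) measurement error,
# clip at zero to keep weights valid, and recompute centrality.
noise = rng.normal(scale=0.5, size=(n, n))
A_obs = np.clip(A + (noise + noise.T) / 2, 0.0, None)
np.fill_diagonal(A_obs, 0.0)
obs_c = eigenvector_centrality(A_obs)

print("centrality ranking (true network): ", np.argsort(-true_c))
print("centrality ranking (noisy network):", np.argsort(-obs_c))
```

Under repeated draws of the noise, the ranking computed from the observed network can diverge from the true one, which is the motivation the paper gives for treating edge errors statistically before running network analysis.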

Cited by 5 publications (1 citation statement)
References 33 publications
“…Also, in our non-parametric setting, we robustify the Granger causality-in-distribution test ex-post by integrating over the kernels and truncation parameters. An alternative approach on how to deal with biases in network metrics due to edge estimation and measurement errors could be the de-noising procedure proposed by Billio et al (2021b). Finally, it is interesting to apply our framework when comparing estimated versus physical data, following the taxonomy of "informational"…”
Section: Discussion (mentioning)
confidence: 99%