2015
DOI: 10.1007/978-3-319-24598-0_3

Regularized Multivariate von Mises Distribution

Abstract: Regularization is necessary to avoid overfitting when the number of data samples is low compared to the number of parameters of the model. In this paper, we introduce a flexible L1 regularization for the multivariate von Mises distribution. We also propose a circular distance that can be used to estimate the Kullback-Leibler divergence between two circular distributions by means of sampling, and that also serves as a goodness-of-fit measure. We compare the models on synthetic data and real morphological da…
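For context, the model referred to by the citing works below is the sine multivariate von Mises distribution of Mardia et al. (2008). A minimal sketch of its density, together with a generic L1-penalized pseudo-likelihood objective, is given here; the abstract does not show the exact form of the paper's flexible penalty, so the weights w_ij below are an illustrative assumption.

```latex
% Sine multivariate von Mises density (Mardia et al., 2008) on the d-dimensional torus
f(\boldsymbol{\theta}) \propto \exp\Big\{ \sum_{i=1}^{d} \kappa_i \cos(\theta_i - \mu_i)
  + \tfrac{1}{2} \sum_{i \neq j} \lambda_{ij} \sin(\theta_i - \mu_i)\sin(\theta_j - \mu_j) \Big\}

% Generic L1-penalized pseudo-likelihood objective (the weights w_{ij} are an
% illustrative assumption, not necessarily the paper's flexible penalty)
\hat{\boldsymbol{\Theta}} = \arg\max_{\boldsymbol{\Theta}}
  \sum_{n=1}^{N} \sum_{i=1}^{d} \log f\big(\theta_i^{(n)} \mid \boldsymbol{\theta}_{-i}^{(n)}; \boldsymbol{\Theta}\big)
  \;-\; \rho \sum_{i<j} w_{ij}\, \lvert \lambda_{ij} \rvert
```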

Cited by 3 publications (4 citation statements)
References 8 publications
“…The results in this paper extend those in Rodriguez-Lujan et al. The added contributions of the present paper are a redefinition of a penalization term for learning the parameters of the multivariate conditional distributions, a brief proof of consistency of the penalized estimator, and the study of its bias and variance properties through numerical experiments.…”
Section: Introduction (supporting)
confidence: 74%
“…In the linear case, for the multivariate normal distribution, this structure-aware penalization paradigm has been applied to learn graphical models with hubs [18] or to penalize according to some defined distance between the variables [19]. The results in this paper extend those in Rodriguez-Lujan et al. [20]. The added contributions of the present paper are a redefinition of a penalization term for learning the parameters of the multivariate conditional distributions, a brief proof of consistency of the penalized estimator, and the study of its bias and variance properties through numerical experiments. The application to real-world data in neuroanatomy is extended to include new data sets from different species rather than focus exclusively on human neurons.…”
Section: Introduction (supporting)
confidence: 72%
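The structure-aware penalization mentioned in this statement can be illustrated with a distance-weighted L1 penalty on the pairwise couplings. The Python sketch below is hypothetical: the function name, the distance matrix, and the weighting scheme are assumptions for illustration, not the citing paper's redefined penalization term.

```python
# Hypothetical sketch of a structure-aware L1 penalty: each pairwise coupling
# lambda_ij is shrunk in proportion to a user-supplied distance d_ij between
# variables i and j (an assumed weighting, not the citing paper's definition).
import numpy as np

def weighted_l1_penalty(lam: np.ndarray, dist: np.ndarray, rho: float) -> float:
    """lam: (d, d) symmetric coupling matrix; dist: (d, d) nonnegative distances; rho: strength."""
    iu = np.triu_indices_from(lam, k=1)          # count each pair once
    return rho * float(np.sum(dist[iu] * np.abs(lam[iu])))

# Toy usage: couplings between distant variables are penalized more heavily.
lam = np.array([[0.0, 2.0, -1.0],
                [2.0, 0.0, 0.5],
                [-1.0, 0.5, 0.0]])
dist = np.array([[0.0, 1.0, 3.0],
                 [1.0, 0.0, 2.0],
                 [3.0, 2.0, 0.0]])
print(weighted_l1_penalty(lam, dist, rho=0.1))   # 0.1 * (1*2 + 3*1 + 2*0.5) = 0.6
```

Such a term would be subtracted from the pseudo-log-likelihood during penalized estimation.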
“…No simple closed analytic form is generally available for the normalising constant of the sine multivariate von Mises model of Mardia et al. (2008), but its conditional distributions are vM and thus its parameters can be estimated by maximising the pseudo-likelihood. Conditions on its parameters to ensure unimodality were established in Mardia and Voss (2014) and pseudo-likelihood regularised approaches were given in Rodriguez-Lujan et al. (2015). A multivariate extension of the second-order generalised vM (GvM₂) distribution (with m = 2 in (7)), obtained by conditioning a multivariate Gaussian distribution on R^{2d} to T^{d}, was proposed by Navarro et al. (2017).…”
Section: Models For Toroidal Data (mentioning)
confidence: 99%
“…No simple closed analytic form is generally available for the normalizing constant of the sine multivariate von Mises (MvM) model of Mardia et al. (2008), but its conditional distributions are vM and its parameters can be estimated by maximizing the pseudo-likelihood. Conditions on its parameters to ensure unimodality were established in Mardia and Voss (2014) and pseudo-likelihood regularized approaches were given in Rodriguez-Lujan et al. (2015). A multivariate extension of the second-order GvM (GvM₂) distribution (see Section 3.1), obtained by conditioning a multivariate Gaussian distribution on R^{2d} to T^{d}, was proposed by Navarro et al. (2017).…”
Section: Models For Toroidal Data (mentioning)
confidence: 99%
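Because the conditional distributions of the sine model are univariate von Mises, the pseudo-likelihood can be evaluated without the intractable normalizing constant mentioned above. The following Python sketch shows this under the standard sine-model parameterization; the function and variable names are my own and not taken from the cited works.

```python
# Illustrative sketch (not the cited authors' code): pseudo-log-likelihood of the
# sine multivariate von Mises model, using the fact that the full conditional of
# each angle given the rest is a univariate von Mises distribution.
import numpy as np
from scipy.stats import vonmises

def pseudo_log_likelihood(theta, mu, kappa, lam):
    """theta: (n, d) angles in radians; mu, kappa: (d,); lam: (d, d) symmetric with zero diagonal."""
    n, d = theta.shape
    s = np.sin(theta - mu)                       # sin(theta_j - mu_j) for every sample and variable
    total = 0.0
    for i in range(d):
        b = s @ lam[i] - s[:, i] * lam[i, i]     # sum_{j != i} lam_ij * sin(theta_j - mu_j)
        kappa_cond = np.hypot(kappa[i], b)       # concentration of the conditional von Mises
        mu_cond = mu[i] + np.arctan2(b, kappa[i])  # location of the conditional von Mises
        total += vonmises.logpdf(theta[:, i], kappa_cond, loc=mu_cond).sum()
    return total
```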