2017
DOI: 10.1007/978-3-319-67777-4_6
Location Dependent Dirichlet Processes

Abstract: Dirichlet processes (DP) are widely applied in Bayesian nonparametric modeling. However, in their basic form they do not directly integrate dependency information among data arising from space and time. In this paper, we propose location dependent Dirichlet processes (LDDP) which incorporate nonparametric Gaussian processes in the DP modeling framework to model such dependencies. We develop the LDDP in the context of mixture modeling, and develop a mean field variational inference algorithm for this mixture mo…
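The mixture model described in the abstract builds on the standard truncated stick-breaking construction of a DP. As a point of reference, the minimal NumPy sketch below shows how DP mixture weights arise under truncation; it is a vanilla DP, not the LDDP itself, which additionally couples mixture components to locations through a Gaussian process.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(alpha, K, rng):
    """Truncated stick-breaking construction of DP mixture weights.

    Draw K beta variables, then assign each component the fraction
    of the remaining stick length it breaks off.
    """
    betas = rng.beta(1.0, alpha, size=K)
    # Length of stick remaining before each break: 1, (1-b1), (1-b1)(1-b2), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

w = stick_breaking_weights(alpha=2.0, K=50, rng=rng)
# Under truncation the weights are nonnegative and sum to at most 1;
# the missing mass 1 - w.sum() is the unbroken remainder of the stick.
```

A smaller concentration `alpha` puts more mass on the first few components, which is why truncation at moderate `K` is usually adequate in variational treatments.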

Cited by 6 publications (10 citation statements) · References 21 publications
“…This relaxation smooths the ELBO and helps convergence. Finally, as noted by Sun, Paisley and Liu (2017) in a similar setting, the gamma normalization term presents computational difficulties, and we follow those authors in introducing an auxiliary variable ξ t to further lower bound this term (Sun, Paisley and Liu (2017)),…”
Section: 4
Mentioning confidence: 94%
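The auxiliary variable ξ_t mentioned in the snippet is typically introduced through the standard tangent (first-order) bound on the logarithm; the snippet does not show the exact bound of Sun, Paisley and Liu (2017), so the following is a generic sketch of the technique. For a positive normalization term Z inside the ELBO, concavity of the logarithm gives, for any ξ > 0:

```latex
\log Z \;\le\; \log \xi + \frac{Z - \xi}{\xi},
\qquad\text{hence}\qquad
-\,\mathbb{E}[\log Z] \;\ge\; 1 - \log \xi - \frac{\mathbb{E}[Z]}{\xi}.
```

The bound is tight at ξ = E[Z], so ξ_t admits a closed-form update at each iteration, which is what makes this relaxation attractive for terms, such as gamma normalizers, whose log-expectation is otherwise intractable.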
“…As indicated above, we use a prior P P (•; m, s) parametrized by location and scale (m, s). Previously proposed priors relied on specific elaborations of the first term of Equation (5), which is required to guarantee that the prior probabilities sum to 1 [52,33], and often require solving a non-linear system of equations to update the prior probabilities. In contrast, the Dirichlet and Logit-Normal distributions are defined on the simplex and are therefore appropriate as prior distributions.…”
Section: Segmentation Inference By Expectation-Maximization
Mentioning confidence: 99%
“…The Logit-Normal distribution [3] is directly parametrized by location and scale; however, the update equations are also non-linear. The Dirichlet distribution was previously used to enforce spatial dependence directly [20] or in combination with a Gaussian process [52]. We propose instead a specific parametrization that has the considerable advantage of leading to a linear regularizing equation, and that also satisfies a key requirement of well-calibrated probabilistic inference, i.e.…”
Section: Segmentation Inference By Expectation-Maximization
Mentioning confidence: 99%
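The snippets above contrast two priors defined on the probability simplex. The short NumPy sketch below illustrates that point: a Dirichlet draw lives on the simplex natively, while a logistic-normal (Logit-Normal) draw is obtained by pushing a Gaussian vector with location m and scale s through a softmax, which is what "directly parametrized by location and scale" refers to. The dimension 4 and the parameter values are arbitrary, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Dirichlet sample: natively on the simplex.
p_dir = rng.dirichlet(alpha=np.ones(4))

# Logistic-normal sample: softmax of a Gaussian vector,
# parametrized directly by location m and scale s.
m, s = np.zeros(4), 1.0
z = m + s * rng.standard_normal(4)
p_ln = np.exp(z - z.max())  # subtract max for numerical stability
p_ln /= p_ln.sum()
# Both p_dir and p_ln are nonnegative and sum to 1.
```

Both vectors are valid prior draws for mixture probabilities; the trade-off discussed in the snippets is that the Logit-Normal exposes (m, s) directly but leads to non-linear update equations.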