The exact asymptotic form of Bayesian generalization error in latent Dirichlet allocation
2021
DOI: 10.1016/j.neunet.2021.01.024

Cited by 11 publications (8 citation statements)
References 56 publications
“…The generalization loss G_n(π), the leave-one-out cross validation C_n(π), the widely applicable information criterion W_n(π), the hold-out cross validation H_n(π), and the adjusted cross validation A_n(π) can also be defined by the same Eqs. (11), (16), (17), (22), and (23), respectively.…”
Section: Evaluation of Prior Distributions (mentioning, confidence: 99%)
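For context, these quantities have standard definitions in Watanabe's theory of Bayesian statistics; the following is a sketch of those standard forms (the equation numbers above refer to the citing paper and are not reproduced here), writing $\mathbb{E}_w[\cdot]$ and $\mathbb{V}_w[\cdot]$ for the expectation and variance over the posterior of the parameter $w$:

$$
p^*(x) = \mathbb{E}_w[\,p(x \mid w)\,] \quad \text{(posterior predictive density)},
$$
$$
G_n(\pi) = -\,\mathbb{E}_X[\log p^*(X)], \qquad
C_n(\pi) = \frac{1}{n}\sum_{i=1}^{n} \log \mathbb{E}_w\!\left[\frac{1}{p(X_i \mid w)}\right],
$$
$$
W_n(\pi) = -\frac{1}{n}\sum_{i=1}^{n} \log p^*(X_i)
  \;+\; \frac{1}{n}\sum_{i=1}^{n} \mathbb{V}_w[\log p(X_i \mid w)].
$$

The hold-out and adjusted variants $H_n(\pi)$ and $A_n(\pi)$ follow the citing paper's own Eqs. (22) and (23) and are not restated here.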
“…The concrete values of the RLCTs of singular statistical models have been clarified for, for example, reduced rank regressions [4], neural networks [5,31], normal mixtures [44], Poisson mixtures [24], latent Dirichlet allocations [17], and multinomial mixtures [42]. It has also been shown that, based on the RLCT, the exchange probability of exchange MCMC methods can be optimally designed [20].…”
Section: Real Log Canonical Threshold (mentioning, confidence: 99%)
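For readers unfamiliar with the term: the real log canonical threshold (RLCT) $\lambda$ is standardly defined through the zeta function of the averaged log-density ratio $K(w)$ (the Kullback–Leibler divergence between the true distribution and the model at parameter $w$) and the prior $\varphi(w)$; a sketch of the standard definition:

$$
\zeta(z) = \int K(w)^{z}\,\varphi(w)\,dw, \qquad \operatorname{Re}(z) > 0.
$$

$\zeta(z)$ extends meromorphically to the whole complex plane and its poles are real and negative; the largest pole is $z = -\lambda$, which defines the RLCT, and the order $m$ of that pole is called its multiplicity.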
“…These properties are helpful for studying the RLCTs of statistical models and learning machines. In fact, the RLCTs of important statistical models and learning machines were found by developing resolution procedures for neural networks [8,44], normal mixtures [60], Poisson mixtures [35], multinomial mixtures [58], general and nonnegative matrix factorizations [6,21], Boltzmann machines [7,62], hidden and general Markov models [61,67], and latent Dirichlet allocations [22]. Note that singularities in statistical models and learning machines make the free energy and the generalization loss smaller when Bayesian inference is employed; almost all learning machines are singular [47], and that is good [59].…”
Section: Assume That λ (mentioning, confidence: 99%)
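The reason these thresholds are sought is that they give the leading terms of the Bayesian learning curves. The standard asymptotics of singular learning theory, stated here for context with $S_n$ the empirical entropy and $S$ the entropy of the true distribution, are:

$$
F_n = n S_n + \lambda \log n - (m - 1)\log\log n + O_p(1), \qquad
\mathbb{E}[G_n] = S + \frac{\lambda}{n} + o\!\left(\frac{1}{n}\right),
$$

so a smaller $\lambda$ means a smaller free energy and a smaller generalization loss, which is the sense in which singularities "make them smaller."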
“…Such nonidentifiable and singular models and machines are not special but ubiquitous in modern statistics and machine learning. For example, neural networks and deep learning [8,43,44] have hierarchical structures; normal mixtures [24,60], Poisson mixtures [35], multinomial mixtures [58], and latent Dirichlet allocations [22] have latent or hidden variables; and matrix factorizations [6,21], Boltzmann machines [7,62], and Markov models [61,67] have both hidden and hierarchical parts. In other words, almost all statistical models and learning machines that extract hidden structures or perform hierarchical inference are nonidentifiable and singular [47].…”
Section: Introduction (mentioning, confidence: 99%)
“…In general, the parameter set K(w) = 0 contains complicated singularities, so it is difficult to find the resolution map; however, both the RLCTs and the multiplicities have been clarified for several statistical models and learning machines. Examples of models whose RLCTs are known include normal mixtures [10], Poisson mixtures [11], Bernoulli mixtures [12], reduced rank regression [10], latent Dirichlet allocation (LDA) [13], and so on. In addition, the RLCTs have been used to analyze the exchange rate of the replica exchange method [14], which is one of the Markov chain Monte Carlo methods.…”
Section: Introduction (mentioning, confidence: 99%)
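As background on the replica exchange connection (a sketch of the standard swap rule, not taken from the excerpt above): with tempered posteriors $p_\beta(w) \propto \varphi(w)\exp(-n\beta L_n(w))$, where $L_n$ is the empirical minus log likelihood, a proposed swap of states between adjacent inverse temperatures $\beta_k < \beta_{k+1}$ is accepted with probability

$$
u = \min\Bigl\{1,\; \exp\bigl(n(\beta_{k+1} - \beta_k)\,(L_n(w_{k+1}) - L_n(w_k))\bigr)\Bigr\},
$$

and the analyses cited as [14] and [20] reportedly show that the average exchange rate is asymptotically governed by the RLCT $\lambda$, which is what makes an optimal (geometric) temperature schedule possible.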