2018
DOI: 10.1109/msp.2017.2766239
Introducing Information Measures via Inference [Lecture Notes]

Cited by 1 publication (1 citation statement) · References 9 publications
“…The latter is a measure of the distance between the two distributions, as we will further discuss in Sec. V-D (see [59], [60]). The analytical advantages of the ELBO L(q, θ) over the original log-likelihood are that: (i) it entails an expectation of the logarithm of the model p(x|z, θ), which, as mentioned, is typically a tractable function; and (ii) the average is over a fixed distribution q(z), which does not depend on the model parameter θ.…”
Section: Learning
Confidence: 99%
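The quoted statement rests on the standard identity log p(x|θ) = L(q, θ) + KL(q(z) || p(z|x, θ)): the ELBO lower-bounds the log-likelihood, with the KL divergence as the gap. A minimal numerical sketch of that identity, using a hypothetical two-state latent model with illustrative probabilities (all numbers are assumptions, not from the cited paper):

```python
import math

# Hypothetical two-state latent model: z in {0, 1}, one observation x.
# Joint factorizes as p(x, z | theta) = p(z) * p(x | z, theta).
p_z = [0.6, 0.4]          # prior p(z), illustrative values
p_x_given_z = [0.2, 0.9]  # likelihood p(x | z, theta) at the observed x

# Evidence (marginal likelihood): p(x | theta) = sum_z p(z) p(x | z, theta)
p_x = sum(pz * px for pz, px in zip(p_z, p_x_given_z))
log_px = math.log(p_x)

# Exact posterior p(z | x, theta) via Bayes' rule
posterior = [pz * px / p_x for pz, px in zip(p_z, p_x_given_z)]

# A fixed variational distribution q(z), independent of theta,
# as point (ii) of the quoted statement emphasizes.
q = [0.5, 0.5]

# ELBO: L(q, theta) = E_q[log p(x, z | theta)] - E_q[log q(z)];
# the first term contains log p(x | z, theta), the tractable piece (point (i)).
elbo = sum(qz * (math.log(pz * px) - math.log(qz))
           for qz, pz, px in zip(q, p_z, p_x_given_z))

# KL(q || posterior): the gap between the log-evidence and the ELBO
kl = sum(qz * math.log(qz / post) for qz, post in zip(q, posterior))

# Identity: log p(x | theta) = L(q, theta) + KL(q || p(z | x, theta))
assert abs(log_px - (elbo + kl)) < 1e-12
```

Because KL is nonnegative, the ELBO never exceeds the log-likelihood, and maximizing L(q, θ) over q tightens the bound toward log p(x|θ).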