1978
DOI: 10.1109/tit.1978.1055832

On the entropy of continuous probability distributions (Corresp.)

Cited by 155 publications (73 citation statements)
References 5 publications
“…Reference [44] lists the entropies of several continuous probability distributions; the entropy of a continuous system that obeys a normal distribution is:…”
Section: Algorithm Convergence Analyses (mentioning)
confidence: 99%
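The excerpt is truncated at the formula. For reference, and not reproduced from the citing paper, the standard closed form it appears to be invoking is the differential entropy of a normal distribution with variance σ²:

\[
h(X) = \tfrac{1}{2}\,\ln\!\left(2\pi e \sigma^{2}\right),
\]

which depends only on the variance, not on the mean.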
“…Decreasing the resistance level may also result in a lower fatigue probability and less frequent compensatory motion, which in turn may lead to a longer duration of exercise. To include these features in the current system, we are currently formulating a new probabilistic framework that models the user's ability using Beta distributions [18] as a function of continuous resistance levels.…”
Section: Future Work (mentioning)
confidence: 99%
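As a purely hypothetical illustration of the kind of model this excerpt describes (not the cited authors' implementation), one can maintain a Beta-distributed belief over a user's per-repetition success probability and update it from observed outcomes; the class and method names below are assumptions, and the dependence on a continuous resistance level is omitted for brevity.

```python
# Hypothetical sketch: Beta-distributed belief over a user's success probability,
# updated from observed exercise outcomes. Not taken from the cited paper.
from scipy.stats import beta as beta_dist


class BetaAbilityModel:
    """Beta(alpha, beta) belief over a user's per-repetition success probability."""

    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        # Beta(1, 1) is a uniform prior: no assumption about the user's ability yet.
        self.alpha = alpha
        self.beta = beta

    def update(self, success: bool) -> None:
        # Conjugate Bayesian update: a success increments alpha, a failure increments beta.
        if success:
            self.alpha += 1.0
        else:
            self.beta += 1.0

    def mean_ability(self) -> float:
        # Posterior mean of the success probability.
        return self.alpha / (self.alpha + self.beta)

    def uncertainty(self) -> float:
        # Differential entropy (in nats) of the current Beta belief; it shrinks
        # as more repetitions are observed.
        return float(beta_dist(self.alpha, self.beta).entropy())


if __name__ == "__main__":
    model = BetaAbilityModel()
    for outcome in (True, True, False, True):
        model.update(outcome)
    print(f"estimated ability: {model.mean_ability():.2f}, "
          f"belief entropy: {model.uncertainty():.3f} nats")
```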
“…In information theory, entropy was introduced by Claude Shannon [3] in 1948 for random variables on a discrete probability space. The natural extension of discrete entropy is the differential entropy [4], defined for a continuous random variable with probability density function (pdf) p(x) by:…”
Section: Introduction (mentioning)
confidence: 99%
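The quoted definition is cut off; the standard form of the differential entropy it refers to, stated here for reference rather than taken from the excerpt, is:

\[
h(X) = -\int_{-\infty}^{\infty} p(x)\,\log p(x)\,dx .
\]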