1999
DOI: 10.1088/0953-4075/32/3/003
Minimum-cross-entropy estimation of atomic charge densities from scattering factors

Abstract: Tight model-independent approximations to the one-particle atomic density ρ(r), derived from very few values of the form factor F(k), are obtained by means of the minimum-cross-entropy technique. For completeness, the accuracy of the approximations is analysed within a Hartree-Fock framework.

Cited by 13 publications (7 citation statements). References 30 publications (32 reference statements).
“…The Kullback–Leibler divergence or relative entropy 14 is perhaps the most important non-symmetric divergence measure in information theory, being defined as D(ρ₁‖ρ₂) = ∫ ρ₁(r) ln[ρ₁(r)/ρ₂(r)] dr, which embodies the well-known Shannon entropy 1, a global measure of the spread of the distribution over its whole domain. The KL divergence, as well as its symmetrized version, has been extensively studied and applied in a great variety of fields such as, for instance, minimum-cross-entropy estimation 36 or indexing and image retrieval 37.…”
Section: Dissimilarity Measures: the Jensen–Shannon Divergence
confidence: 99%
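The defining relation quoted above can be made concrete for a discrete distribution. The following is a minimal numerical sketch (the function names `kl_divergence` and `shannon_entropy` are illustrative, not taken from the cited works); it also shows how the KL divergence "embodies" the Shannon entropy: against a uniform reference over n points, D(p‖u) = ln n − H(p).

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) for discrete distributions.

    Entries with p_i = 0 contribute nothing (0 * log 0 := 0);
    q must be strictly positive wherever p is positive.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i ln p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Against a uniform reference, KL reduces to ln(n) minus the entropy:
p = np.array([0.5, 0.25, 0.25])
u = np.full(3, 1.0 / 3.0)
identity_gap = kl_divergence(p, u) - (np.log(3) - shannon_entropy(p))
```

The identity D(p‖u) = ln n − H(p) holds term by term, since ln(p_i / (1/n)) = ln p_i + ln n.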
“…The KL divergence constitutes an essential tool within information theory, as shown by its applications for obtaining minimum-cross-entropy estimations and for determining atomic [36] and molecular [37] properties, among others. More recent applications include the introduction of an informational quantum dissimilarity measure to study relativistic effects on the electron density [15], and the use of KL measures to analyse molecular reaction paths [38].…”
Section: Jensen–Shannon and Kullback–Leibler Divergences
confidence: 99%
“…Especially remarkable is its non-negativity property: the minimum, null value is reached only for identical distributions ρ1 = ρ2. The KL relative entropy constitutes an essential tool within information theory, as shown by its applications for obtaining minimum-cross-entropy estimations and for determining atomic [28] and molecular [29] properties, among others. More recent applications include the introduction of an informational quantum dissimilarity measure to study relativistic effects on the electron density [16], and the use of KL measures to analyse molecular reaction paths [30].…”
Section: Jensen–Shannon and Jensen–Fisher Divergences
confidence: 99%
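The non-negativity property mentioned in the statement above (D ≥ 0, with D = 0 only for identical distributions) is easy to check numerically. A minimal sketch, again for discrete distributions and with an illustrative function name:

```python
import numpy as np

def kl_divergence(p, q):
    """D(p||q) = sum_i p_i ln(p_i / q_i), with 0 * log 0 := 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.3, 0.3, 0.4])

d_pq = kl_divergence(p, q)  # distinct distributions: strictly positive
d_pp = kl_divergence(p, p)  # identical distributions: exactly zero
d_qp = kl_divergence(q, p)  # generally differs from d_pq (non-symmetric)
```

Note that `d_pq != d_qp` in general, which is precisely the lack of symmetry that motivates symmetrized variants such as the Jensen–Shannon divergence discussed in these sections.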