2012
DOI: 10.1109/tmag.2011.2173303

An Improved Cross-Entropy Method Applied to Inverse Problems

Cited by 9 publications (3 citation statements)
References 10 publications
“…Shannon entropy was further expanded, i.e. cumulative residual entropy [15][16][17], joint entropy [18,19], conditional entropy [20][21][22], exponential entropy [23], mutual information entropy [24,25], cross entropy [26][27][28], maximum entropy principle [29,30] and so on. These results have prompted the development of information entropy.…”
Section: Shannon Information Theory
confidence: 99%
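For context on the excerpt above, the two core quantities it builds on have standard definitions (textbook formulas, not drawn from the cited works):

```latex
% Shannon entropy of a discrete distribution p:
H(p) = -\sum_{i} p_i \log p_i

% Cross entropy of q relative to p; H(p, q) \ge H(p),
% with equality iff p = q:
H(p, q) = -\sum_{i} p_i \log q_i
```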
“…Each algorithm runs 10 times independently. Also, the results obtained by using five other optimal algorithms DE1, DE2, ARDGDE1, ARDGDE2 (Coelho et al, 2009) and improved cross entropy method (An et al, 2012) have been taken from the literature for comparisons. Tables VIII and IX summarize the optimal results of different optimal approaches.…”
Section: Application
confidence: 99%
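The cross-entropy method cited in this excerpt is, at its core, an iterative sampling-based optimizer: draw candidates from a parametric distribution, keep the best-scoring ones, and refit the distribution to them. Below is a minimal sketch of the basic CE method for continuous minimization; it is not the improved variant of An et al. (2012), and every name and parameter here is an illustrative assumption:

```python
import numpy as np

def cross_entropy_minimize(f, mu, sigma, n_samples=100, elite_frac=0.1,
                           n_iters=50, seed=0):
    """Basic cross-entropy method for continuous minimization (generic
    sketch, not the improved variant from the cited paper).

    f          -- objective to minimize, maps a 1-D array to a float
    mu, sigma  -- initial mean and std-dev of the Gaussian sampler
    """
    rng = np.random.default_rng(seed)
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    n_elite = max(1, int(elite_frac * n_samples))
    for _ in range(n_iters):
        # Draw candidate solutions from the current sampling distribution.
        x = rng.normal(mu, sigma, size=(n_samples, mu.size))
        scores = np.apply_along_axis(f, 1, x)
        # Select the elite (lowest-objective) samples ...
        elite = x[np.argsort(scores)[:n_elite]]
        # ... and refit the sampler to them -- the cross-entropy update.
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-12
    return mu

# Usage: minimize the 2-D sphere function; the optimum is at the origin.
best = cross_entropy_minimize(lambda v: float(np.sum(v ** 2)),
                              mu=[5.0, -3.0], sigma=[2.0, 2.0])
print(best)
```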
“…Large amount of following studies have successively introduced the concepts of the expansion of Hartley entropy and Shannon entropy [16], relative entropy [17], cumulative residual entropy [18][19][20][21], joint entropy [22,23], conditional entropy [24][25][26], mutual information [27][28][29][30][31][32], cross entropy [33][34][35][36][37][38], fuzzy entropy [15,39], maximum entropy principle [40,41] and minimum cross-entropy principle [42,43], and a series of achievements have been made in these aspects. Zhong makes use of general information functions to unify the methods of describing information metrics with Entropy formulas [4].…”
Section: About the Metrics of Information
confidence: 99%
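Several of the quantities surveyed in this excerpt reduce to relative entropy; for reference, the standard definition and the principle named at the end of the list (textbook statements, not taken from the cited works):

```latex
% Relative entropy (Kullback-Leibler divergence) of p with respect to q:
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{i} p_i \log \frac{p_i}{q_i}

% Minimum cross-entropy principle: among all distributions p satisfying
% the known constraints C, choose the one closest to the prior q:
p^{*} = \arg\min_{p \in \mathcal{C}} D_{\mathrm{KL}}(p \,\|\, q)
```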