2010
DOI: 10.1007/s11554-009-0144-y

Fast computation methods for estimation of image spatial entropy

Abstract: Computation of image spatial entropy (ISE) is prohibitive in many image processing applications due to its high computational complexity. This paper thus introduces four fast, computationally efficient methods for estimating ISE. Three of these estimation methods are parametric and the fourth is non-parametric. The reduction in computational complexity relative to the original formulation of ISE is made possible by exploiting the Markovianity constraint, which causes the joint histograms of n…
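All of the estimators discussed below reduce ISE to Shannon entropies of joint histograms of neighboring pixels. As a point of reference, here is a minimal sketch of that building block; the helper names, the 256-bin default, and the use of NumPy's `histogram2d` are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def entropy(a: np.ndarray, bins: int = 256) -> float:
    """Marginal Shannon entropy H(X) in bits, from the gray-level histogram."""
    hist, _ = np.histogram(a.ravel(), bins=bins)
    p = hist[hist > 0] / hist.sum()  # drop empty cells so log2 is well defined
    return float(-np.sum(p * np.log2(p)))

def joint_entropy(a: np.ndarray, b: np.ndarray, bins: int = 256) -> float:
    """Joint Shannon entropy H(A, B) in bits, from a 2D joint histogram
    of two equally shaped arrays (e.g. an image and a shifted copy of it)."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))
```

In this sketch each pairwise term costs one pass over the image, O(mn), plus an O(bins²) reduction, which is what makes closed-form expressions built from such terms cheap to evaluate.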

Cited by 15 publications (5 citation statements)
References 10 publications (14 reference statements)
“…Some computational improvements for calculating the spatial entropy were proposed by Razlighi et al. [52, 53]. The computational overhead can be significantly reduced if we accept a reduction in approximation accuracy.…”
Section: Discussion (mentioning)
confidence: 99%
“…In this approach, a causal MRF model, called Quadrilateral Markov Random Field (QMRF), was used to compute image spatial information under the definition of Shannon entropy. The spatial entropy defined in [6] was further simplified for a homogeneous but anisotropic QMRF in [29] for nearest-neighbor structures, as per the following equation:

$$H(\mathbf{X}) = mn\bigl(H(X, X_u) + H(X, X_l) - H(X)\bigr) - \frac{mn}{2}\bigl(H(X_u, X_l) + H(X_u, X_r)\bigr) \tag{5}$$

where $m \times n$ denotes the image size, $H(X, X_u)$ the joint entropy of a voxel with its upper neighbor, $H(X, X_l)$ the joint entropy of a voxel with its left neighbor, $H(X_u, X_l)$ the joint entropy of the left and upper neighbors, and $H(X_u, X_r)$ the joint entropy of the right and upper neighbors; see Fig. 2.…”
Section: Spatial Mutual Information (mentioning)
confidence: 99%
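For concreteness, here is a minimal sketch of how Equation (5) could be evaluated, reusing the `entropy` and `joint_entropy` helpers sketched after the abstract above; the slicing conventions for the upper/left/right neighbor pairs are assumptions for illustration, not taken from [29].

```python
import numpy as np

def spatial_entropy_qmrf(x: np.ndarray, bins: int = 256) -> float:
    """Equation (5): ISE of an m-by-n image under a homogeneous,
    anisotropic first-order QMRF, from four pairwise joint entropies."""
    m, n = x.shape
    h_x_xu = joint_entropy(x[1:, :], x[:-1, :], bins)      # pixel with upper neighbor
    h_x_xl = joint_entropy(x[:, 1:], x[:, :-1], bins)      # pixel with left neighbor
    h_xu_xl = joint_entropy(x[:-1, 1:], x[1:, :-1], bins)  # upper vs. left neighbor
    h_xu_xr = joint_entropy(x[:-1, :-1], x[1:, 1:], bins)  # upper vs. right neighbor
    return m * n * (h_x_xu + h_x_xl - entropy(x, bins)) \
        - (m * n / 2) * (h_xu_xl + h_xu_xr)
```

Note that only four 2D histograms and one 1D histogram are filled, regardless of image size, which reflects the complexity reduction the paper attributes to the Markovianity constraint.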
“…Finally, it should be added that in the case of 3D images, all the existing cliques in the neighboring structure of the first-order QMRF need to be included. Even though these additional cliques would not change the final outcome for the spatial mutual information of a homogeneous random field (as described in [29]) for $SMI_{3D}$, it is necessary to include them to ensure the positivity constraint of this similarity measure. This way, the new SMI is given by the following equation:

$$\begin{aligned} SMI = {} & -mn\,H(X, Y) \\ & + \frac{mn}{4}\bigl\{ H(X, Y_u) + H(X_u, Y) + H(X, Y_d) + H(X_d, Y) \\ & \qquad + H(X, Y_l) + H(X_l, Y) + H(X, Y_r) + H(X_r, Y) \bigr\} \\ & - \frac{mn}{8}\bigl\{ \ldots \end{aligned}$$…”
Section: Spatial Mutual Information (mentioning)
confidence: 99%
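Every term in the SMI expression above is a joint entropy between one image and a spatially shifted copy of the other, so a single shifted-pair helper covers all of them. A minimal sketch, reusing `joint_entropy` from above; the offset convention `(di, dj)` is an assumption for illustration.

```python
def shifted_joint_entropy(x, y, di: int, dj: int, bins: int = 256) -> float:
    """Joint entropy of x[i, j] with y[i + di, j + dj] over the overlap region.
    E.g. (di, dj) = (-1, 0) gives H(X, Y_u); (0, 1) gives H(X, Y_r)."""
    m, n = x.shape
    i0, i1 = max(0, -di), min(m, m - di)
    j0, j1 = max(0, -dj), min(n, n - dj)
    return joint_entropy(x[i0:i1, j0:j1],
                         y[i0 + di:i1 + di, j0 + dj:j1 + dj], bins)

# Example: the mn/4 bracket of the SMI equation gathers the four
# axis-aligned offsets applied to Y and, symmetrically, to X.
offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right
```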
“…are the same as in (13), with their symmetric counterparts. It should be noted that $H(X_l, Y)$ and its symmetric counterpart $H(X, Y_l)$ are omitted from Equation (14) since $H(X_l, Y) = H(X_r, Y)$ from a computational standpoint [33]. To prevent confusion we denote the original SMI defined by Equation (13) as $SMI_{NS}$.…”
Section: Similarity Measures (mentioning)
confidence: 99%