1991
DOI: 10.1109/21.101154
Multi-process constrained estimation

Cited by 75 publications (43 citation statements)
References 19 publications
“…In the context of Bayesian estimation, a good measure of the quality of a sensing action is the reduction in entropy of the posterior distribution that is expected to be induced by the measurement. Therefore, it may be beneficial to choose the sensing action that maximizes the expected gain in information based on a variety of information-theoretic measures [114, 115, 121–125].…”
Section: (I) Reduced Complexity Metrics For Tracking
confidence: 99%
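The criterion in this excerpt, picking the sensing action whose measurement is expected to reduce posterior entropy the most, can be sketched for a discrete state space as follows. This is a minimal illustration, not code from the cited papers; the prior and the two sensor likelihood models are invented for the example.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_posterior_entropy(prior, likelihoods):
    """Expected entropy of the posterior after a measurement.

    likelihoods[z, x] = P(z | x): probability of sensor outcome z
    given state x, for one candidate sensing action.
    """
    expected = 0.0
    for lik_z in likelihoods:
        joint = lik_z * prior        # unnormalized posterior for outcome z
        p_z = joint.sum()            # marginal probability of outcome z
        if p_z > 0:
            expected += p_z * entropy(joint / p_z)
    return expected

# Two hypothetical sensing actions over a 3-state target prior.
prior = np.array([0.5, 0.3, 0.2])
action_a = np.array([[0.9, 0.1, 0.1],   # informative sensor
                     [0.1, 0.9, 0.9]])
action_b = np.array([[0.5, 0.5, 0.5],   # uninformative sensor
                     [0.5, 0.5, 0.5]])

gain_a = entropy(prior) - expected_posterior_entropy(prior, action_a)
gain_b = entropy(prior) - expected_posterior_entropy(prior, action_b)
# The action with the larger expected information gain would be selected.
```

The uninformative sensor leaves the posterior equal to the prior, so its expected gain is zero, and the informative action wins the comparison.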
“…Specifically, the quality of a proposed action (be it moving the sensor to another location, or pointing an antenna in a particular direction) is measured by the amount of information that is expected to be gained by its execution. This approach is related to that of Zhao [15], Hintz [16], Schmaedeke [17], and others [18]–[20], as discussed in [14] and elsewhere. At each epoch when a decision is to be made, the uncertainty about the surveillance region (as captured by the JMPD) is used to compute the value of each of the possible sensing actions using an information-theoretic measure called the Rényi (alpha-) divergence.…”
Section: Introduction
confidence: 99%
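The Rényi alpha-divergence named in this excerpt has a simple closed form for discrete distributions. The sketch below is illustrative only; the two distributions are invented, and the excerpt's JMPD-based computation is not reproduced here.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi alpha-divergence D_alpha(p || q) in nats, for alpha != 1.

    As alpha -> 1 this approaches the Kullback-Leibler divergence.
    """
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

# Divergence between a sharp distribution and a diffuse one: a larger
# value indicates the candidate measurement would change beliefs more.
p = np.array([0.8, 0.1, 0.1])
q = np.array([1/3, 1/3, 1/3])
d_half = renyi_divergence(p, q, 0.5)   # alpha = 0.5, a common choice
```

The parameter alpha tunes how heavily the tails of the distributions are weighted; the divergence is zero when the two distributions coincide.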
“…Work in [46]–[53] uses information measures such as entropy and discrimination gain for goals such as determining the resolution level of a sensor, determining the priority of search and track tasks, etc. Hintz et al. [9, 10] use the Shannon entropy, as we do in this paper, while Schmaedeke and Kastella [11] have chosen to use the Kullback-Leibler (KL) divergence as the measure of information gain. Mahler [55]–[57] proposed finite set statistics (FISST), which reformulates the problem of multi-sensor, multi-target data fusion as if it were a single-sensor, single-target tracking problem.…”
confidence: 99%
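The KL divergence mentioned here as a measure of information gain quantifies how far a Bayesian update moves the belief away from the prior. A minimal sketch, with an invented prior and posterior (not taken from Schmaedeke and Kastella's work):

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats.

    Assumes p and q are strictly positive and each sums to 1.
    """
    return np.sum(p * np.log(p / q))

prior = np.array([0.5, 0.3, 0.2])
posterior = np.array([0.7, 0.2, 0.1])
gain = kl_divergence(posterior, prior)  # discrimination gain of the update
```

The divergence is nonnegative and zero only when posterior equals prior, which is what makes it usable as a per-measurement information-gain score for task prioritization.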
“…The advantage of JMP is that there is a global notion of system task priority, with all task priorities in a common space. Hintz and McVey [45] used entropy for search, track, and ID tasks. Work in [46]–[53] uses information measures such as entropy and discrimination gain for goals such as determining the resolution level of a sensor, determining the priority of search and track tasks, etc.…”
confidence: 99%