1992
DOI: 10.1117/12.130857

<title>Training-set-based performance measures for data-adaptive decisioning systems</title>

Abstract: Performance measures are derived for data-adaptive hypothesis testing by systems trained on stochastic data. The measures consist of the averaged performance of the system over the ensemble of training sets. The training-set-based measures are contrasted with maximum a posteriori probability (MAP) test measures. It is shown that the training-set-based and MAP test probabilities are equal if the training set is proportioned according to the prior probabilities of the hypotheses. Applications of training set-base…
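The abstract's central claim — that the training-set-based performance equals the MAP test performance when the training set is proportioned according to the priors — can be illustrated with a small Monte Carlo sketch. The two-Gaussian model, the priors, and all numerical values below are illustrative assumptions, not taken from the paper; a "data-adaptive" system is approximated here by a plug-in MAP rule whose means and priors are estimated from the training set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-hypothesis model (values assumed, not from the paper):
# H0: x ~ N(0, 1) with prior p0;  H1: x ~ N(2, 1) with prior p1.
p0, p1 = 0.7, 0.3
mu0, mu1, sigma = 0.0, 2.0, 1.0

def map_decide(x, m0, m1, q0, q1):
    """MAP rule for equal-variance Gaussians: choose the hypothesis
    with the larger log-posterior log q_i + log N(x; m_i, sigma^2)."""
    ll0 = -(x - m0) ** 2 / (2 * sigma ** 2) + np.log(q0)
    ll1 = -(x - m1) ** 2 / (2 * sigma ** 2) + np.log(q1)
    return (ll1 > ll0).astype(int)

# Training set proportioned according to the priors (the paper's condition).
n_train = 20000
n0 = int(p0 * n_train)
train0 = rng.normal(mu0, sigma, n0)
train1 = rng.normal(mu1, sigma, n_train - n0)

# Data-adaptive system: plug-in MAP rule using only training-set estimates.
m0_hat, m1_hat = train0.mean(), train1.mean()
q0_hat = n0 / n_train
q1_hat = 1.0 - q0_hat

# Evaluate both rules on an independent test set from the true model.
n_test = 100000
labels = (rng.random(n_test) < p1).astype(int)
x = np.where(labels == 1,
             rng.normal(mu1, sigma, n_test),
             rng.normal(mu0, sigma, n_test))

err_map = np.mean(map_decide(x, mu0, mu1, p0, p1) != labels)
err_trained = np.mean(map_decide(x, m0_hat, m1_hat, q0_hat, q1_hat) != labels)
```

With the training-set class counts matched to the priors, `err_trained` tracks `err_map` closely; skewing the training proportions away from the priors biases the plug-in rule and the two error rates separate.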

Cited by 5 publications (5 citation statements); References 0 publications.
“…Our motivation for using a back-propagation neural net for sensor fusion is its successful performance, as shown by Levine and Khuon in [7], [9], [10], and [11]. In this section we review some of these results, in which the successful data fusion of Chair and Varshney [5], in terms of probability of detection and probability of false alarm, was shown to be achieved similarly by a cascaded neural net by Levine and Khuon [6][7][8][9].…”
Section: Motivation For Fusion Neural Net
confidence: 92%
“…In this section we review some of these results, in which the successful data fusion of Chair and Varshney [5], in terms of probability of detection and probability of false alarm, was shown to be achieved similarly by a cascaded neural net by Levine and Khuon [6][7][8][9]. The data fusion rule for a binary decision was obtained within the distributed sensor processing architecture.…”
Section: Motivation For Fusion Neural Net
confidence: 95%
“…Our motivation comes from observing the successful performance of NNSF methods in scenarios relevant to our application; specifically, the successful data fusion of Chair and Varshney 31 was shown to be achieved similarly (in terms of probability of detection and probability of false alarm) by a cascaded neural net by Levine and Khuon. [32][33][34][35][36] Successful performance of neural networks was achieved for nonlinear detection and classification problems in imagery data fusion in this research. A detailed discussion of the FNN for data/sensor fusion can be found in Refs.…”
Section: Spectral-Spatial Neural Net Sensor Fusion
confidence: 95%
“…The performance of the classification/identification can be evaluated by the error probabilities, which can be calculated theoretically through hypothesis-testing models. 1) Classification: For an N-user classification scenario, we apply N-hypothesis testing to N genuine users, with M_N reference fingerprints in the database for each user [23]. The hypothesis H_N is that the obtained signal is from genuine user #N. The classification probability can then be expressed as P(H_i|H_j), i, j = 1, ..., N, i ≠ j, which is the probability that the RFF from genuine user #j is classified as genuine user #i.…”
Section: RFF Extraction, Identification, and Classification
confidence: 99%
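The quantity P(H_i|H_j) in the last citation statement is the (i, j) entry of a confusion-probability matrix, which can be estimated by Monte Carlo under any assumed signal model. The sketch below uses a deliberately simple stand-in model — each user's fingerprint is a 1-D Gaussian feature, which is an assumption for illustration only, not the cited paper's RFF model — and a maximum-likelihood N-hypothesis test with equal priors, which for equal-variance Gaussians reduces to nearest-mean classification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical model (assumed, not the cited paper's RFF features):
# user j emits a feature x ~ N(mus[j], 1); hypothesis H_j = "signal is from user j".
N = 3
mus = np.array([0.0, 3.0, 6.0])

def classify(x):
    """ML N-hypothesis test with equal priors and equal variances:
    choose the user whose mean is nearest to the observation."""
    return np.argmin(np.abs(x[:, None] - mus[None, :]), axis=1)

# Estimate P(H_i | H_j): probability that user j's signal is decided as user i.
n = 50000
P = np.zeros((N, N))
for j in range(N):
    decisions = classify(rng.normal(mus[j], 1.0, n))
    P[:, j] = np.bincount(decisions, minlength=N) / n
```

Each column j of `P` sums to one (every signal from user j is assigned to some hypothesis); the off-diagonal entries are the misclassification probabilities P(H_i|H_j), i ≠ j, discussed in the quote.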