1994
DOI: 10.1117/12.188904
<title>Optimization and application of a RAM-based neural network for fast image processing tasks</title>

Cited by 3 publications (4 citation statements)
References 0 publications
“…As suggested in connection with equation (5) it can be advantageous to use individual values of the threshold level k for the different classes. The challenge, however, is to find a design principle that is able to set the values properly.…”
Section: Selection Of The Threshold Level K
confidence: 99%
“…For all LUTs calculate their corresponding column entries, and set the output value of the target class to one in all the "active" columns; repeat for all examples in the training set.…”
Section: The Standard N-tuple Classifier
confidence: 99%
“…Estimating the generalization capabilities of a classifier can be done by performing a leave-one-out cross-validation where in turn a single example is left out of the training set; i.e., the classifier is trained on the remainder of the training set and tested on the one example excluded from the training set. As pointed out by Liisberg and described in [10] it is easy to incorporate a leave-one-out cross-validation test for the n-tuple classifier. The leave-one-out cross-validation classification of a training example x can be calculated as…”
Section: The N-tuple Classifier
confidence: 99%
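The leave-one-out shortcut attributed to Liisberg can be illustrated by keeping per-column hit counts instead of binary flags: a LUT then contributes to a training example's own-class score only if its active column was also set by at least one other training example, so no retraining is needed. The data layout and function name below are assumptions for illustration.

```python
import numpy as np

def loo_scores(examples, labels, n_classes, tuples):
    # counts[k][c, addr] = number of class-c training examples hitting that column.
    counts = [np.zeros((n_classes, 2 ** len(t)), dtype=int) for t in tuples]
    addrs = [[int("".join(str(int(b)) for b in x[t]), 2) for t in tuples]
             for x in examples]
    for row, c in zip(addrs, labels):
        for cnt, a in zip(counts, row):
            cnt[c, a] += 1
    scores = []
    for row, c in zip(addrs, labels):
        s = np.zeros(n_classes, dtype=int)
        for cnt, a in zip(counts, row):
            s += (cnt[:, a] > 0).astype(int)
            if cnt[c, a] == 1:
                # Only this example set the column: remove its own contribution,
                # which is exactly what leaving it out of training would do.
                s[c] -= 1
        scores.append(s)
    return scores
```

With two unique training examples, each example's leave-one-out score is zero for every class; duplicating an example restores its own-class score, since its columns are then set by another example as well.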
“…A large average Hamming distance corresponds to a large variation or noise level (i.e., the probability, q c , of inverting the value of a bit) for the distribution in question. Expression (10) involves the calculation of overlap areas between different numbers of training examples and one test example. In order to average (10) over the training sets, the overlap areas must be replaced with the calculation of the average overlap areas between different numbers of training examples and a test example.…”
Section: Calculation Of Expected Output Scores
confidence: 99%