1998
DOI: 10.1142/9789812816849_0008

Cross-Validation and Information Measures for RAM-Based Neural Networks

Cited by 6 publications (7 citation statements)
References 1 publication
“…∀a,c: n_ac = 0 ⇒ p(c|a_i) = ε, 0 < ε ≪ 1. (6) The next problem one is confronted with is how to combine the probability estimates of the individual features. As the features, and therefore also the probabilities, are in general not independent, we cannot simply multiply the individual probability estimates p(c|a_i) to obtain p(c|{a_i}), the probability of class c given the whole feature set {a_i} sampled by the n-tuples.…”
Section: Standard Scheme Versus Frequentist Scheme
confidence: 99%
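The quoted passage describes two steps: flooring zero-frequency estimates at a small ε (eq. 6) and then combining the per-n-tuple estimates. The Python sketch below is illustrative only and not taken from the cited paper; the function names, the EPSILON value, and the example counts are assumptions of this example, and the log-sum combination deliberately uses the naive independence assumption that the passage cautions against.

```python
import numpy as np

# Minimal sketch of zero-count handling and a (naive) combination step for an
# n-tuple / RAM-based classifier.  All names are illustrative; the cited paper
# does not prescribe this exact implementation, and the naive product rule used
# here is precisely the simplification the quoted passage warns about.

EPSILON = 1e-6  # small epsilon used when a memory cell was never written

def per_tuple_estimates(counts):
    """counts[c] = number of training patterns of class c that produced this
    address a_i.  Returns a smoothed estimate of p(c | a_i) per class."""
    counts = np.asarray(counts, dtype=float)
    total = counts.sum()
    if total == 0:
        # eq. (6): unseen address -> tiny probability, 0 < eps << 1
        return np.full_like(counts, EPSILON)
    probs = counts / total
    return np.where(probs == 0.0, EPSILON, probs)

def combine_naive(per_tuple_probs):
    """Combine estimates from all n-tuples by summing log-probabilities,
    i.e. pretending the sampled features are independent (they are not)."""
    log_scores = np.sum(np.log(per_tuple_probs), axis=0)
    return np.argmax(log_scores), log_scores

# Example: 3 n-tuples, 2 classes
counts_per_tuple = [[5, 0], [2, 3], [0, 0]]
probs = np.array([per_tuple_estimates(c) for c in counts_per_tuple])
winner, scores = combine_naive(probs)
print(winner, scores)
```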
“…There will be a vast number of possible connections for a matrix with dimensions such as 32 by 32. The classification and generalization performance are highly dependent on these input mappings [5][13]. A random map is suitable for an unoptimised problem [3], as it samples points throughout the pattern matrix.…”
Section: Introduction
confidence: 99%
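As context for the random input mapping mentioned above, here is a small illustrative sketch; the tuple size, seeding, and the choice to cover every pixel exactly once are assumptions of this example, not details of the cited works. It draws a random map over a 32 by 32 pattern matrix and packs each n-tuple's sampled bits into a RAM address.

```python
import numpy as np

# Illustrative sketch (not from the cited papers): building a random input
# mapping for a 32x32 binary pattern matrix.  Each n-tuple is assigned n pixel
# positions chosen at random, which is the "random map" mentioned above.

rng = np.random.default_rng(seed=0)

ROWS, COLS = 32, 32               # pattern matrix dimensions
N = 4                             # tuple size (pixels per n-tuple)
NUM_TUPLES = (ROWS * COLS) // N   # cover every pixel exactly once

def random_input_mapping():
    """Return an array of shape (NUM_TUPLES, N) holding pixel indices:
    a random permutation of all 1024 positions cut into n-tuples."""
    positions = rng.permutation(ROWS * COLS)
    return positions.reshape(NUM_TUPLES, N)

def tuple_addresses(pattern, mapping):
    """Read each n-tuple's pixels from a flattened binary pattern and pack
    them into an integer address used to index that tuple's RAM node."""
    bits = pattern.reshape(-1)[mapping]   # shape (NUM_TUPLES, N)
    weights = 1 << np.arange(N)           # 1, 2, 4, 8, ...
    return bits @ weights                 # one address per n-tuple

pattern = rng.integers(0, 2, size=(ROWS, COLS))
mapping = random_input_mapping()
print(tuple_addresses(pattern, mapping)[:5])
```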
“…There will be a vast number of possible connections for a matrix with dimensions such as 32 by 32. The classification and generalization performance are highly dependent on these input mappings (Bishop, 1990; Jorgensen et al., 1995). A random map is suitable for an un-optimized problem, as it samples points throughout the pattern matrix (Aleksander, 1979).…”
Section: Introduction
confidence: 99%
“…A random map is suitable for an un-optimized problem, as it samples points throughout the pattern matrix (Aleksander, 1979). Considerable research shows that, by optimizing the connections, classification performance can be improved significantly (Azhar & Dimond, 2004a, 2004b, 2004c; Bishop, 1990; Jorgensen et al., 1995). Among different optimization techniques, Particle Swarm Optimization (PSO) (Kennedy & Eberhart, 1995) exhibits good performance in finding solutions to static optimization problems.…”
Section: Introduction
confidence: 99%
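For orientation only, the following is a generic, textbook-style PSO velocity/position update in the spirit of Kennedy & Eberhart (1995). The sphere function and all parameter values are placeholders assumed for this sketch; in the connection-optimization setting described above, the fitness would instead be the classification performance obtained with a candidate input mapping, which this sketch does not implement.

```python
import numpy as np

# Minimal, generic PSO sketch (Kennedy & Eberhart-style update).  The sphere
# function below stands in for the real objective; in the work cited above the
# fitness would be the classification performance of a candidate set of
# input connections.

rng = np.random.default_rng(seed=1)

def fitness(x):
    return np.sum(x ** 2, axis=-1)    # placeholder objective (minimise)

DIM, SWARM, ITERS = 10, 20, 100
W, C1, C2 = 0.7, 1.5, 1.5             # inertia, cognitive, social coefficients

pos = rng.uniform(-5, 5, size=(SWARM, DIM))
vel = np.zeros((SWARM, DIM))
pbest, pbest_val = pos.copy(), fitness(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(ITERS):
    r1, r2 = rng.random((SWARM, DIM)), rng.random((SWARM, DIM))
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel
    val = fitness(pos)
    improved = val < pbest_val                     # update personal bests
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]            # update global best

print("best fitness:", fitness(gbest))
```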