1996
DOI: 10.1016/0167-8655(96)00069-4

Symbolic mapping of neurons in feedforward networks

Cited by 9 publications (8 citation statements)
References 18 publications
“…The weights may be used to limit the search tree by providing an evaluation of the contributions of inputs that are not specified in rule antecedents. As shown by Sethi and Yoo [21], the number of search nodes is then reduced to . In the Subset algorithm of Towell and Shavlik [22], inputs with the largest weights are analyzed first, and if they are sufficient to activate the hidden node of the network irrespective of the values of the other inputs, a new rule is recorded.…”
Section: An Overview of Neural Rule Extraction Methods
Citation type: mentioning, confidence: 99%
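The Subset-style search described in the statement above can be sketched as follows. This is a minimal illustration under stated assumptions, not the Towell–Shavlik implementation: binary inputs in {0, 1}, a hard-threshold unit, and all function and variable names are illustrative.

```python
from itertools import combinations

def subset_rules(weights, bias, max_size=3):
    """Enumerate small sets of positive-weight inputs that by themselves
    guarantee activation of a threshold unit, regardless of the values of
    the remaining inputs (a Subset-style search; illustrative sketch)."""
    pos = [(w, i) for i, w in enumerate(weights) if w > 0]
    # Worst case against activation: every negatively weighted input is on.
    neg_sum = sum(w for w in weights if w < 0)
    pos.sort(reverse=True)  # analyze inputs with the largest weights first
    rules = []
    for k in range(1, max_size + 1):
        for combo in combinations(pos, k):
            total = sum(w for w, _ in combo)
            # Sufficient even when all other inputs work against the unit.
            if total + neg_sum + bias > 0:
                antecedent = tuple(sorted(i for _, i in combo))
                # Skip supersets of an already recorded (more general) rule.
                if not any(set(r) <= set(antecedent) for r in rules):
                    rules.append(antecedent)
    return rules
```

For example, with `weights = [3.0, 2.0, -1.0, 0.5]` and `bias = -2.5`, no single input suffices, but inputs 0 and 1 together activate the unit no matter what the others do, so the single rule `(0, 1)` is recorded.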
“…Analytical evaluation is based on the cumulative distribution function (20), expressed through the error function erf; suitably scaled, the erf function is similar to the standard unipolar sigmoidal function with accuracy better than 2%. A rule with a single crisp condition is fulfilled by a Gaussian number with probability (21). Taking a logistic function instead of the erf function corresponds to changing the assumed error distribution from Gaussian to one that approximates the Gaussian distribution within 3.5%. If the rule involves a closed interval, the probability that it is fulfilled by a sample from the Gaussian distribution representing the data is given by (22). Thus the probability that a given condition is fulfilled is proportional to the value of the soft trapezoid function realized by an L-unit.…”
Section: Probabilities From Crisp Rules
Citation type: mentioning, confidence: 99%
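The mechanism quoted above, evaluating a crisp condition against Gaussian-distributed data via the error function, can be illustrated with a short sketch. The exact equations (20)–(22) are in the citing paper; this only uses the standard Gaussian CDF, and the function names are illustrative.

```python
import math

def gauss_cdf(x, mean, s):
    """Cumulative distribution of a Gaussian N(mean, s), via erf."""
    return 0.5 * (1.0 + math.erf((x - mean) / (s * math.sqrt(2.0))))

def p_condition(a, b, mean, s):
    """Probability that a Gaussian-distributed value lies in [a, b],
    i.e. that the crisp condition a <= x <= b is fulfilled."""
    return gauss_cdf(b, mean, s) - gauss_cdf(a, mean, s)
```

Viewed as a function of `mean`, `p_condition` rises, plateaus near 1 inside the interval, and falls again, tracing the soft trapezoid shape the passage attributes to the L-unit.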
“…In order to treat the positive and negative weights of the neuron uniformly, we have adopted an admissible transformation of weights, first used by Sethi and Yoo (1996a), to convert all the negative weights of the neuron to positive quantities. This transformation allows us to work with positive weights only.…”
Section: The Proposed Methods
Citation type: mentioning, confidence: 99%
“…For binary inputs: convert all the negative weights to positive weights by using the following admissible transformation (Sethi and Yoo, 1996a). Replace each input literal x that has a negative weight with its negated literal x̄.…”
Section: The Combo Algorithm For Rule Generation
Citation type: mentioning, confidence: 99%
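The admissible transformation cited above has a direct reading for binary inputs: since x̄ = 1 − x, a negative weight w on literal x equals a positive weight −w on x̄ plus a constant w absorbed into the bias (w·x = w − w·(1 − x)). A minimal sketch, with illustrative names:

```python
def make_weights_positive(weights, bias):
    """Admissible transformation for binary inputs: each negative weight w
    on literal x becomes the positive weight -w on the negated literal
    1 - x, with the constant w absorbed into the bias."""
    new_weights, negated, new_bias = [], [], bias
    for w in weights:
        if w < 0:
            new_weights.append(-w)   # positive weight on the negated literal
            negated.append(True)
            new_bias += w            # absorb the constant term w
        else:
            new_weights.append(w)
            negated.append(False)
    return new_weights, negated, new_bias
```

The transformed unit computes exactly the same activation on every binary input vector, which is what makes the transformation admissible: the search can then proceed over positive weights only.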