1998
DOI: 10.1109/5326.725342

An empirical measure of element contribution in neural networks

Abstract: A frequent complaint about neural net models is that they fail to explain their results in any useful way. The problem is not a lack of information, but an abundance of information that is difficult to interpret. When trained, neural nets will provide a predicted output for a posited input, and they can provide additional information in the form of interelement connection strengths. But this latter information is of little use to analysts and managers who wish to interpret the results they have been given. In …

Cited by 32 publications (33 citation statements)
References 36 publications
“…The influence of variables is measured by the relation between the weights associated with each input node and the sum of all synaptic weights. Studies by Mak and Blanning (1998) and Guha et al. (2005) also give variants of this method. The third algorithm is based on the calculation of partial derivatives of ANN outputs with respect to the inputs (see Dimopoulos et al., 1995).…”
Section: Revision of Methods for Selecting and Determining the Contri…
Mentioning confidence: 94%
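The two ideas in the statement above can be made concrete. The following is a minimal sketch, assuming a one-hidden-layer network with weight matrices W1 (inputs to hidden) and W2 (hidden to outputs); the use of absolute values, the normalisation by the sum of all weights, and the finite-difference step are assumptions of this sketch, not the exact formulas from Mak and Blanning (1998), Guha et al. (2005), or Dimopoulos et al. (1995).

```python
import numpy as np

def weight_ratio_influence(W1, W2):
    """Share of total absolute synaptic weight attributable to each input node.

    W1: (n_inputs, n_hidden) input-to-hidden weights
    W2: (n_hidden, n_outputs) hidden-to-output weights
    """
    total = np.abs(W1).sum() + np.abs(W2).sum()   # sum of all synaptic weights
    per_input = np.abs(W1).sum(axis=1)            # weights attached to each input node
    return per_input / total

def partial_derivative_sensitivity(predict, x, eps=1e-4):
    """Mean absolute finite-difference derivative of each output w.r.t. each input.

    predict: callable mapping (n_samples, n_inputs) -> (n_samples, n_outputs)
    x: (n_samples, n_inputs) points at which the sensitivities are evaluated
    """
    n_inputs = x.shape[1]
    base = predict(x)
    sens = np.zeros((n_inputs, base.shape[1]))
    for i in range(n_inputs):
        x_plus = x.copy()
        x_plus[:, i] += eps                       # perturb one input at a time
        sens[i] = np.abs((predict(x_plus) - base) / eps).mean(axis=0)
    return sens
```

Both functions return one score per input (the second, one per input-output pair), so the inputs can be ranked by their estimated influence on the trained network's predictions.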
“…To address the rates of change across the hidden layer, Mak and Blanning [22] developed the following index to assess the impact of input i on output k:…”
Section: Rule Extraction and Measures of Input Contribution in Neural…
Mentioning confidence: 99%
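The index from [22] is truncated in the excerpt above and is not reproduced here. Purely as an illustration of the underlying "rate of change across the hidden layer" idea, the sketch below computes the chain-rule contribution of input i to output k through each hidden unit of an assumed one-hidden-layer tanh network; it should not be read as the Mak and Blanning index itself.

```python
import numpy as np

def input_output_contribution(W1, b1, W2, x):
    """Rate of change of each output k w.r.t. each input i at point x,
    decomposed across hidden units:
        dy_k/dx_i = sum_j W2[j, k] * tanh'(h_j) * W1[i, j]

    W1: (n_inputs, n_hidden), b1: (n_hidden,), W2: (n_hidden, n_outputs)
    x:  (n_inputs,) single evaluation point
    Returns an (n_inputs, n_outputs) matrix of local sensitivities.
    """
    h = x @ W1 + b1                  # hidden pre-activations
    dact = 1.0 - np.tanh(h) ** 2     # derivative of tanh at each hidden unit
    return (W1 * dact) @ W2          # route each input's effect through the hidden layer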
“…In total, 233 cases were collected; 105 cases were used for training and 128 cases for testing. Details of the knowledge acquisition process are found in [22,23].…”
Section: An Empirical Comparison of ID3, Rough Sets and Neural Network…
Mentioning confidence: 99%