1991
DOI: 10.1007/bf00114160
Symbolic and neural learning algorithms: An experimental comparison

Abstract: Despite the fact that many symbolic and neural network (connectionist) learning algorithms address the same problem of learning from classified examples, very little is known regarding their comparative strengths and weaknesses. Experiments comparing the ID3 symbolic learning algorithm with the perceptron and backpropagation neural learning algorithms have been performed using five large, real-world data sets. Overall, backpropagation performs slightly better than the other two algorithms in terms of…


Cited by 189 publications (111 citation statements)
References 29 publications
“…For comparison, we give results from the NETtalk program, which used the back propagation algorithm. Shavlik et al (1989) replicated Sejnowski and Rosenberg's methodology as part of their work, and although their results differ from Sejnowski and Rosenberg's (not surprisingly, since back propagation networks require much tuning), they make for easier comparison with ours. This property follows from the fact that the original Sejnowski and Rosenberg study used a distributed output encoding; that is, their system produced a 26-bit sequence (rather than one bit for each of the 115 phoneme/stress combinations).…”
Section: English Text Pronunciation (supporting)
confidence: 55%
“…These domains have received considerable attention from connectionist researchers who employed the back propagation learning algorithm (Sejnowski & Rosenberg, 1986; Qian & Sejnowski, 1988; Towell et al, 1990). In addition, the word pronunciation problem has been the subject of a number of comparisons using other machine learning algorithms (Stanfill & Waltz, 1986; Shavlik et al, 1989; Dietterich et al, 1990). All of these domains represent problems of considerable practical importance, and all have symbolic feature values, which makes them difficult for conventional nearest neighbor algorithms.…”
Section: Instance-Based Learning Versus Other Models (mentioning)
confidence: 99%
“…The same version was used in the experiments reported by Shavlik et al (1991). It learns a single tree for classifying examples into multiple categories and uses the normal information-gain splitting criterion.…”
Section: Decision Tree Learner (mentioning)
confidence: 99%
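The information-gain splitting criterion mentioned in the quote is the standard ID3 measure: the reduction in label entropy achieved by partitioning the examples on a feature's values. A minimal sketch (function and data-structure names are our own, not from the cited implementation):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(examples, labels, feature):
    """Entropy reduction from splitting on `feature`.

    `examples` is a list of dicts mapping feature names to values;
    `labels` gives the class of each example. ID3 greedily picks the
    feature with the largest gain at each node.
    """
    base = entropy(labels)
    remainder = 0.0
    for v in {ex[feature] for ex in examples}:
        subset = [lab for ex, lab in zip(examples, labels)
                  if ex[feature] == v]
        remainder += (len(subset) / len(labels)) * entropy(subset)
    return base - remainder
```

A feature that separates the classes perfectly yields a gain equal to the starting entropy; an uninformative feature yields a gain near zero.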