Toward memory-based reasoning
Craig Stanfill and David Waltz (1986)
DOI: 10.1145/7902.7906

Abstract: The intensive use of memory to recall specific episodes from the past, rather than rules, should be the foundation of machine reasoning. The traditional assumption in artificial intelligence (AI) is that most expert knowledge is encoded in the form of rules. We consider the phenomenon of reasoning from memories of specific episodes, however, to be the foundation of an intelligent system, rather than an adjunct to some other reasoning method. This theory contrasts with much of the cur…

Cited by 963 publications (400 citation statements)
References 21 publications
“…A more sophisticated alternative consists of considering two symbolic values to be similar if they make similar predictions (i.e., if they correlate similarly with the class feature). This was first proposed by Stanfill and Waltz (1986) as part of their value difference metric (VDM) for a memory-based reasoner. Here we will consider a simplified version of the VDM, which defines the distance between two symbolic values as:…”
Section: Instance-based Learning (mentioning)
confidence: 99%
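As a concrete illustration of the simplified VDM the statement above refers to, here is a minimal Python sketch (the function and variable names are my own, not taken from the cited papers): the distance between two symbolic values of an attribute is the accumulated difference between their conditional class distributions.

```python
from collections import Counter, defaultdict

def simplified_vdm(values, labels, q=1):
    """Return a function d(x, y) giving a simplified value difference
    between two symbolic values of one attribute.

    values: list of attribute values, one per training instance
    labels: list of class labels, aligned with `values`
    q: exponent applied to each per-class difference (1 or 2 are typical)

    Assumes both values passed to d(x, y) occur in the training data.
    """
    # Count how often each attribute value co-occurs with each class.
    value_class = defaultdict(Counter)
    value_total = Counter()
    for v, c in zip(values, labels):
        value_class[v][c] += 1
        value_total[v] += 1

    classes = set(labels)

    def distance(x, y):
        # Sum of differences between conditional class probabilities.
        return sum(
            abs(value_class[x][c] / value_total[x]
                - value_class[y][c] / value_total[y]) ** q
            for c in classes
        )

    return distance
```

For example, with values ["red", "red", "blue", "blue"] and labels ["spam", "spam", "ham", "spam"], the distance between "red" and "blue" is |1.0 − 0.5| + |0.0 − 0.5| = 1.0.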
“…The total distance Δ(E1, E2) is computed as before. Different variants of this metric have been successfully used in pronunciation, molecular biology and other tasks (Stanfill & Waltz, 1986; Cost & Salzberg, 1993; Biberman, 1994). IBL methods need to address the problem of sensitivity to irrelevant attributes.…”
Section: Instance-based Learning (mentioning)
confidence: 99%
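To connect this to the total distance Δ(E1, E2) mentioned in the statement above, here is a minimal sketch assuming, as is common in this family of methods, that the instance distance is simply the sum of the attribute-wise value differences (the names are illustrative, not from the cited papers):

```python
def instance_distance(e1, e2, attribute_metrics):
    """Total distance between two instances.

    e1, e2: dicts mapping attribute name -> symbolic value
    attribute_metrics: dict mapping attribute name -> a value-difference
        function d(x, y), e.g. one built per attribute from training data
    """
    return sum(d(e1[a], e2[a]) for a, d in attribute_metrics.items())
```

Because every attribute contributes equally to this sum, irrelevant attributes add noise to the total, which is the sensitivity problem the quoted passage points out.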
“…Obviously the Manhattan distance function is only appropriate for continuous or scaled parameters p. For discrete parameters we implemented the Value Difference Metric (VDM) as proposed in [15] and improved in [16]. Given two findings f 1 = (p = x) and f 2 = (p = y) the VDM defines the distance between the two values x and y of parameter p:…”
Section: Similarity Knowledge for Parameter Values (mentioning)
confidence: 99%
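The quoted statement truncates the formula it introduces; for reference, a commonly cited form of the VDM for two values x and y of a discrete parameter p is the following (my reconstruction of the standard definition, not necessarily the exact variant used in [15] or [16]):

```latex
\mathrm{vdm}_p(x, y) \;=\; \sum_{c \in C}
  \left| \frac{N_p(x, c)}{N_p(x)} - \frac{N_p(y, c)}{N_p(y)} \right|^{q}
```

where N_p(x, c) counts training cases with p = x and class c, N_p(x) counts all cases with p = x, C is the set of classes, and q is a small positive exponent (typically 1 or 2).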
“…These domains have received considerable attention from connectionist researchers who employed the back propagation learning algorithm (Sejnowski & Rosenberg, 1986; Qian & Sejnowski, 1988; Towell et al., 1990). In addition, the word pronunciation problem has been the subject of a number of comparisons using other machine learning algorithms (Stanfill & Waltz, 1986; Shavlik et al., 1989; Dietterich et al., 1990). All of these domains represent problems of considerable practical importance, and all have symbolic feature values, which makes them difficult for conventional nearest neighbor algorithms.…”
Section: Instance-based Learning Versus Other Models (mentioning)
confidence: 99%
“…Our algorithm constructs modified "value difference" tables (in the style of Stanfill & Waltz (1986)) to produce a non-Euclidean distance metric, and we introduce the idea of "exception spaces" that result when weights are attached to individual examples. The combination of these two techniques results in a robust instance-based learning algorithm that works for any domain with symbolic feature values.…”
Section: Introduction (mentioning)
confidence: 99%
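As a rough sketch of what attaching weights to individual examples can look like (in the spirit of the exception spaces mentioned above; this is an illustrative assumption, not the authors' exact algorithm), each stored exemplar can carry a reliability weight that scales its distance to a query, so that less reliable exemplars only win when the match is very close:

```python
def weighted_exemplar_distance(query, exemplar, weight, attribute_metrics):
    """Distance from a query instance to one stored exemplar.

    query, exemplar: dicts mapping attribute name -> symbolic value
    weight: per-exemplar reliability weight (>= 1.0); larger values push the
        exemplar further away, shrinking the region in which it can be the
        nearest neighbour (its "exception space")
    attribute_metrics: dict mapping attribute name -> value-difference function
    """
    base = sum(d(query[a], exemplar[a]) for a, d in attribute_metrics.items())
    return weight * base
```

A weight of 1.0 leaves the underlying value-difference distance unchanged; weights above 1.0 penalize exemplars that have proved unreliable when used for prediction.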