2012
DOI: 10.1007/978-3-642-33266-1_60
Estimating Quantities: Comparing Simple Heuristics and Machine Learning Algorithms

Abstract: Estimating quantities is an important everyday task. We analyzed the performance of various estimation strategies in ninety-nine real-world environments drawn from various domains. In an extensive simulation study, we compared two classes of strategies: one included machine learning algorithms such as general regression neural networks and classification and regression trees; the other, two psychologically plausible and computationally much simpler heuristics (QEst and Zig-QEst). We report the strateg…

Cited by 3 publications (3 citation statements)
References 11 publications (8 reference statements)
“…Previous research for other tasks, such as paired comparison (Czerlinski, Gigerenzer, & Goldstein, 1999; Martignon & Hoffrage, 2002) or estimation (Hertwig, Hoffrage, & Sparr, 2012; Woike, Hoffrage, & Hertwig, 2012), has shown that simple heuristics, when making inferences out of sample, can outperform models that are optimal when it comes to making inferences within the same sample for which the strategies fitted their parameters. We found the same pattern in our simulations: When making classifications out of sample with multiple cues, the two heuristic approaches, naïve Bayes and some of the fast-and-frugal trees, outperformed the model that was normative for the case of fitting known data, namely classification based on natural frequencies (or, equivalently, profile memorization).…”
Section: Discussion
confidence: 99%
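The fitting-versus-prediction distinction in the quoted passage can be illustrated with a minimal sketch. Note the caveats: this is not the authors' QEst or Zig-QEst; the heuristic below is a hypothetical cue-count strategy, the data are synthetic, and the point is only that a model fitted by least squares is optimal within its training sample yet need not win out of sample:

```python
import random

def fit_ols(X, y):
    """Ordinary least squares via the normal equations (an intercept is
    added here), solved with Gaussian elimination and partial pivoting."""
    rows = [[1.0] + list(x) for x in X]
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * k
    for i in reversed(range(k)):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, k))) / A[i][i]
    return w

def ols_predict(w, x):
    return w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))

def fit_cue_count(X, y):
    """Hypothetical QEst-style stand-in: estimate a quantity by the mean
    criterion of training items sharing the same number of positive cues."""
    table = {}
    for x, yi in zip(X, y):
        table.setdefault(sum(x), []).append(yi)
    means = {s: sum(v) / len(v) for s, v in table.items()}
    overall = sum(y) / len(y)
    return lambda x: means.get(sum(x), overall)

def mse(preds, y):
    return sum((p - t) ** 2 for p, t in zip(preds, y)) / len(y)

if __name__ == "__main__":
    random.seed(1)
    # Synthetic environment: 3 binary cues, noisy linear criterion.
    X = [[random.randint(0, 1) for _ in range(3)] for _ in range(100)]
    y = [2 + 3*a + 5*b + 7*c + random.gauss(0, 4) for a, b, c in X]
    Xtr, ytr, Xte, yte = X[:50], y[:50], X[50:], y[50:]
    w = fit_ols(Xtr, ytr)
    h = fit_cue_count(Xtr, ytr)
    print("OLS       train/test MSE:",
          mse([ols_predict(w, x) for x in Xtr], ytr),
          mse([ols_predict(w, x) for x in Xte], yte))
    print("Heuristic train/test MSE:",
          mse([h(x) for x in Xtr], ytr),
          mse([h(x) for x in Xte], yte))
```

Regression necessarily achieves the lowest squared error on the sample it was fitted to; the out-of-sample comparison is where a simpler, lower-variance strategy can catch up or win, which is the pattern the citing authors report.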
“…So how well do FFTs predict data relative to more complex algorithms when there are no computational restrictions? Prior research has suggested that FFTs can predict data as well as algorithms such as logistic regression and standard decision trees (Martignon et al., 2008; Woike, Hoffrage & Hertwig, 2012; Woike et al., 2017). However, logistic regression (in a non-regularised form) and standard decision… [Footnote 11: For this example, 100 FFTs were constructed from 100 simulations using 50% of the full heart disease data.]…”
Section: Prediction Simulation
confidence: 99%
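A fast-and-frugal tree of the kind discussed above can be sketched as an ordered list of cue checks, each with a single immediate exit. The cue names and thresholds below are hypothetical illustrations, not values from the heart-disease data in the citing paper:

```python
# A fast-and-frugal tree (FFT) asks cues in a fixed order. Each non-final
# cue can trigger an immediate exit on one side of its threshold; otherwise
# the next cue is consulted. The final fallthrough yields a default decision.
# Node: (cue, threshold, exit_side, decision), where exit_side "high" exits
# when value > threshold and "low" exits when value <= threshold.
HYPOTHETICAL_TREE = [
    ("st_depression", 2.0, "high", "high risk"),
    ("chest_pain_type", 3, "low", "low risk"),
    ("max_heart_rate", 140, "low", "high risk"),
]

def fft_classify(record, tree=HYPOTHETICAL_TREE, default="low risk"):
    for cue, threshold, exit_side, decision in tree:
        exceeded = record[cue] > threshold
        if (exit_side == "high") == exceeded:
            return decision  # immediate exit: no further cues consulted
    return default

patient = {"st_depression": 0.5, "chest_pain_type": 4, "max_heart_rate": 120}
print(fft_classify(patient))  # exits at the third cue
```

Because most items exit after one or two cue lookups, an FFT is frugal by construction, yet, as the quoted passage notes, its out-of-sample accuracy can rival that of logistic regression and standard decision trees.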
“…Heuristic categorization and the visualization of fast-and-frugal trees were performed according to established heuristic tree construction algorithms as previously described [ 36 , 37 ].…”
Section: Methods
confidence: 99%