2009
DOI: 10.1007/s10838-009-9091-3

Falsificationism and Statistical Learning Theory: Comparing the Popper and Vapnik-Chervonenkis Dimensions

Abstract: We compare Karl Popper's ideas concerning the falsifiability of a theory with similar notions from the part of statistical learning theory known as VC-theory. Popper's notion of the dimension of a theory is contrasted with the apparently very similar VC-dimension. Having located some divergences, we discuss how best to view Popper's work from the perspective of statistical learning theory, either as a precursor or as aiming to capture a different learning activity.
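
As a rough, illustrative aside (not taken from the paper): the VC-dimension of a hypothesis class is the size of the largest set of points the class can "shatter", i.e. label in every possible way. A minimal Python sketch, using an assumed toy class of threshold classifiers, makes the notion concrete:

    # Illustrative sketch (not from the paper): brute-force check of whether a
    # hypothesis class shatters a set of points. The threshold classifiers below
    # are an assumed toy example; their VC-dimension is 1.
    def shatters(points, hypotheses):
        """True if the hypotheses realise every 0/1 labelling of the points."""
        realised = {tuple(h(x) for x in points) for h in hypotheses}
        return len(realised) == 2 ** len(points)

    thresholds = [-1.0, 0.5, 1.5, 3.0]
    hypotheses = [lambda x, t=t: int(x >= t) for t in thresholds]

    print(shatters([1.0], hypotheses))       # True: a single point can receive both labels
    print(shatters([1.0, 2.0], hypotheses))  # False: no threshold labels 1.0 positive and 2.0 negative

Loosely speaking, this is the sense in which a low-dimensional class is "more falsifiable": a small number of data points suffices to produce a labelling that no hypothesis in the class can accommodate.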

Cited by 45 publications (26 citation statements)
References 6 publications
“…5.3. Some philosophers have argued that statistical learning provides a rigorous foundation for all inductive reasoning (Corfield et al 2009; Harman and Kulkarni 2007). Although we are sympathetic to this position, none of the proceeding analysis depends upon this thesis.…”
Section: Supervised Learning
mentioning
confidence: 99%
“…Popper's "degree of falsifiability" arguably anticipates the VC dimension. For a discussion, see Corfield et al (2009). [2] The completeness of the do-calculus relies on the causal Markov and faithfulness conditions, which together state (roughly) that statistical independence implies graphical independence and vice versa. Neither assumption has gone unchallenged.…”
mentioning
confidence: 99%
“…In 1986, the general structure of the learning machine was completed after research on the propagation technique. In the meantime, statistical learning theory [5] also emerged and developed substantially, producing the empirical risk minimization principle and ideas about algorithm complexity.…”
Section: Related Work
mentioning
confidence: 99%
“…Related work. The connection between Popper's ideas on falsifiability and statistical learning theory was pointed out in [5,7,14]. However, these works focus on the VC-dimension, which does not relate to falsification as directly as the VC-entropy and Rademacher complexity that we consider here.…”
Section: Introduction
mentioning
confidence: 98%
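
The last statement contrasts the VC-dimension with VC-entropy and Rademacher complexity. As a hedged illustration (an assumed sketch, not taken from the cited work), the empirical Rademacher complexity of a finite hypothesis class can be estimated by averaging, over random ±1 labellings, how well the best hypothesis in the class correlates with that random noise; a class that tracks random labels closely is, in this loose sense, harder to falsify:

    import random

    # Assumed toy setup: Monte Carlo estimate of the empirical Rademacher complexity
    # E_sigma[ sup_h (1/n) * sum_i sigma_i * h(x_i) ] for a finite class of
    # threshold classifiers mapping into {-1, +1}.
    def empirical_rademacher(points, hypotheses, n_draws=2000, seed=0):
        rng = random.Random(seed)
        n = len(points)
        total = 0.0
        for _ in range(n_draws):
            sigma = [rng.choice((-1, 1)) for _ in range(n)]
            total += max(sum(s * h(x) for s, x in zip(sigma, points)) / n
                         for h in hypotheses)
        return total / n_draws

    thresholds = [-1.0, 0.0, 1.0, 2.0]
    hypotheses = [lambda x, t=t: 1 if x >= t else -1 for t in thresholds]
    print(empirical_rademacher([0.2, 0.7, 1.4, 1.9], hypotheses))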