2013
DOI: 10.1007/978-3-642-44958-1_5

Falsification and Future Performance

Abstract: We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corolla…

Cited by 4 publications (6 citation statements)
References 18 publications (24 reference statements)
“…More specifically, effective information is the mutual information following an experimenter intervening to set a system or part of a system to maximum entropy. It quantifies the number of YES/NO questions required to produce an output from an input, thus measuring the ‘work’ the system does in selecting that output [9].…”
Section: Introduction
confidence: 99%
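The quoted definition of effective information — the mutual information that results after an intervention sets the input to maximum entropy — can be sketched for a finite discrete channel. This is a minimal illustration, not the cited papers' implementation; the channel matrix and the base-2 logarithm (so EI counts YES/NO questions, i.e. bits) follow the quote's framing.

```python
import numpy as np

def effective_information(channel):
    """Effective information of a discrete channel.

    `channel[i, j]` = P(output j | input i). Per the quoted definition,
    the input is intervened on and set to the maximum-entropy (uniform)
    distribution, and EI is the resulting mutual information
    I(input; output), measured in bits.
    """
    channel = np.asarray(channel, dtype=float)
    n_in = channel.shape[0]
    p_x = np.full(n_in, 1.0 / n_in)        # max-entropy intervention on input
    p_xy = p_x[:, None] * channel          # joint P(x, y)
    p_y = p_xy.sum(axis=0)                 # output marginal P(y)
    mask = p_xy > 0                        # skip zero terms in the sum
    indep = (p_x[:, None] * p_y[None, :])[mask]
    return float((p_xy[mask] * np.log2(p_xy[mask] / indep)).sum())

# A deterministic, invertible map on 4 states answers exactly two
# YES/NO questions about its input: EI = log2(4) = 2 bits.
print(effective_information(np.eye(4)))          # 2.0
# A channel that ignores its input does no selection: EI = 0.
print(effective_information(np.full((4, 4), 0.25)))
```

A uniform-output channel gives EI = 0 because the output carries no information about which input was selected, matching the "work the system does in selecting an output" reading.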
“…For example, the meaning of P_blue(that) = "that is blue" is the subset v A simple extension of possible world semantics from propositions to arbitrary functions is as follows (Balduzzi, 2011): D1 (semantics). Given function f: X → Y, the semantics or meaning of output y ∈ Y is the ordered pair of sets:…”
Section: Semantics and Representations
confidence: 99%
“…We show this holds for the well-studied special case of empirical risk minimization (ERM). Results are taken from [4], which should be consulted for details.…”
Section: Learning
confidence: 99%
“…It is easy to show that ei(E_{F,D}, 0) = ℓ − VC_F(D), where VC_F(D) is the empirical VC-entropy [4,12]. It follows with high probability that…”
Section: Learning
confidence: 99%
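The identity quoted above, ei(E_{F,D}, 0) = ℓ − VC_F(D), can be illustrated for a finite hypothesis class. This sketch assumes the standard reading of empirical VC-entropy as the log of the number of distinct labelings (dichotomies) the class induces on the dataset, and that ℓ counts one bit per binary-labeled datapoint; the cited paper [4] should be consulted for its exact normalization. The threshold classifiers and points below are invented for illustration.

```python
import math

def empirical_vc_entropy(hypotheses, data):
    """log2 of the number of distinct labelings (dichotomies) that a
    finite class of binary classifiers induces on a dataset."""
    dichotomies = {tuple(h(x) for x in data) for h in hypotheses}
    return math.log2(len(dichotomies))

# Toy example: threshold classifiers h_t(x) = [x > t] on 3 real points.
points = [0.0, 1.0, 2.0]
hyps = [lambda x, t=t: int(x > t) for t in (-0.5, 0.5, 1.5, 2.5)]

vc = empirical_vc_entropy(hyps, points)  # 4 dichotomies -> 2.0 bits
ell = len(points)                        # one bit of label per datapoint
print(ell - vc)                          # 1.0
```

Of the 2^ℓ = 8 conceivable labelings of the three points, the threshold class realizes only 4; the remaining bit (ℓ − VC_F(D) = 1) is what the learner falsifies, which is the counting interpretation the abstract describes.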