2020
DOI: 10.1609/aaai.v34i04.6016
Empirical Bounds on Linear Regions of Deep Rectifier Networks

Abstract: We can compare the expressiveness of neural networks that use rectified linear units (ReLUs) by the number of linear regions, which reflects the number of pieces of the piecewise linear functions modeled by such networks. However, enumerating these regions is prohibitive, and the known analytical bounds are identical for networks with the same dimensions. In this work, we approximate the number of linear regions through empirical bounds based on features of the trained network and probabilistic inference. Our first …
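The idea of an empirical lower bound can be illustrated with a minimal sketch (not the paper's actual algorithm): each linear region of a ReLU network corresponds to a distinct activation pattern, so counting the distinct patterns hit by random input samples gives a probabilistic lower bound on the number of regions. The network sizes and weights below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-hidden-layer ReLU network with random weights.
W1 = rng.normal(size=(8, 2)); b1 = rng.normal(size=8)
W2 = rng.normal(size=(8, 8)); b2 = rng.normal(size=8)

def activation_pattern(x):
    """Binary pattern of which ReLUs fire; each distinct pattern
    corresponds to at most one linear region of the network."""
    h1 = W1 @ x + b1
    a1 = np.maximum(h1, 0.0)
    h2 = W2 @ a1 + b2
    return tuple((h1 > 0).astype(int)) + tuple((h2 > 0).astype(int))

# Distinct patterns among random samples -> lower bound on #regions.
samples = rng.uniform(-1, 1, size=(5000, 2))
patterns = {activation_pattern(x) for x in samples}
print(len(patterns))
```

More samples can only grow the set, so the count is a valid (if loose) lower bound; the paper's bounds are tighter because they use features of the trained network rather than uniform sampling alone.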

Cited by 32 publications (43 citation statements). References 25 publications.
“…However, enumerating solutions of a mixed-integer program can be computationally intractable. To address this issue, a probabilistic algorithm was proposed in [97] to produce lower bounds on the number of possible solutions.…”
Section: Mixed-Integer Formulations
Confidence: 99%
See 1 more Smart Citation
“…However, enumerating solutions of a mixed-integer program can be computationally intractable. To address this issue, a probabilistic algorithm was proposed in [97] to produce lower bounds to the number of possible solutions.…”
Section: ) Mixed-integer Formulationsmentioning
confidence: 99%
“…, a_K x + b_K y + c_K). (97) This consists of K planes of unknown slopes estimated by using K-means on the numerical gradients of the 2-D data, whereas the intercepts c_k are computed using the tropical fitting algorithm as in (93), which solves the unconstrained ℓ∞ problem. In this combined approach, the first step (K-means) is heuristic and probably yields a local minimum for the slope-estimation subproblem, whereas the second step (tropical regression for the intercepts) yields a global minimum, optimally solving the unconstrained ℓ∞ problem.…”
Section: Optimally Fitting Tropical Polynomial Curves and Surfaces
Confidence: 99%
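The two-step procedure in the quote can be sketched in 1-D (an assumed simplification of the cited 2-D method): a tiny K-means on numerical gradients estimates the slopes a_k, and the intercepts c_k are then chosen so the fitted max-plus polynomial is the greatest one lying below the data. The test function, sample counts, and K-means initialization are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-2, 2, 200))
# Convex piecewise-linear target: max of three lines.
f = np.maximum.reduce([-1.0 * x - 0.5, 0.2 * x, 1.5 * x - 1.0])

# Step 1 (heuristic): cluster numerical gradients with a tiny 1-D K-means.
slopes = np.gradient(f, x)
K = 3
centers = np.quantile(slopes, [0.1, 0.5, 0.9])  # assumed initialization
for _ in range(50):
    labels = np.argmin(np.abs(slopes[:, None] - centers[None, :]), axis=1)
    for k in range(K):
        if np.any(labels == k):
            centers[k] = slopes[labels == k].mean()
a = centers

# Step 2: intercepts c_k = min_i (f_i - a_k x_i) give the greatest
# max-plus polynomial p(x) = max_k (a_k x + c_k) lying below the data.
c = np.array([np.min(f - a_k * x) for a_k in a])
fit = np.max(a[None, :] * x[:, None] + c[None, :], axis=1)
print(np.max(np.abs(fit - f)))
```

By construction the fit never exceeds the data, mirroring the quote's point: the slope step is heuristic, while the intercept step is optimal given those slopes.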
“…Related Work. Prior approaches to optimization over neural networks using MIP formulations include [8,15,16,28,47,51]. These approaches primarily model the piecewise-linear ReLU constraint using standard big-M modeling tricks.…”
Section: Outline and Contributions
Confidence: 99%
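The big-M trick mentioned here can be written down directly: for a pre-activation a with known bounds L ≤ a ≤ U (L < 0 < U), the constraint y = max(0, a) is encoded with one binary variable z. A minimal sketch checking feasibility of the standard encoding (no MIP solver involved):

```python
# Big-M encoding of y = max(0, a) for a in [L, U] with L < 0 < U:
#   y >= a,  y >= 0,  y <= a - L*(1 - z),  y <= U*z,  z in {0, 1}.
# z = 1 selects the active piece (y = a), z = 0 the inactive one (y = 0).
def bigM_feasible(a, y, z, L, U):
    return (y >= a and y >= 0
            and y <= a - L * (1 - z)
            and y <= U * z)

L, U = -3.0, 5.0
for a in [-2.0, 0.5, 4.0]:
    y = max(0.0, a)
    z = 1 if a > 0 else 0
    assert bigM_feasible(a, y, z, L, U)
```

Tighter bounds L and U shrink the big-M relaxation, which is why much of this literature focuses on bound propagation through the network.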
“…This way, equivalent neural networks of smaller size can be obtained. The computation of linear regions in ReLU neural networks is another field of application [27].…”
Section: Related Work
Confidence: 99%