2019
DOI: 10.1007/978-3-030-20867-7_24
Max-Plus Operators Applied to Filter Selection and Model Pruning in Neural Networks

Abstract: Following recent advances in morphological neural networks, we propose to study in more depth how Max-plus operators can be exploited to define morphological units and how they behave when incorporated in layers of conventional neural networks. Besides showing that they can be easily implemented with modern machine learning frameworks, we confirm and extend the observation that a Max-plus layer can be used to select important filters and reduce redundancy in its previous layer, without incurring performance loss.
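As a concrete illustration of the abstract's claim that Max-plus units are easy to implement in modern frameworks, here is a minimal NumPy sketch of a Max-plus layer (the function name, shapes, and values are our assumptions for illustration, not the paper's code):

```python
import numpy as np

def maxplus_layer(x, w):
    # Max-plus unit: each output is a "tropical" dot product,
    #   y[j] = max_i (x[i] + w[i, j]),
    # i.e. addition plays the role of multiplication and max the role of sum.
    # x: (n,) input activations, w: (n, m) weights -> y: (m,)
    return np.max(x[:, None] + w, axis=0)

x = np.array([0.0, 1.0, -2.0])
w = np.array([[0.0, -1.0],
              [2.0,  0.0],
              [1.0,  3.0]])
y = maxplus_layer(x, w)  # -> [3., 1.]
```

Note that a strongly negative weight `w[i, j]` effectively disconnects input `i` from output `j`, which is consistent with the filter-selection behaviour the abstract describes.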

Cited by 25 publications (17 citation statements)
References 18 publications
“…A possible improvement would be to impose greater sparsity on the dictionary images with an appropriate regularization. Additionally, using a morphological layer [3,21,32] as a decoder may be more consistent with our definition of part-based approximation, since a representation in the (max, +) algebra would by essence commute with the morphological dilation.…”
Section: Discussion
confidence: 69%
“…Hence, we take advantage of the recent advances in deep, sparse and non-negative auto-encoders to design a new framework able to learn part-based representations of an image database, compatible with morphological processing. To that end, this work is part of the resurgent research line investigating interactions between deep learning and mathematical morphology [9,22,23,27,32]. However, with respect to these studies, which focus mainly on introducing morphological operators in neural networks, the present paper addresses a different question.…”
Section: Introduction
confidence: 99%
“…10. Even though the morphological paradigm dates back almost 3 decades [91], [92], [103], [113], a recent resurgence of interest has led to new developments; for example, it was recently shown [83], [115] that a morphological neural network with a hidden layer consisting of dilations and erosions followed by a linear layer is a universal approximator. In a more recent publication [34], the authors focus on deep learning for image processing, treating all nonlinear operations (e.g., max-pooling) as trainable morphological operators to complement trainable convolutional operations and achieve competitive results in tasks, such as boundary detection using considerably fewer parameters than other architectures.…”
Section: Tropical Polynomial Division and Network Simplification
confidence: 99%
“…The name "tropical semiring" initially referred to the min-plus semiring and was used in finite automata [57], [99], speech recognition using graphical models [82], and tropical geometry [68], [80]. However, nowadays, the term, tropical semiring, may refer to both the max-plus and its dual min-plus arithmetic, whose combinations with corresponding nonlinear matrix algebra and nonlinear signal convolutions have been used in operations research and scheduling [25]; discrete event systems, max-plus control, and optimization [1], [2], [6], [15], [22], [37], [39], [48], [78], [110]; convex analysis [65], [85], [94]; morphological image analysis [49], [73], [79], [95], [96]; nonlinear difference equations for distance transforms [11], [71]; nonlinear PDEs of the Hamilton-Jacobi type for vision scale spaces [14], [50]; speech recognition and natural language processing [56], [82]; neural networks [18], [19], [34], [40], [83], [89], [93], [103], [114], [115]; and idempotent mathematics (nonlinear functional analysis) [63], [64].…”
Section: Introduction
confidence: 99%
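As an illustration of the (max, +) arithmetic referred to in the quote above, the following sketch (function name and values are ours) implements the tropical matrix product in NumPy:

```python
import numpy as np

def maxplus_matmul(A, B):
    # Matrix product in the (max, +) semiring: ordinary multiplication
    # is replaced by addition, and summation by maximum:
    #   C[i, j] = max_k (A[i, k] + B[k, j])
    # A: (n, k), B: (k, m) -> C: (n, m)
    return np.max(A[:, :, None] + B[None, :, :], axis=1)

A = np.array([[0.0, 1.0],
              [2.0, 0.0]])
B = np.array([[0.0, 3.0],
              [1.0, 0.0]])
C = maxplus_matmul(A, B)  # -> [[2., 3.], [2., 5.]]
```

The same max-of-sums pattern underlies morphological dilation and the max-plus convolutions mentioned in the quote, with min-plus being the dual (erosion-like) case.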
“…Works [30,34,35] in the first category investigate the use of the perceptron in the (max, +) and (min, +) algebras. The morphological perceptron layer proposed in [35] is a fully connected layer, followed by a max-plus layer [36]: …”
Section: Morphological Operators in Perceptrons
confidence: 99%
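The architecture described in the quote above (a conventional fully connected layer followed by a max-plus layer) can be sketched as follows; the function name, shapes, and values are our assumptions for illustration, not code from the cited works:

```python
import numpy as np

def morphological_perceptron(x, W, b, M):
    # Conventional affine (fully connected) layer: z = W x + b
    z = W @ x + b
    # Max-plus (dilation-like) layer: y[j] = max_i (z[i] + M[i, j])
    return np.max(z[:, None] + M, axis=0)

x = np.array([1.0, 2.0])
W = np.eye(2)
b = np.zeros(2)
M = np.array([[0.0, -1.0],
              [0.0,  2.0]])
y = morphological_perceptron(x, W, b, M)  # -> [2., 4.]
```

Each output unit keeps only the single most active (shifted) input, which is why such a layer can act as a filter selector for the preceding layer.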