ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp43922.2022.9746186

Channel Redundancy and Overlap in Convolutional Neural Networks with Channel-Wise NNK Graphs

Abstract: Feature spaces in the deep layers of convolutional neural networks (CNNs) are often very high-dimensional and difficult to interpret. However, convolutional layers consist of multiple channels that are activated by different types of inputs, which suggests that more insight may be gained by studying the channels and how they relate to each other. In this paper, we first theoretically analyze channel-wise non-negative kernel (CW-NNK) regression graphs, which allow us to quantify the overlap between channels and…
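As a rough illustration of the abstract's idea, the sketch below builds one NNK graph per channel over a common set of inputs and scores the overlap between two channels as the Jaccard similarity of their edge sets. The Gaussian kernel, the bandwidth heuristic, the neighborhood size k, and the Jaccard measure are assumptions made for illustration; this is not claimed to be the paper's exact CW-NNK formulation.

```python
import numpy as np
from scipy.optimize import nnls

def nnk_edges(F, k=10, sigma=None, tol=1e-8):
    """Edge set of an NNK graph over the rows of F (one row per input).

    For each node, the k nearest neighbors are pruned by a non-negative
    least-squares fit on the Gaussian kernel; only neighbors that keep a
    non-zero weight contribute an edge. The NNLS form used here is a
    common stand-in for the exact non-negative quadratic program."""
    n = F.shape[0]
    d2 = ((F[:, None, :] - F[None, :, :]) ** 2).sum(-1)  # squared distances
    if sigma is None:
        # heuristic bandwidth (an assumption): mean distance to the k-th neighbor
        sigma = np.sqrt(np.sort(d2, axis=1)[:, k].mean())
    K = np.exp(-d2 / (2 * sigma ** 2))
    edges = set()
    for i in range(n):
        S = np.argsort(d2[i])[1:k + 1]              # k nearest neighbors of i
        theta, _ = nnls(K[np.ix_(S, S)], K[S, i])   # non-negative NNK weights
        edges.update((min(i, int(j)), max(i, int(j)))
                     for j, w in zip(S, theta) if w > tol)
    return edges

def channel_overlap(F_a, F_b, **kw):
    """Jaccard similarity of two channels' NNK edge sets, computed on the
    same batch of inputs -- one plausible overlap score, not necessarily
    the metric used in the paper."""
    E_a, E_b = nnk_edges(F_a, **kw), nnk_edges(F_b, **kw)
    return len(E_a & E_b) / max(len(E_a | E_b), 1)
```

Here F_a and F_b would be hypothetical per-channel feature matrices (one row per input, e.g., a flattened activation map from channels a and b of the same layer). Two channels whose graphs share most edges organize the inputs in nearly the same way, i.e., they are largely redundant.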

Cited by 7 publications (7 citation statements) | References 23 publications (33 reference statements)
“…The local intrinsic dimension of a manifold can be estimated as the number of neighbors selected by NNK. It was shown in [43,50] that the number of neighbors per polytope, i.e., n_i, correlates with the local dimension of the manifold around a data point i. This observation is consistent with geometric intuition: the number of NNK neighbors is decided based on the availability of data that span orthogonal directions, i.e., the local subspace of the manifold. …”
Section: Intrinsic Dimension Metric (supporting)
confidence: 68%
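A minimal sketch of the estimate described in this excerpt, under assumptions the quote does not fix (a Gaussian kernel, a bandwidth heuristic, and the common NNLS relaxation of the NNK objective): for each point i, solve the non-negative regression over its k nearest neighbors and count the surviving weights n_i.

```python
import numpy as np
from scipy.optimize import nnls

def nnk_dimension_proxy(X, k=15, sigma=None, tol=1e-8):
    """For each point i, count n_i: the number of neighbors that keep a
    non-zero weight after NNK regression over i's k nearest neighbors.
    Per the excerpt above, n_i correlates with the local intrinsic
    dimension of the manifold around point i."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
    if sigma is None:
        # heuristic bandwidth (an assumption): mean distance to the k-th neighbor
        sigma = np.sqrt(np.sort(d2, axis=1)[:, k].mean())
    K = np.exp(-d2 / (2 * sigma ** 2))                   # Gaussian kernel
    counts = np.empty(n, dtype=int)
    for i in range(n):
        S = np.argsort(d2[i])[1:k + 1]                   # k nearest neighbors of i
        # NNLS on the kernel sub-matrix, a standard surrogate for the
        # exact non-negative quadratic program that defines NNK weights.
        theta, _ = nnls(K[np.ix_(S, S)], K[S, i])
        counts[i] = int((theta > tol).sum())             # n_i
    return counts
```

By the excerpt's reasoning, points sampled from a low-dimensional subspace embedded in a higher-dimensional space should produce small n_i on average, since few neighbors are needed to span the local subspace.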
“…Garg et al. [42] also leveraged unlabeled data for early stopping but focused on providing a theoretical perspective on generalization. Bonet et al. [43] proposed a channel-wise early stopping method that uses hidden features in a convolutional neural network (CNN); the method targets CNN models in computer vision. Mahsereci et al. [25] proposed an evidence-based (EB) stopping criterion that uses the gradients of the training samples. …”
Section: Related Work (mentioning)
confidence: 99%
“…NNK has been shown to perform well in several machine learning tasks [15], in image representation [16], and in generalization estimation for neural networks [17]. Furthermore, NNK has been used to understand channel redundancy in convolutional neural networks (CNNs) [18] and to propose an early stopping criterion for them [19]. Graph properties (not necessarily based on NNK graphs) have also been proposed for understanding and interpreting deep neural network performance [20] and latent space geometry [21,22], and for improving model robustness [23]. …”
Section: Non-Negative Kernel (NNK) Regression Graphs (mentioning)
confidence: 99%