2019 IEEE 29th International Workshop on Machine Learning for Signal Processing (MLSP)
DOI: 10.1109/mlsp.2019.8918878
Randnet: Deep Learning with Compressed Measurements of Images

Abstract: Principal component analysis, dictionary learning, and autoencoders are all unsupervised methods for learning representations from a large amount of training data. In all these methods, the higher the dimensions of the input data, the longer it takes to learn. We introduce a class of neural networks, termed RandNet, for learning representations using compressed random measurements of data of interest, such as images. RandNet extends the convolutional recurrent sparse auto-encoder architecture to dense networks…

Cited by 11 publications (8 citation statements)
References 10 publications (31 reference statements)
“…Deep unfolding was originally proposed by Gregor and LeCun in [8], where a deep architecture was designed to learn to carry out the iterative soft-thresholding algorithm (ISTA) for sparse recovery. Deep-unfolded networks have since been applied in image denoising [7], [39], [40], sparse recovery [9], [25], [41], dictionary learning [42], [43], communications [19], [44]-[47], ultrasound [48], [49], and super-resolution [50]-[52]. A recent review can be found in [7].…”
Section: A. Deep Unfolding
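The ISTA iteration that these unfolded networks learn to imitate can be sketched as follows. In a LISTA-style unfolding, each iteration of the loop below becomes one network layer whose matrices and threshold are trained rather than computed from the dictionary. The dictionary, regularization weight, and toy data here are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(x, theta):
    """Elementwise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def ista(y, D, lam=0.1, n_iter=100):
    """Plain ISTA for sparse coding: min_z 0.5*||y - D z||^2 + lam*||z||_1.

    Unfolding turns each iteration into a layer; W_e, W_s, and the
    threshold lam/L then become learnable per-layer parameters.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    W_e = D.T / L                          # "encoder" matrix
    W_s = np.eye(D.shape[1]) - D.T @ D / L # "state transition" matrix
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = soft_threshold(W_e @ y + W_s @ z, lam / L)
    return z

# Toy check: recover a 2-sparse code from noiseless measurements.
rng = np.random.default_rng(0)
D = rng.standard_normal((30, 50))
D /= np.linalg.norm(D, axis=0)             # unit-norm dictionary atoms
z_true = np.zeros(50)
z_true[[3, 17]] = [1.5, -2.0]
y = D @ z_true
z_hat = ista(y, D, lam=0.05, n_iter=200)
```

The two largest-magnitude entries of `z_hat` land on the true support, illustrating the sparse-recovery behavior that each unfolded layer approximates.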
“…This method's novelty lies in the hardware-efficient design of the operator Φ, which captures data information during compression. This architecture reduces to RandNet [19] when Φ and Cs have no structure (e.g., Toeplitz) and c = 1.…”
Section: Network Architecture
“…Recent works proposed model-based neural networks to address computational efficiency [17, 18], but these still require full measurements for recovery. To enable compression, Chang et al. [19] proposed an autoencoder, called RandNet, for dictionary learning. In that work, compression is achieved by projecting data into a lower-dimensional space through a data-independent unstructured random matrix.…”
Section: Introduction
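The compression step described above — projecting data through a data-independent unstructured random matrix — can be sketched as follows. The dimensions, scaling, and variable names are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 28 * 28   # dimension of a vectorized image (MNIST-sized, for illustration)
m = 200       # number of compressed random measurements, m << n

# Data-independent, unstructured random sensing matrix: i.i.d. Gaussian
# entries, scaled so squared norms are preserved in expectation.
A = rng.standard_normal((m, n)) / np.sqrt(m)

x = rng.standard_normal(n)   # stand-in for a vectorized image
y = A @ x                    # compressed measurements fed to the network

# Johnson-Lindenstrauss-style check: the projection roughly preserves norms,
# which is what makes learning from y instead of x plausible.
ratio = np.linalg.norm(y) / np.linalg.norm(x)
```

Because `A` is fixed and independent of the data, the same matrix compresses every training image, and only the low-dimensional measurements `y` need to be stored or transmitted.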