2014 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting
DOI: 10.1109/bmsb.2014.6873510
Big Data ‘Fork’: Tensor Product for DCT-II/DST-II/ DFT/HWT

Cited by 2 publications (1 citation statement)
References 14 publications
“…1. an input layer with P nodes; 2. a flatten layer; 3. a DCT transform layer with P nodes using fixed parameters, i.e., the Kronecker-product form of the sparse transform matrix Q = R ⊗ R applied to block vectors, where R is the discrete cosine transform (DCT) matrix applied to block matrices [12]; 4. a trainable compressed-sensing layer with PO nodes, O ≪ 1; …”
Section: B. Class-specific Neural Network for Block Compressed Sensing
Confidence: 99%
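The Kronecker-product form mentioned in the citation statement rests on the standard identity that a separable 2-D transform R X Rᵀ on a block matrix X equals (R ⊗ R) applied to the flattened block vector. The sketch below illustrates this equivalence; it is not the cited paper's code, and it assumes the orthonormal DCT-II as provided by SciPy:

```python
import numpy as np
from scipy.fft import dct

# Build the orthonormal DCT-II matrix R of size n x n.
# Applying dct to the identity column-wise yields R such that R @ x == dct(x).
n = 4
R = dct(np.eye(n), axis=0, norm="ortho")

# A random n x n block (stand-in for an image block).
X = np.random.default_rng(0).random((n, n))

# 2-D DCT applied to the block matrix.
Y_matrix = R @ X @ R.T

# Same transform via the Kronecker product Q = R ⊗ R on the block vector.
# With NumPy's row-major reshape, kron(R, R) @ vec(X) == vec(R @ X @ R.T).
Q = np.kron(R, R)
Y_vector = Q @ X.reshape(-1)

assert np.allclose(Y_matrix.reshape(-1), Y_vector)
```

This is why a fixed-parameter DCT layer over block vectors can be expressed as a single sparse matrix Q = R ⊗ R rather than an explicit 2-D transform per block.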