ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2019.8682465
Block-randomized Stochastic Proximal Gradient for Constrained Low-rank Tensor Factorization

Cited by 6 publications (3 citation statements)
References 16 publications
“…Part of the work was submitted to ICASSP 2019 [35]. In this new version, we have additionally included detailed convergence proofs and the new adaptive stepsize based algorithm.…”
Section: Introduction (confidence: 99%)
“…Low-rankness is an important prior of the underlying HSI and has been widely used in image processing [47]- [56]. NGMeet unfolds the original HSI along the spectral dimension as a matrix and optimizes the matrix rank while the unfolding operation breaks the spatial structure.…”
Section: B. Proposed GNLR Model and Algorithm (confidence: 99%)
“…An alternate approach to speeding up CP computations is by reducing the tensor size either via sampling or compression. A large body of work exists for randomized tensor methods [5,11,43,56] which are recently being extended to the constrained problem [12,13]. The second approach is to first compress the tensor using a different decomposition, like Tucker, and then compute CP on this reduced array.…”
Section: Related Work (confidence: 99%)
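The second approach mentioned above (compress the tensor first, then run CP on the reduced array) can be sketched with a truncated HOSVD, a standard way to obtain a Tucker-form compression. This is an illustrative sketch under my own function names and sizes, not the cited papers' exact algorithms: a CP model fitted on the small core can afterwards be expanded back through the orthonormal factor matrices.

```python
import numpy as np

def mode_unfold(T, mode):
    """Matricize tensor T along the given mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd_compress(T, ranks):
    """Truncated HOSVD: orthonormal factors plus a small core tensor.

    CP can then be computed on the core instead of the full tensor
    (illustrative sketch of the compress-then-CP strategy).
    """
    factors = []
    for m, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(mode_unfold(T, m), full_matrices=False)
        factors.append(u[:, :r])          # leading left singular vectors
    core = T
    for m, U in enumerate(factors):
        # Contract mode m of the core with U^T (multilinear projection)
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, m)), 0, m)
    return core, factors

# Example: a 20x20x20 tensor with multilinear rank (3, 3, 3)
rng = np.random.default_rng(1)
G = rng.random((3, 3, 3))
Us = [np.linalg.qr(rng.random((20, 3)))[0] for _ in range(3)]
T = G
for m, U in enumerate(Us):
    T = np.moveaxis(np.tensordot(U, T, axes=(1, m)), 0, m)

core, factors = hosvd_compress(T, (3, 3, 3))
print(core.shape)  # a 3x3x3 core: CP can now be run on this tiny array
```

Because the example tensor has exact multilinear rank (3, 3, 3), expanding the core back through the factors reconstructs it exactly; for real data the truncation is lossy and the ranks trade accuracy for speed.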