2021
DOI: 10.1111/1365-2478.13163

Low‐rank seismic data reconstruction and denoising by CUR matrix decompositions

Abstract: Low-rank reconstruction methods assume that noiseless and complete seismic data can be represented as low-rank matrices or tensors. Therefore, denoising and recovery of missing traces require a reduced-rank approximation of the data matrix/tensor. To calculate such approximation, we explore the CUR matrix decompositions, which use actual columns and rows of the data matrix, instead of the costly singular vectors derived from singular value decomposition. By allowing oversampling columns and rows, CUR decomposi…
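The abstract's central idea, building a reduced-rank approximation from actual columns and rows of the data matrix rather than from singular vectors, can be illustrated with a short sketch. The snippet below is a minimal NumPy illustration under assumptions made here (squared-norm sampling of columns and rows, an `oversample` parameter, and a least-squares middle factor `U`); it is not the authors' published algorithm.

```python
import numpy as np

def cur_approximation(A, rank, oversample=10, rng=None):
    """Low-rank approximation A ~ C @ U @ R built from actual columns/rows of A.

    Columns and rows are drawn with probabilities proportional to their
    squared norms, keeping `oversample` extra samples beyond the target rank,
    in the spirit of the oversampling mentioned in the abstract.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    k = rank + oversample

    # Sampling probabilities from squared column/row norms.
    col_p = np.sum(np.abs(A) ** 2, axis=0)
    col_p = col_p / col_p.sum()
    row_p = np.sum(np.abs(A) ** 2, axis=1)
    row_p = row_p / row_p.sum()

    cols = rng.choice(n, size=min(k, n), replace=False, p=col_p)
    rows = rng.choice(m, size=min(k, m), replace=False, p=row_p)

    C = A[:, cols]   # actual columns of the data matrix
    R = A[rows, :]   # actual rows of the data matrix
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)  # least-squares fit of the core
    return C, U, R

# Usage: reconstruct a noisy low-rank matrix standing in for a data slice.
rng = np.random.default_rng(0)
clean = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 150))
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
C, U, R = cur_approximation(noisy, rank=5)
denoised = C @ U @ R
print(np.linalg.norm(denoised - clean) / np.linalg.norm(clean))
```

Unlike a truncated SVD, the factors C and R here consist of actual entries of the data matrix, which is what makes CUR attractive when computing and storing singular vectors is costly.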

Cited by 7 publications (5 citation statements)
References 70 publications

Citation statements (ordered by relevance):

“…The often-neglected role of the rank, for instance, should be further investigated towards flexibility. Additionally, it is worth exploring less-costly alternatives to SVD, paying special attention to those not demanding the exact value of the rank (Carozzi and Sacchi, 2019; Cavalcante and Porsani, 2022).…”
Section: Discussion (mentioning)
confidence: 99%
“…It seems that SVD-based approaches lack flexibility regarding the choice of the appropriate rank: it should be the least possible value. This fact and the computational cost have motivated the search for alternatives to the SVD, such as randomized-SVD (Oropeza and Sacchi, 2011), Lanczos bidiagonalization (Gao et al., 2013), and CUR decompositions (Cavalcante and Porsani, 2022).…”
Section: More Numerical Experiments (mentioning)
confidence: 99%
“…Both methods have greatly enhanced the reconstruction performance and quality. Cavalcante and Porsani (2022) proposed the CUR matrix decomposition method to replace SVD for higher order tensors. The selection of the rank parameter of each tensor dimension has a significant impact on the reconstruction performance and time needed for this method.…”
Section: Introduction (mentioning)
confidence: 99%
“…The most traditional is the singular value decomposition (SVD). However, rank‐reduction can be more effective using Lanczos bidiagonalization, QR decomposition, randomized SVD (Liberty et al., 2007; Oropeza & Sacchi, 2010; Rokhlin et al., 2010), randomized QR decomposition (Chiron et al., 2014; Carozzi & Sacchi, 2017) and CUR decomposition (Cavalcante & Porsani, 2022; Manenti & Sacchi, 2022) as other engines for rank reduction.…”
Section: Introduction (mentioning)
confidence: 99%
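The randomized SVD mentioned in the last excerpt is another rank-reduction engine that avoids a full SVD. Below is a generic sketch of that idea (Gaussian range finder followed by a small exact SVD); the function name and the fixed oversampling and power-iteration parameters are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def randomized_rank_reduce(A, rank, oversample=10, n_iter=2, rng=None):
    """Return a rank-`rank` approximation of A via a randomized range finder."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    k = min(rank + oversample, min(m, n))

    # Probe the dominant column space of A with a Gaussian test matrix,
    # then sharpen the captured subspace with a few power iterations.
    Y = A @ rng.standard_normal((n, k))
    for _ in range(n_iter):
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)

    # Exact SVD of the small projected matrix, lift back, and truncate.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]
```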