2009 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2009)
DOI: 10.1109/icassp.2009.4960320
Structured least squares with bounded data uncertainties

Abstract: In many signal processing applications the core problem reduces to a linear system of equations. Coefficient matrix uncertainties create a significant challenge in obtaining reliable solutions. In this paper, we present a novel formulation for solving a system of noise-contaminated linear equations while preserving the structure of the coefficient matrix. The proposed method has advantages over the known Structured Total Least Squares (STLS) techniques in utilizing additional information about the uncertainties…
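The abstract stops short of the formulation details, but the underlying idea of exploiting a bound on a structured coefficient-matrix uncertainty can be illustrated generically. The sketch below is not the paper's algorithm; it assumes a hypothetical toy model A(δ) = A0 + δ·A1 with a single perturbation parameter bounded by |δ| ≤ ρ, and minimizes the worst-case residual over that bound.

```python
# Illustrative sketch only -- NOT the paper's method. It shows the generic
# min-max idea behind robust structured least squares with a bounded
# uncertainty: the coefficient matrix is affine in a single structured
# perturbation parameter delta with |delta| <= rho (an assumed toy model).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy structured model: A(delta) = A0 + delta * A1, where A1 fixes which
# entries move together (the "structure") and rho bounds the uncertainty.
A0 = rng.standard_normal((8, 3))
A1 = 0.1 * np.ones((8, 3))           # hypothetical structure matrix
rho = 0.5                            # assumed uncertainty bound
x_true = np.array([1.0, -2.0, 0.5])
b = A0 @ x_true + 0.05 * rng.standard_normal(8)

deltas = np.linspace(-rho, rho, 101)  # grid for the inner maximization

def worst_case_residual(x):
    """max over |delta| <= rho of ||(A0 + delta*A1) x - b||_2 (grid search)."""
    return max(np.linalg.norm((A0 + d * A1) @ x - b) for d in deltas)

# Outer minimization over x of the worst-case residual.
x_ls = np.linalg.lstsq(A0, b, rcond=None)[0]         # ordinary LS start
res = minimize(worst_case_residual, x_ls, method="Nelder-Mead")

print("ordinary LS estimate :", x_ls)
print("min-max (robust) estimate:", res.x)
```

Ordinary least squares ignores the perturbation entirely; the min-max estimate trades some nominal fit for robustness over the whole bounded uncertainty set.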

Cited by 6 publications (4 citation statements)
References 6 publications (11 reference statements)
“…Plugging this inequality into (71), as well as the strong convexity inequality $\langle \Delta,\, g_{y_t^*} - g_{z^*} \rangle \ge \frac{1}{\mu}\|\Delta\|_2^2$, we obtain that…”
Section: E. Proof of Theorem
confidence: 99%
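For context, the strong-convexity inequality invoked in this excerpt (equation (71) of the citing paper is not reproduced here) has the standard form below; whether the modulus appears as μ or 1/μ depends on the citing paper's conventions for g and μ, which the fragment does not state.

```latex
% Strong convexity of a differentiable function f with modulus m, written with
% Delta = y - z and gradient map g = \nabla f; the quoted step applies an
% inequality of this form with Delta = y_t^* - z^*.
\[
  \langle \Delta,\; g_y - g_z \rangle
  \;=\; \langle y - z,\; \nabla f(y) - \nabla f(z) \rangle
  \;\ge\; m\,\|y - z\|_2^2
  \;=\; m\,\|\Delta\|_2^2 .
\]
```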
“…If x is fixed then there exist many algorithms to solve for a sparse p [12]. Therefore, a local optimum can be found using an alternating minimization algorithm [13], where we chose Orthogonal Matching Pursuit (OMP) [14] in the intermediate step for its simplicity:…”
Section: III-A. Alternating Minimization Algorithm to Solve P0
confidence: 99%
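A minimal sketch of the alternating scheme described in this statement, assuming a hypothetical bilinear model b ≈ (A0 + Σ_k p_k A_k) x with a sparse p (the citing paper's exact model is not reproduced here): with p fixed, x is a least-squares solution; with x fixed, the residual is linear in p, and a sparse p is recovered by OMP, here via scikit-learn.

```python
# Hedged sketch of the alternating-minimization idea quoted above; the
# bilinear model b ~ (A0 + sum_k p_k * A_k) x with a sparse p is an ASSUMED
# toy example, not necessarily the citing paper's exact formulation.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
m, n, K = 20, 4, 10
A0 = rng.standard_normal((m, n))
A_list = [rng.standard_normal((m, n)) for _ in range(K)]

# Ground truth: a 2-sparse p and a dense x.
p_true = np.zeros(K); p_true[[2, 7]] = [0.8, -0.5]
x_true = rng.standard_normal(n)
b = (A0 + sum(p_true[k] * A_list[k] for k in range(K))) @ x_true

p = np.zeros(K)
for it in range(10):
    # Step 1: with p fixed, x is an ordinary least-squares solution.
    A = A0 + sum(p[k] * A_list[k] for k in range(K))
    x = np.linalg.lstsq(A, b, rcond=None)[0]

    # Step 2: with x fixed, the residual is linear in p, so solve for a
    # sparse p with OMP (the intermediate step named in the statement).
    Phi = np.column_stack([Ak @ x for Ak in A_list])   # m x K dictionary
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=2, fit_intercept=False)
    omp.fit(Phi, b - A0 @ x)
    p = omp.coef_

print("recovered support:", np.nonzero(p)[0], " true support: [2 7]")
```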
“…In [24], it is shown that for high SNR the covariance matrix of the STLS estimator can be approximated by (5). If the matrix has a large condition number, the variance can be extremely large. It is usually noted in applications that at low SNR the error variance is even larger than its approximation in (5) [25], [26].…”
Section: B. Regularized Structured Total Least Squares Approach
confidence: 99%
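The effect mentioned here, that a large condition number inflates the estimator variance, can be checked numerically. The Monte Carlo sketch below uses ordinary least squares rather than STLS (an intentional simplification), since the dependence of the error covariance σ²(AᵀA)⁻¹ on conditioning is qualitatively similar.

```python
# Hedged numerical illustration (not from the cited papers): for b = A x + noise,
# the LS estimator covariance is sigma^2 (A^T A)^{-1}, so an ill-conditioned A
# inflates the error variance.
import numpy as np

rng = np.random.default_rng(2)
x_true = np.array([1.0, -1.0])
sigma = 0.01

def mc_error_variance(A, trials=2000):
    """Monte Carlo estimate of E||x_hat - x_true||^2 for the LS estimator."""
    errs = []
    for _ in range(trials):
        b = A @ x_true + sigma * rng.standard_normal(A.shape[0])
        x_hat = np.linalg.lstsq(A, b, rcond=None)[0]
        errs.append(np.sum((x_hat - x_true) ** 2))
    return np.mean(errs)

A_good = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
A_bad  = np.array([[1.0, 1.0], [1.0, 1.001], [1.0, 0.999]])  # nearly collinear

for name, A in [("well-conditioned", A_good), ("ill-conditioned", A_bad)]:
    print(f"{name}: cond(A) = {np.linalg.cond(A):9.1f}, "
          f"mean squared error = {mc_error_variance(A):.2e}")
```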
“…Consider the single-parameter equation below (24). The corresponding structures are given in (25). Define the cost given by (26), which corresponds to a constant multiple of the negative log-likelihood for the observation, where the noise is a zero-mean Gaussian random variable. Fig.…”
Section: Analysis of Estimator Performance in an Illustrative Example
confidence: 99%
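A hypothetical single-parameter example in the same spirit (the paper's equations (24)–(26) are not shown in this excerpt): with additive zero-mean Gaussian noise, the squared residual equals 2σ² times the negative log-likelihood up to an additive constant, so minimizing it over the scalar parameter gives the maximum-likelihood estimate.

```python
# Hedged sketch (hypothetical single-parameter example, not the paper's
# equations (24)-(26)): under zero-mean Gaussian noise, the squared residual
# is a constant multiple of the negative log-likelihood up to an additive
# constant, so its minimizer is the ML estimate of the parameter.
import numpy as np

rng = np.random.default_rng(3)
a0 = np.array([1.0, 2.0, -1.0])      # assumed known structure vectors
a1 = np.array([0.5, -1.0, 1.5])
theta_true, sigma = 0.7, 0.1

y = a0 + theta_true * a1 + sigma * rng.standard_normal(3)   # observation

def cost(theta):
    """J(theta) = ||y - (a0 + theta*a1)||^2, i.e. 2*sigma^2 times the
    negative log-likelihood plus a theta-independent constant."""
    r = y - (a0 + theta * a1)
    return r @ r

grid = np.linspace(-2, 2, 4001)
theta_hat = grid[np.argmin([cost(t) for t in grid])]
print(f"true theta = {theta_true}, ML estimate = {theta_hat:.3f}")
```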