2011
DOI: 10.1109/tcsi.2011.2142830

Frequency-Selective Noise-Compensated Autoregressive Estimation

Abstract: This paper presents a novel method for noise-compensated autoregressive estimation founded on maximum-likelihood estimation of the spectral samples. This framework yields a nonlinear optimization problem that can be recast as a reweighted least-squares problem. The resulting spectral weighting function turns out to be the square of the Wiener filter, meaning that spectral regions with a higher signal-to-noise ratio are more relevant in the estimation. Furthermore, this frequency-selective scenario allows us to int…
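The abstract's central point — that maximum-likelihood estimation over spectral samples reduces to a reweighted least-squares problem whose per-bin weight is the square of the Wiener filter — can be illustrated with a minimal sketch. Function and variable names here are illustrative, not the paper's notation:

```python
import numpy as np

# Hedged sketch of the spectral weighting described in the abstract:
# each spectral sample is weighted by the SQUARE of the Wiener filter,
# so high-SNR bins dominate the estimation and noisy bins are damped.
def wiener_squared_weights(signal_psd, noise_psd):
    """Per-bin weights: (S / (S + N))**2, the squared Wiener filter."""
    wiener = signal_psd / (signal_psd + noise_psd)
    return wiener ** 2

# Two bins: one at 20 dB SNR, one at -10 dB SNR.
s = np.array([100.0, 0.1])
n = np.array([1.0, 1.0])
w = wiener_squared_weights(s, n)
```

For the 20 dB bin the weight is close to 1, while for the -10 dB bin it is close to 0, which is how the method makes spectral regions with higher signal-to-noise ratio more relevant in the estimation.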

Cited by 5 publications (10 citation statements)
References 29 publications (100 reference statements)
“…While preliminary results are encouraging, a formal criterion for the estimation of the stochastic energy and the CGC weights is necessary. The author has recently explored the noise-compensated estimation of an autoregressive model [11] with the maximum likelihood criterion, which could be extended with success to the scenario of concern in the present paper.…”
Section: Discussion
“…In summary, the proposed approach can be treated as an adaptive CGC dictionary in which the search for the N best dictionary components g n (t) is accomplished with the optimization problems (10) and (11). The three problems (10)-(12) solved sequentially represent the basic iteration of the proposed method for CGC signal decomposition:…”
Section: Iterative Algorithm Summary
“…The probability density function of each sample can thus be written as (9). Note that, as (7) and (4) show, it carries the parametric dependence on the AR parameters. The log-likelihood of the set, present in (8), is thus (10). The gradient of the log-likelihood for the k-th sample is equal to expression (11), where the weighting term corresponds to the Wiener filter (12) and (13). The objective of this section is to deduce the asymptotic FIM, defined in [6] as (14), where (15). Based on the expansion (10), and given that the expected value of the information gradient (11) vanishes, the previous term (15) can be expanded and simplified to (16). By substituting the information gradient (11) into (16), this term becomes (17). The complex Gaussian random variable fulfills (18), which allows us to simplify (17) to (19). On the other hand, the factor can be expanded to (20)…”
Section: Asymptotic Fisher Information
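The quoted derivation references equations (9)-(20) that are not reproduced in this excerpt. Under an assumed Whittle-type spectral model (an assumption on my part, not the paper's exact equations), the asymptotic FIM takes the form FIM[i,j] ≈ Σ_k (∂P_k/∂θ_i)(∂P_k/∂θ_j) / S_k², i.e. the noiseless score weighted by the squared Wiener filter P_k/S_k. A minimal numerical sketch:

```python
import numpy as np

# Hedged sketch: asymptotic Fisher information for spectral-domain ML
# with additive noise. Assumed model (not the paper's equations):
# total PSD S_k = P_k(theta) + noise_psd, and
#   FIM[i, j] = sum_k dP_k/dtheta_i * dP_k/dtheta_j / S_k**2.
def asymptotic_fim(dP, P, noise_psd):
    """dP: (n_params, n_bins) gradient of the signal PSD w.r.t. theta;
    P: (n_bins,) signal PSD; noise_psd: scalar noise floor."""
    S = P + noise_psd   # total PSD per bin
    G = dP / S          # per-bin score contribution
    return G @ G.T      # sum over bins -> (n_params, n_params)

# One-parameter toy check: P = [1, 1], dP/dtheta = [1, 1], noise = 1,
# so each bin contributes (1/2)**2 and fim[0, 0] == 0.5.
fim = asymptotic_fim(np.array([[1.0, 1.0]]), np.array([1.0, 1.0]), 1.0)
```

Note that as the noise floor grows, S_k grows while dP_k stays fixed, so every FIM entry shrinks — consistent with the expectation that noisier spectra carry less information about the AR parameters.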
“…In the most relevant works published in the last decade [8]-[11] (among others, omitted here for brevity), performance evaluation is carried out through comparison with previous methods. Although uncommon, other works [16] suggest the use of the noiseless bound (28) as a valid lower bound for this problem.…”
Section: Asymptotic Cramér-Rao Bound