2017
DOI: 10.1049/iet-spr.2016.0544
Analysis of partial diffusion recursive least squares adaptation over noisy links

Abstract: Partial diffusion-based recursive least squares (PDRLS) is an effective method for reducing the computational load and power consumption of adaptive network implementations. In this method, each node shares a part of its intermediate estimate vector with its neighbors at each iteration. The PDRLS algorithm thus reduces internode communications relative to the full-diffusion RLS algorithm. This selection of estimate entries becomes more appealing when the information is fused over noisy links. In this paper, we study the ste…
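The core idea described in the abstract, sharing only a subset of estimate entries per iteration, can be sketched as follows. This is a minimal illustration of the partial-diffusion mechanism, not the authors' exact PDRLS recursion: the selection rule, the round-robin schedule, and the equal-weight combiner are all assumptions for illustration.

```python
import numpy as np

# Hedged sketch of partial diffusion: at each iteration a node transmits
# only L of the M entries of its intermediate estimate; for entries it
# does not receive, a neighbor substitutes its own values before combining.
rng = np.random.default_rng(0)
M, L = 8, 2                      # estimate length, entries shared per iteration

def select_entries(i, M, L):
    """Round-robin entry selection (an assumed schedule): returns an MxM
    0/1 diagonal selection matrix that picks L coordinates at iteration i."""
    idx = [(i * L + j) % M for j in range(L)]
    S = np.zeros((M, M))
    S[idx, idx] = 1.0
    return S

def combine(psi_self, psi_neighbor, S):
    """Fill the unshared coordinates of the neighbor's estimate with the
    node's own values, then average the two vectors (equal combiner weights
    are an assumption here)."""
    received = S @ psi_neighbor + (np.eye(M) - S) @ psi_self
    return 0.5 * (psi_self + received)

psi_k = rng.standard_normal(M)   # node k's intermediate estimate
psi_l = rng.standard_normal(M)   # neighbor l's intermediate estimate
S_i = select_entries(i=3, M=M, L=L)
print(combine(psi_k, psi_l, S_i))
```

With L = M this reduces to full diffusion (every entry is exchanged); with L < M each transmission carries only L entries, which is the communication saving the abstract refers to. Over noisy links the received entries would additionally carry perturbations, which is the effect the paper analyzes.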

Cited by 22 publications (13 citation statements)
References 41 publications (126 reference statements)
“…Remark 3. In order to better clarify (24), and to discuss in more detail the extraction of the MSD expression, we note that the kth block on the main diagonal of P_X is E{x_{k,i|i} x_{k,i|i}^T}. Multiplying P_X by I_k^{NM} gives the kth block of P_X.…”
Section: B. Mean Performance
confidence: 99%
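The block-extraction step in the quoted remark can be illustrated numerically. The sizes, block values, and selector construction below are hypothetical; the point is only that multiplying a block-diagonal covariance by a block-selection matrix isolates one node's diagonal block, whose trace gives that node's contribution to the network MSD.

```python
import numpy as np

# Illustration (hypothetical sizes): N nodes, estimates of length M,
# so the network covariance P is (N*M)x(N*M) and block-diagonal here.
N, M = 3, 2
blocks = [np.eye(M) * (k + 1) for k in range(N)]   # k-th block = (k+1)*I_M
P = np.zeros((N * M, N * M))
for k, B in enumerate(blocks):
    P[k*M:(k+1)*M, k*M:(k+1)*M] = B

def block_selector(k, N, M):
    """Selection matrix with I_M in the k-th diagonal block, zeros elsewhere."""
    Ik = np.zeros((N * M, N * M))
    Ik[k*M:(k+1)*M, k*M:(k+1)*M] = np.eye(M)
    return Ik

k = 1
Pk = block_selector(k, N, M) @ P    # zero except the k-th block row
msd_k = np.trace(Pk)                # equals the trace of the k-th diagonal block
print(msd_k)                        # → 4.0 (trace of 2*I_2)
```

Summing these per-node traces over k would assemble the network-wide MSD from the individual blocks, which is the extraction the remark describes.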
“…This issue motivates us to investigate the performance of the PDKF algorithm in such scenarios. Some useful results dealing with the effects of noisy links on the behavior of diffusion-based strategies are presented in [14]-[16], [24], [25]. In this paper, we examine how noisy links degrade the network performance during the exchange of weight estimates.…”
Section: Introduction
confidence: 99%
“…Moreover, the convergence rate would be "optimal" when P_{k,i} is chosen to be of the form in the paper. Furthermore, some existing methods can be used to reduce the communication complexity and to make the algorithm suitable for higher-dimensional signals, for example, event-driven methods [55], partial diffusion methods [21], [26], [27], and compressed methods [56].…”
Section: Algorithm 1 Distributed LS Algorithm
confidence: 99%
“…In [25], a diffusion bias-compensated LS algorithm was developed, and the closed-form expressions for the residual bias and the mean-square deviation of the estimates were provided under independence and stationarity assumptions. In addition, partial diffusion LS algorithms were proposed in [26], [27], and the performance results were established for ergodic signals [26] and independent signals [27]. Moreover, [28] proposed a reduced communication diffusion LS algorithm for distributed estimation over multi-agent, and [29] developed robust diffusion LS algorithms to mitigate the performance degradation in the presence of impulsive noise.…”
Section: Introduction
confidence: 99%
“…Some strategies, including incremental strategies [23], consensus strategies [24], diffusion strategies [25], and combinations of them [26], have been proposed to construct distributed algorithms. Based on these strategies, the performance analysis of distributed estimation algorithms has been investigated, for example, consensus-based least mean squares (LMS) (e.g., [27], [28]), the diffusion stochastic gradient descent algorithm [29], the diffusion Kalman filter (e.g., [30], [31]), diffusion least squares (LS) (e.g., [32]-[34]), and the diffusion forgetting-factor recursive least squares [35]. Most of the corresponding theoretical results are established under independence, stationarity, or Gaussian assumptions on the regression vectors, due to the mathematical difficulty of analyzing products of random matrices.…”
confidence: 99%