2015
DOI: 10.1049/iet-spr.2014.0188
Derivation and analysis of incremental augmented complex least mean square algorithm

Abstract: In this paper the authors propose an adaptive estimation algorithm for in-network processing of complex signals over distributed networks. In the proposed algorithm, called the incremental augmented complex least mean square (IAC-LMS) algorithm, the nodes of the network collaborate via an incremental cooperation mode to exploit the spatial dimension, while at the same time being equipped with LMS learning rules to endow the network with adaptation. The authors have extracted closed-form expressions that show…
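The abstract combines two ingredients: a widely linear (augmented) complex LMS update at each node, and an incremental cycle in which the running estimate visits the nodes one after another. The sketch below illustrates that combination under assumed settings (network size, filter length, step size, and signal model are all illustrative, not taken from the paper); the augmented regressor stacks the input with its conjugate so the filter can model non-circular signals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: N nodes in a cycle, M-tap complex regressors per node.
N, M, T = 5, 4, 2000
mu = 0.01

# Assumed widely linear "true" model: d = h^H u + g^H conj(u) + noise.
h_true = rng.standard_normal(M) + 1j * rng.standard_normal(M)
g_true = 0.3 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

# Augmented weight vector w = [h; g] acts on the augmented regressor a = [u; conj(u)].
w = np.zeros(2 * M, dtype=complex)

for t in range(T):
    # One incremental cycle: the current estimate is passed node to node.
    for k in range(N):
        u = rng.standard_normal(M) + 1j * rng.standard_normal(M)
        a = np.concatenate([u, np.conj(u)])      # augmented regressor
        d = (np.vdot(h_true, u) + np.vdot(g_true, np.conj(u))
             + 0.01 * (rng.standard_normal() + 1j * rng.standard_normal()))
        e = d - np.vdot(w, a)                    # local a priori error
        w = w + mu * a * np.conj(e)              # complex LMS update
```

After enough cycles the augmented weight vector recovers both the linear part `h_true` and the conjugate-linear part `g_true`; a strictly linear filter would miss the latter for non-circular inputs.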

Cited by 31 publications (25 citation statements)
References 35 publications (66 reference statements)
“…Hence, cooperation should be limited to secondary users that belong to a similar cluster. In distributed learning problems [30][31][32][33][34][35], the nodes of the network know their neighbourhoods, but they do not know which subsets of their neighbours belong to the same cluster. In order to derive a method for solving the multitask learning problem, we define the N × N optimal neighbourhood matrix A o , where the entry a lk o is set to a non-zero number if secondary user k discovers that its adjacent secondary user l belongs to the same cluster…”
Section: Multitask Spectrum Learning
confidence: 99%
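The optimal neighbourhood matrix described above can be sketched concretely. In this illustration the topology, cluster labels, and uniform row normalisation are all assumptions for the example, not details from the cited paper: an entry is non-zero only when two nodes are adjacent and share a cluster.

```python
import numpy as np

# Assumed example: 6 nodes on a ring with self-loops, known cluster labels.
N = 6
cluster = np.array([0, 0, 1, 1, 0, 1])        # assumed cluster of each node

# Ring adjacency: node k is adjacent to k-1, itself, and k+1.
adj = np.eye(N, dtype=bool)
for k in range(N):
    adj[k, (k + 1) % N] = adj[k, (k - 1) % N] = True

# a_lk^o is non-zero only when the nodes are adjacent AND in the same cluster.
A_o = np.where(adj & (cluster[:, None] == cluster[None, :]), 1.0, 0.0)
A_o /= A_o.sum(axis=1, keepdims=True)         # uniform combination weights
```

Each row then sums to one, so A_o can serve directly as a combination matrix that never mixes estimates across clusters.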
“…In this section, we investigate the performance of the proposed algorithm (30) and (31) with the correntropy cooperation policy (39) for the multitask learning problem summarised in Table 1. For the initial iterations (i < T s ) the algorithm operates in a non-cooperative manner, and thus each secondary user k of the network tracks its optimum vector w k o without any cooperation…”
Section: Performance Analysis
confidence: 99%
“…The statistics in the LMS are estimated continuously; it is therefore an adaptive filter belonging to the family of stochastic gradient methods. Extensive research has been devoted to optimisation of the LMS algorithm by numerous researchers [9,18,19,20,21]. LMS is applied in diverse applications such as plant identification [8], noise cancellation [22], echo cancellation [23], ECG signal analysis [24], and time-series prediction [25].…”
Section: Least Mean Square
confidence: 99%
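The plant-identification application mentioned in this excerpt is the textbook use of LMS and can be sketched in a few lines. The plant coefficients, step size, and noise level below are assumptions chosen for illustration: the filter sees only the input and the noisy plant output, and the stochastic-gradient update pulls its weights toward the unknown plant.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed unknown FIR plant to be identified from input/output data.
plant = np.array([0.5, -0.3, 0.2])
M = len(plant)
mu = 0.05                               # step size (assumed)
w = np.zeros(M)                         # adaptive filter weights

x = rng.standard_normal(5000)           # white excitation input
for n in range(M, len(x)):
    u = x[n - M:n][::-1]                # regressor, most recent sample first
    d = plant @ u + 0.01 * rng.standard_normal()   # noisy plant output
    e = d - w @ u                       # a priori estimation error
    w += mu * e * u                     # LMS (stochastic gradient) update
```

Because the statistics are estimated continuously from the incoming samples, the same loop tracks the plant even if its coefficients drift slowly over time.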
“…As we can see from (9), correntropy can be interpreted as a generalised correlation function between two random variables X and Y, which measures how similar the two random variables are within a small neighbourhood determined by the kernel width σ.…”
Section: B Impulsive Noise Model
confidence: 99%
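The generalised-correlation interpretation can be checked numerically. With a Gaussian kernel, the sample correntropy is simply the average kernel evaluation of the pairwise differences; the data below are synthetic and chosen only to illustrate that similar signals score high while unrelated ones score lower.

```python
import numpy as np

def correntropy(x, y, sigma):
    """Sample correntropy of x and y with a Gaussian kernel of width sigma."""
    return np.mean(np.exp(-(x - y) ** 2 / (2.0 * sigma ** 2)))

rng = np.random.default_rng(2)
x = rng.standard_normal(10000)

# Nearly identical signals: differences concentrate near zero.
v_same = correntropy(x, x + 0.1 * rng.standard_normal(10000), sigma=1.0)

# Independent signals: differences spread outside the kernel neighbourhood.
v_diff = correntropy(x, rng.standard_normal(10000), sigma=1.0)
```

Shrinking σ makes the measure more local (and more robust to large outliers, which is why it is attractive under impulsive noise), while a very large σ makes correntropy behave like an ordinary second-order similarity measure.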