2017 IEEE 13th International Colloquium on Signal Processing & Its Applications (CSPA)
DOI: 10.1109/cspa.2017.8064914
A robust variable step size fractional least mean square (RVSS-FLMS) algorithm

Abstract: In this paper, we propose an adaptive framework for the variable step size of the fractional least mean square (FLMS) algorithm. The proposed algorithm, named the robust variable step size FLMS (RVSS-FLMS), dynamically updates the step size of the FLMS to achieve a high convergence rate with low steady-state error. For evaluation purposes, the problem of system identification is considered. The experiments clearly show that the proposed approach achieves a better convergence rate compared to the FLMS and adaptiv…
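The abstract does not reproduce the update equations, so the following is only a minimal sketch of what an FLMS-type filter with an adapting step size can look like for system identification. The fractional term |w|^(1-ν)/Γ(2-ν) follows the form used in fractional LMS variants; the step-size recursion is a generic energy-based rule (in the spirit of Kwong-Johnston VSS-LMS), not the paper's exact RVSS rule, and the names `rvss_flms_sketch`, `nu`, `alpha`, and `gamma_` are illustrative assumptions.

```python
import numpy as np
from scipy.special import gamma

def rvss_flms_sketch(x, d, taps=8, nu=0.5, mu=0.01,
                     alpha=0.97, gamma_=1e-4, mu_min=1e-4, mu_max=0.05):
    """Illustrative variable step size FLMS loop for system identification."""
    w = np.zeros(taps)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]   # regressor: x[n], x[n-1], ...
        e = d[n] - w @ u                  # a priori estimation error
        # FLMS-style update: ordinary LMS gradient plus a fractional-order
        # term; the sign/abs split keeps |w|^(1-nu) real for negative weights.
        frac = np.sign(w) * np.abs(w) ** (1 - nu) / gamma(2 - nu)
        w = w + mu * e * u + mu * e * u * frac
        # Assumed energy-based step-size recursion (not the paper's rule),
        # clipped to keep the adaptation stable.
        mu = float(np.clip(alpha * mu + gamma_ * e ** 2, mu_min, mu_max))
    return w
```

Under this kind of rule, large errors transiently raise the step size (faster convergence) and, as the error energy decays, the step size shrinks toward `mu_min` (lower steady-state misadjustment), which is the trade-off the abstract describes.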

Cited by 14 publications (5 citation statements) · References 30 publications
“…Because the LMS is dependent on the eigenvalue spread of the input correlation matrix, it suffers from slow convergence. Several adaptive strategies, such as the normalized LMS (NLMS), computed adaptive learning rates, a chaotic teaching-learning based optimization, the variable power fractional LMS algorithm, and the variable step-size LMS algorithm, have been presented in the literature to address this issue [9,10,11,12,13,14].…”
Section: Introduction
confidence: 99%
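For context on the normalization this statement mentions, here is a minimal side-by-side of the two classic updates; `lms_step`, `nlms_step`, and the constants are illustrative names, not taken from the cited works.

```python
import numpy as np

def lms_step(w, u, d, mu=0.01):
    # Plain LMS: convergence speed depends on the eigenvalue spread
    # of E[u u^T], which is what the quoted statement points out.
    e = d - w @ u
    return w + mu * e * u, e

def nlms_step(w, u, d, mu=0.5, eps=1e-8):
    # NLMS: dividing by the regressor energy makes the effective step
    # largely independent of the input signal power.
    e = d - w @ u
    return w + (mu / (eps + u @ u)) * e * u, e
```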
“…Besides these variants, various definitions of the gradient have also been used to derive improved LMS algorithms; for instance, in [18], a robust variable step size fractional least mean square (RVSS-FLMS) algorithm based on fractional-order calculus (FOC) is designed. The algorithm is derived using the Riemann-Liouville fractional derivative for high convergence performance.…”
Section: Introduction
confidence: 99%
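The statement attributes the derivation to the Riemann-Liouville fractional derivative. For a power function, that derivative reduces to D^ν[w] = w^(1-ν)/Γ(2-ν), which is the extra factor FLMS-type updates attach to the ordinary gradient. A sketch of that term follows; the sign/abs guard for negative weights is an implementation choice, not something stated here.

```python
import numpy as np
from scipy.special import gamma

def fractional_gradient_term(w, nu=0.5):
    # Riemann-Liouville derivative of order nu applied to w (treated as
    # a power function): D^nu[w] = w^(1-nu) / Gamma(2-nu). The sign/abs
    # split keeps the term real-valued when weights are negative.
    return np.sign(w) * np.abs(w) ** (1 - nu) / gamma(2 - nu)
```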
“…However, some situations demand the utilization of nonlinear solutions [7], [8]. In [9], [10], the fractional least mean square was utilized to propose an adaptive framework with variable power. The proposed method was applied to channel equalization and plant identification problems, and it is shown that the algorithm adapts the fractional power.…”
Section: Introduction
confidence: 99%
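A toy version of the plant-identification setup this statement refers to can be wired up as follows; the plant coefficients and noise level are synthetic placeholders (not the experiments from the cited papers), and `rvss_flms_sketch` is the illustrative routine sketched after the abstract above.

```python
import numpy as np

rng = np.random.default_rng(0)
plant = np.array([0.1, 0.3, 0.5, 0.3, 0.1])      # unknown FIR "plant" (made up)
x = rng.standard_normal(5000)                    # white excitation
d = np.convolve(x, plant)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = rvss_flms_sketch(x, d, taps=len(plant))  # sketch defined earlier
print(np.round(w_hat, 2))                        # should approach `plant`
```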
“…Using fractional derivatives, we can access additional information compared to the conventional derivative, which only gives the tangent at a point of the given function [42]. The fractional gradient, or fractional-order calculus (FoC), has been successfully used in many research applications, including signal processing [3,53], control systems [2,12,13], bioengineering [47,61], time series prediction [36], adaptive filtering [40,41], robotics [17,44], communication [37], and electronics [43,55]. Motivated by the information gain that the fractional derivative has to offer, we propose a novel learning rule for the RBF neural network by forming a convex combination of the conventional and fractional gradients of the cost function.…”
Section: Introduction
confidence: 99%
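The convex combination described in this statement mixes the two gradients with a weight a in [0, 1]. The cited work develops it for RBF network training; the sketch below shows the combination on a generic weight vector for brevity, with `convex_combo_update`, `a`, and `lr` as illustrative names.

```python
import numpy as np
from scipy.special import gamma

def convex_combo_update(w, grad, nu=0.5, a=0.5, lr=0.01):
    # Effective gradient: a * conventional + (1 - a) * fractional, where
    # the fractional part rescales the ordinary gradient by the factor
    # |w|^(1-nu) / Gamma(2-nu) (kept real-valued via sign/abs).
    g_frac = grad * np.sign(w) * np.abs(w) ** (1 - nu) / gamma(2 - nu)
    return w - lr * (a * grad + (1 - a) * g_frac)
```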