2021
DOI: 10.1109/tsp.2021.3065173

General Cauchy Conjugate Gradient Algorithms Based on Multiple Random Fourier Features


Cited by 18 publications (6 citation statements)
References: 48 publications
“…Then, (29) can be further determined by (36), as shown at the top of the next page. By (36), the sufficient convergence condition on mean stability is thus given by…”
Section: Convergence Condition on Mean Stability of the KLLAD Algorithm
Mentioning confidence: 99%
“…More recently, the logarithmic hyperbolic cosine-based adaptive filter (LHCAF) was proposed in [35] to address the instability of its prototype algorithm, and transient and steady-state analyses were also provided. Subsequently, the authors of [36] proposed the multiple random Fourier features Cauchy-loss conjugate gradient (MRFGCG) algorithm, which outperforms the classical KAF algorithms in terms of computational complexity and filtering accuracy. Therefore, cost functions built on fractional-order error statistics or on distinct error measures provide effective ways to achieve robustness against impulsive noise.…”
Section: Introduction
Mentioning confidence: 99%
“…More importantly, due to its simplicity, the VRFF method can be readily applied to other RFF-based algorithms. Secondly, the adaptation steps (22) and (23) no longer guarantee that the {ω_{m,n}} follow a Gaussian distribution N(0_D, ξ⁻² I_D) as the algorithm progresses. This does not allow us to establish a correspondence between the {ω_{m,n}} and the bandwidth ξ_n of a Gaussian kernel.…”
Section: Adaptive Random Fourier Features GKLMS
Mentioning confidence: 99%
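
For context on the correspondence mentioned in that statement (standard background, not taken from the cited paper): by Bochner's theorem, a Gaussian kernel with bandwidth ξ equals the expectation of a random cosine feature whose frequencies are drawn from N(0_D, ξ⁻² I_D), which is why fixing that sampling distribution ties the {ω_{m,n}} to the kernel bandwidth. A minimal sketch of the identity, in LaTeX (here δ = x − y and D denotes the input dimension, following the statement's notation):

% Standard Bochner-theorem identity for the Gaussian kernel
% (background illustration, not an equation from the cited paper).
\exp\!\left(-\frac{\|\delta\|^2}{2\xi^2}\right)
  = \mathbb{E}_{\omega \sim \mathcal{N}\left(0_D,\; \xi^{-2} I_D\right)}
    \left[\cos\!\left(\omega^\top \delta\right)\right]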
“…The RFF principle was used with the kernel conjugate gradient algorithm [21]. The Cauchy-loss conjugate gradient method based on multiple RFF was proposed in [22] to improve robustness and reduce computational cost in the presence of non-Gaussian noise. Recently, several RFF kernel regression algorithms over graphs were proposed in [23], and their convergence conditions in the mean and mean-square sense were also studied.…”
Section: Introduction
Mentioning confidence: 99%
“…Compared with the Nyström method, random Fourier features mapping is more appealing for non-stationary environments in online applications [21]. Therefore, random Fourier features mapping is applied to MKAFs to generate the multiple random features (MRF) method [22,23].…”
Section: Introduction
Mentioning confidence: 99%
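
To make the random Fourier features mapping referenced in these statements concrete, here is a minimal sketch of the standard Rahimi–Recht construction for a single Gaussian kernel. It is a background illustration only; the function name rff_map, the bandwidth parameterization, and the feature dimension D below are assumptions, not details taken from the cited papers.

import numpy as np

def rff_map(X, D, xi, rng=None):
    """Random Fourier feature map approximating the Gaussian kernel
    k(x, y) = exp(-||x - y||^2 / (2 * xi^2)) with bandwidth xi.
    X: (n, d) inputs; D: number of random features.
    Returns Z of shape (n, D) such that Z @ Z.T approximates the Gram matrix."""
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = X.shape
    # Bochner's theorem: the spectral density of this kernel is N(0, xi^{-2} I),
    # so the random frequencies are sampled with standard deviation 1/xi.
    omega = rng.normal(scale=1.0 / xi, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ omega + b)

# Quick check that feature inner products approximate the exact kernel.
X = np.random.default_rng(1).normal(size=(5, 3))
Z = rff_map(X, D=2000, xi=1.0)
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
print(np.abs(K_approx - K_exact).max())  # small; shrinks as D grows

In the multiple-RFF setting the statements describe, one would roughly draw one such map per candidate kernel bandwidth and combine the resulting feature spaces, which is the idea the MRF method builds on.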