ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp39728.2021.9413415

Outlier-Robust Kernel Hierarchical-Optimization RLS on a Budget with Affine Constraints

Abstract: This paper introduces a non-parametric learning framework to combat outliers in online, multi-output, and nonlinear regression tasks. A hierarchical-optimization problem underpins the learning task: Search in a reproducing kernel Hilbert space (RKHS) for a function that minimizes a sample average ℓp-norm (1 ≤ p ≤ 2) error loss defined on data contaminated by noise and outliers, under affine constraints defined as the set of minimizers of a quadratic loss on a finite number of faithful data devoid of noise and ou…
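
A minimal LaTeX sketch of the two-level (hierarchical) problem described in the abstract; the symbols below (RKHS H, contaminated samples (x_n, y_n), faithful samples (x̃_m, ỹ_m)) are illustrative notation and not necessarily the paper's own:

```latex
% Lower level: the affine constraint set is the set of minimizers of a
% quadratic loss over the few "faithful" (noise- and outlier-free) data.
\mathcal{A} := \operatorname*{arg\,min}_{g \in \mathcal{H}}
  \sum_{m=1}^{M} \bigl( \tilde{y}_m - g(\tilde{x}_m) \bigr)^{2}

% Upper level: over that constraint set, minimize the sample-average
% \ell_p error (1 \le p \le 2) on the noise- and outlier-contaminated data.
\min_{f \in \mathcal{A}} \;
  \frac{1}{N} \sum_{n=1}^{N} \bigl\lvert y_n - f(x_n) \bigr\rvert^{p}
```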

Cited by 3 publications (3 citation statements) | References 39 publications

“…Notwithstanding, the LS loss is notoriously sensitive to the presence of outliers [3], where outliers are defined as contaminating data that do not adhere to a nominal data-generation model, and are often viewed as random variables (RVs) with non-Gaussian heavy tailed distributions, e.g., α-stable ones [4,5]. To counter outliers in AdaFilt, non-LS losses, such as the p-norm (2 > p ∈ R++) [6][7][8][9][10][11][12][13] and correntropy [14,15] have been studied (henceforth, R++ will denote the set of all positive real numbers).…”
Section: A. Motivation: Adaptive Filters Against Outliers
Mentioning, confidence: 99%
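
As a rough, self-contained illustration of the point made in this snippet (hypothetical code, not taken from the cited works): a scalar linear model fitted by gradient descent under the least-squares loss versus an ℓp loss with p closer to 1; the heavy-tailed contamination, seed, and step size below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = 2.0 * x + 0.05 * rng.standard_normal(200)                 # nominal data, true slope 2
idx = rng.choice(200, size=20, replace=False)
y[idx] += 10.0 * rng.standard_t(df=1, size=20)                # heavy-tailed (Cauchy-like) outliers

def fit(p, steps=2000, lr=0.01):
    """Gradient descent on (1/N) * sum |y - w*x|^p for a scalar weight w."""
    w = 0.0
    for _ in range(steps):
        r = y - w * x
        # d/dw |r|^p = -p * |r|^(p-1) * sign(r) * x
        grad = -(p * np.abs(r) ** (p - 1) * np.sign(r) * x).mean()
        w -= lr * grad
    return w

print("p = 2.0 (least squares):", fit(2.0))   # typically pulled off the true slope by the outliers
print("p = 1.2 (robust loss)  :", fit(1.2))   # typically much closer to the true slope of 2
```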
“…Proof: See Appendix A. More variations of (9) can be generated from (10) by tuning the loss functions L, R appropriately. For example, robust B-Map designs against outliers in sampling can be obtained by letting the ℓ1-norm take the place of the quadratic one in (12a) and (14a).…”
Section: New Bellman Mappings in RKHSs
Mentioning, confidence: 99%