2018
DOI: 10.1016/j.dsp.2018.07.012

Boosted adaptive filters

Abstract: We investigate boosted online regression and propose a novel family of regression algorithms with strong theoretical bounds. In addition, we implement several variants of the proposed generic algorithm. We specifically provide theoretical bounds for the performance of our proposed algorithms that hold in a strong mathematical sense. We achieve guaranteed performance improvement over the conventional online regression methods without any statistical assumptions on the desired data or feature vectors. We demonst…
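The mechanism the abstract describes, boosting applied to online regression, can be sketched roughly as a cascade of online linear learners in which each stage is updated toward the residual left by the stages before it. This is an illustrative sketch only; the class name, the LMS-style update, and the fixed learning rate are assumptions, not the paper's actual algorithm or guarantees.

```python
import numpy as np

class BoostedOnlineRegressor:
    """Illustrative sketch: a cascade of online linear learners where
    each stage fits the residual left by the previous stages."""

    def __init__(self, n_features, n_stages=3, lr=0.05):
        self.w = np.zeros((n_stages, n_features))  # one weight row per stage
        self.lr = lr

    def predict(self, x):
        # overall prediction is the sum of all stage outputs
        return float(np.sum(self.w @ x))

    def update(self, x, d):
        residual = d
        for k in range(self.w.shape[0]):
            y_k = float(self.w[k] @ x)
            e_k = residual - y_k            # error of stage k on its target
            self.w[k] += self.lr * e_k * x  # LMS-style gradient step
            residual = e_k                  # next stage fits what is left
```

Each sample is processed once, in order, so the scheme stays fully online: no data are stored, and each stage only ever sees the error signal handed down by its predecessor.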


Cited by 5 publications (3 citation statements)
References 58 publications
“…As a remedy, different models are used together to capture different patterns in real-life data [21], the most common approach being ensemble or mixture models. Ensemble models combine the predictions of several models and are heavily investigated in different fields such as signal processing [22,14], speech recognition [23], computational intelligence, statistics, and machine learning [15,17]. There are several methods for designing ensembles; one can train individual models independently from each other and then combine them [14], or train individual models in a sequential manner where each model focuses on correcting the errors made by the previous ones [16].…”
Section: Prior Art
confidence: 99%
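The two ensemble designs this statement contrasts, training models independently and combining their predictions versus training them sequentially so each corrects its predecessors' errors, can be sketched side by side. The data, the least-squares weak learner, and the averaging rule below are illustrative assumptions, not taken from any of the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

def fit_ls(X, y):
    # one "weak" model: ordinary least squares on a subset of the data
    return np.linalg.lstsq(X, y, rcond=None)[0]

# (a) independent models, combined by averaging their predictions
models = [fit_ls(X[idx], y[idx]) for idx in (slice(0, 100), slice(100, 200))]
avg_pred = np.mean([X @ w for w in models], axis=0)

# (b) sequential models: each new model fits the residual error
#     left by the ones trained before it
w1 = fit_ls(X, y)
resid = y - X @ w1
w2 = fit_ls(X, resid)
seq_pred = X @ w1 + X @ w2
```

Variant (a) corresponds to mixture-style combination; variant (b) is the residual-correcting construction that boosting methods generalize.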
“…For evaluating the proposed hybrid feature selection with random forest, the regressor model is instantiated with 550 estimator decision trees and 40 random states. Boosting algorithms are a subclass of ensemble algorithms and one of the most widely used algorithms in data science [42], converting weak learners to strong learners. Gradient boosting [43] sequentially trains several models, and every new model consistently reduces the loss function of the entire procedure utilizing the gradient descendant process.…”
Section: Machine Learning
confidence: 99%
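The gradient-boosting procedure this statement summarizes, sequentially training models where each new one reduces the loss of the whole ensemble via a gradient step, reduces to residual fitting under squared loss. A minimal sketch with depth-1 regression trees (stumps) as the weak learners; the function names and hyperparameters are illustrative, not from the cited work.

```python
import numpy as np

def fit_stump(X, r):
    """Fit a depth-1 regression tree (stump) to residuals r: pick the
    feature/threshold split minimizing total squared error."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue
            lv, rv = r[left].mean(), r[~left].mean()
            err = ((r[left] - lv) ** 2).sum() + ((r[~left] - rv) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, lv, rv)
    _, j, t, lv, rv = best
    return lambda X: np.where(X[:, j] <= t, lv, rv)

def gradient_boost(X, y, n_rounds=50, lr=0.1):
    pred = np.full(len(y), y.mean())   # start from the constant model
    for _ in range(n_rounds):
        # for squared loss the negative gradient is just the residual,
        # so each round fits a stump to what the ensemble still gets wrong
        h = fit_stump(X, y - pred)
        pred += lr * h(X)              # shrunken gradient step
    return pred
```

The learning rate `lr` plays the role of shrinkage: smaller steps need more rounds but typically generalize better.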
“…In many applications that use modern methods and algorithms for digital signal processing, adaptive digital filters (ADF) are widely used [3,4], including a digital filter and an adaptation system.…”
Section: Introduction
confidence: 99%
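The adaptive digital filters this statement refers to pair a digital filter with an adaptation system that tunes the filter's coefficients online. The least-mean-squares (LMS) algorithm is the canonical example; a minimal sketch follows, where the function name and default parameters are illustrative, not from the cited work.

```python
import numpy as np

def lms_filter(x, d, n_taps=4, mu=0.05):
    """Sketch of an LMS adaptive filter: the tap weights w are adapted
    sample by sample to drive the error e[n] = d[n] - w . u[n] to zero."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # regressor: newest sample first
        y = w @ u                          # filter output
        e[n] = d[n] - y                    # instantaneous error
        w += mu * e[n] * u                 # stochastic-gradient update
    return w, e
```

The step size `mu` must satisfy the usual LMS stability bound (roughly `mu < 2 / E[|u|^2]`), and the weights converge to the unknown system's impulse response when `d` is its output.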