2016
DOI: 10.1109/tsp.2016.2546225

Online Censoring for Large-Scale Regressions with Application to Streaming Big Data

Abstract: On par with data-intensive applications, the sheer size of modern linear regression problems creates an ever-growing demand for efficient solvers. Fortunately, a significant percentage of the data accrued can be omitted while maintaining a certain quality of statistical inference with an affordable computational budget. This work introduces means of identifying and omitting less informative observations in an online and data-adaptive fashion. Given streaming data, the related maximum-likelihood estimator is se…

Cited by 61 publications (54 citation statements)
References 31 publications (70 reference statements)
“…The SP and online learning techniques for big data analytics described in [110] provide a good research direction for future work. Building on this, the authors of [117] developed online algorithms for large-scale regressions with application to streaming big data. In addition, Slavakis and Giannakis further used an accelerated stochastic approximation method with online and modular learning algorithms to handle a large class of nonconvex data models [118].…”
Section: The Latest Research Progress
confidence: 99%
“…As time evolves, all local estimates reach consensus on the centralized RLS solution. This paper builds on both [4] and [16] by developing censoring-based decentralized RLS algorithms, thus catering to efficient online linear regression over large-scale networks.…”
Section: A Related Work
confidence: 99%
“…Dividing numerator and denominator by N̄ yields (6). A threshold γ satisfying this necessary condition can be found using the Robbins–Monro iteration [14], which iteratively sets

γ_{n+1} = γ_n + μ_n [max(Δ̄_n − γ_n, 0) − γ_n T_c/T_u]   (10)

where μ_n > 0 represents the step size. The corresponding algorithm steps read: if Δ̄_{n|k} ≥ γ_n, then k ← k + 1 and ŵ_k = F_w(d_n, ŵ_{k−1}, S_{k−1}).…”
Section: Threshold Selection
confidence: 99%
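The quoted Robbins–Monro update (10) can be sketched in a few lines of Python. This is a minimal illustration only: the residual-magnitude stream, the step-size schedule μ_n = μ_0/n, and the ratio T_c/T_u = 0.5 are all assumed values, not parameters taken from the paper.

```python
import numpy as np

def robbins_monro_threshold(residuals, tc_over_tu=0.5, mu0=0.5):
    """Sketch of the Robbins-Monro iteration (10) for the censoring
    threshold gamma:

        gamma_{n+1} = gamma_n
                      + mu_n * (max(|delta_n| - gamma_n, 0)
                                - gamma_n * T_c / T_u)

    `residuals`, `tc_over_tu`, and `mu0` are illustrative assumptions.
    """
    gamma = 0.0
    for n, delta in enumerate(residuals, start=1):
        mu_n = mu0 / n  # diminishing step size, mu_n > 0
        gamma += mu_n * (max(abs(delta) - gamma, 0.0) - gamma * tc_over_tu)
    return gamma

# Drive the iteration with synthetic residual magnitudes.
rng = np.random.default_rng(0)
gamma = robbins_monro_threshold(np.abs(rng.standard_normal(5000)))
print(gamma)
```

With a diminishing step size the iterate drifts toward the fixed point of the update, where the expected excess of the residual over γ balances γ·T_c/T_u.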
“…For example, in [6,7] observations with low innovations are discarded to reduce the computational complexity of estimating linear regression coefficients. Similar principles are exploited in [6]–[10], where non-informative observations are censored to minimize the communication overhead in bandwidth-constrained sensor networks.…”
Section: Introduction
confidence: 99%
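The censoring principle described in this passage (skip observations whose innovation is small) can be sketched as a toy online regression loop. The fixed censoring threshold and the LMS-style update below are illustrative choices for exposition, not the exact estimators of the cited works.

```python
import numpy as np

def censored_lms(X, y, threshold=0.5, step=0.01):
    """Online linear regression that censors low-innovation samples.

    An observation is skipped (no update, no communication) when the
    absolute prediction residual falls below `threshold`; both the
    threshold and the step size are illustrative assumptions.
    """
    w = np.zeros(X.shape[1])
    used = 0
    for x, target in zip(X, y):
        innovation = target - x @ w
        if abs(innovation) >= threshold:  # informative: update
            w += step * innovation * x
            used += 1
    return w, used

# Synthetic streaming data: most late samples get censored once the
# estimate is close and residuals shrink below the threshold.
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(2000)
w_hat, used = censored_lms(X, y)
print(w_hat, used, "of", len(y))
```

The design trade-off the cited works exploit is visible here: as the estimate improves, fewer samples exceed the threshold, so per-sample computation and communication drop while estimation quality is largely retained.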